Re: Large TEXT fields
| From | Joe Conway |
|---|---|
| Subject | Re: Large TEXT fields |
| Date | |
| Msg-id | 3D5D3121.30503@joeconway.com |
| In reply to | Large TEXT fields ("David F. Skoll" <dfs@roaringpenguin.com>) |
| List | pgsql-admin |
David F. Skoll wrote:
> I'm considering building a database with a table which stores entire
> e-mail messages, up to around 128kB.
>
> - How does PostgreSQL cope with tables with huge amounts of data in a
> TEXT column? Let's say 50,000 rows of e-mail messages with an average
> e-mail size of 20kB, with maybe 5% of the messages over 100kB.
What you describe is not what I would call huge, but a lot depends on
your hardware and typical queries, of course. Why don't you test it?
As an example, I did this (on a Pentium 833, 512 MB RAM, IDE hard drive,
Red Hat 7.3):
1. create table bigtest(f1 int, f2 text);
2. run script to insert 50000 rows of sequential id and a 130k string
   literal (took about 24 minutes)
3. create index bigtest_idx on bigtest(f1); (took about 2 seconds)
4. analyze bigtest; (took about 2 seconds)
5. explain analyze select f1, substr(f2,1,20) from bigtest where f1 = 33333;
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------
Index Scan using bigtest_idx on bigtest (cost=0.00..3.01 rows=1
width=36) (actual time=1.66..1.69 rows=1 loops=1)
Index Cond: (f1 = 33333)
Total runtime: 1.83 msec
(3 rows)
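As an aside, on a newer PostgreSQL (8.0 or later, where generate_series() is available) the insert script in step 2 can be replaced with a single statement. A sketch, reusing the table and column names from the test above; note that a repeated single character compresses very well under TOAST, so real email text would likely take more disk space and time:

```sql
-- Sketch only: assumes generate_series() (PostgreSQL 8.0+).
-- Populate bigtest with 50000 rows, each carrying a ~130kB text value.
create table bigtest (f1 int, f2 text);

insert into bigtest (f1, f2)
select g, repeat('x', 130000)
from generate_series(1, 50000) as g;

create index bigtest_idx on bigtest (f1);
analyze bigtest;
```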
HTH,
Joe