How Big is Too Big for Tables?

From: Bill Thoen
Subject: How Big is Too Big for Tables?
Date:
Msg-id: 4C506458.7010502@gisnet.com
Responses: Re: How Big is Too Big for Tables?  (Vincenzo Romano <vincenzo.romano@notorand.it>)
Re: How Big is Too Big for Tables?  ("Joshua D. Drake" <jd@commandprompt.com>)
Re: How Big is Too Big for Tables?  (Alex Thurlow <alex@blastro.com>)
Re: How Big is Too Big for Tables?  (Terry Fielder <terry@ashtonwoodshomes.com>)
Re: How Big is Too Big for Tables?  ("Joshua D. Drake" <jd@commandprompt.com>)
List: pgsql-general
I'm building a national database of agricultural information, and one of the layers is a bit more than a gigabyte per state. That's 1-2 million records per state, each with a multipolygon geometry, and I've got about 40 states' worth of data. I'm trying to store everything in a single PG table. What I'm concerned about is: if I combine every state into one big table, will performance be terrible, even with indexes? On the other hand, if I store the data in several smaller files, then if a user zooms in on a multi-state region, I've got to build or find a much more complicated way to query multiple files.
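
Just so it's clear what I mean by "indexes": the only thing I know to put on the single big table is a plain PostGIS GiST index and bounding-box queries against it, something like the sketch below (the table and column names are just placeholders, not my actual schema):

    -- Single national table; a GiST index on the geometry column so that
    -- map-extent (bounding box) queries don't have to scan every row.
    CREATE INDEX ag_layer_geom_gist ON ag_layer USING gist (geom);

    -- Typical "user zoomed to this region" query: && is the bounding-box
    -- overlap operator, which the GiST index supports.
    SELECT gid, geom
    FROM ag_layer
    WHERE geom && ST_MakeEnvelope(-109.05, 36.99, -102.04, 41.0, 4326);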

So I'm wondering: should I be concerned about building a single national-size table (possibly 80-100 GB) for all these records, or should I keep the files smaller and hope there's something like ogrtindex out there for PG tables? What do you all recommend in this case? I just moved over to Postgres to handle big files, but I don't know its limits. With a background working with MS Access and bitter memories of what happens when you get near Access's two-gigabyte database size limit, I'm a little nervous about these much bigger files. So I'd appreciate anyone's advice here.
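
From what I've read, table inheritance with a CHECK constraint per state might be the closest Postgres equivalent to a tile index: one small table per state, but queries still name just the parent. Is something roughly like this (names invented for illustration) what people mean?

    -- Empty parent table; one inherited child per state, each carrying a
    -- CHECK constraint on the state FIPS code.
    CREATE TABLE ag_layer (
        gid        serial,
        state_fips char(2) NOT NULL,
        geom       geometry
    );

    CREATE TABLE ag_layer_co (CHECK (state_fips = '08')) INHERITS (ag_layer);
    CREATE TABLE ag_layer_ks (CHECK (state_fips = '20')) INHERITS (ag_layer);

    -- Each child keeps its own GiST index, so no single index gets huge.
    CREATE INDEX ag_layer_co_geom_gist ON ag_layer_co USING gist (geom);
    CREATE INDEX ag_layer_ks_geom_gist ON ag_layer_ks USING gist (geom);

    -- Queries name only the parent; with constraint_exclusion enabled the
    -- planner skips children whose CHECK can't match the state filter.
    SET constraint_exclusion = on;
    SELECT count(*) FROM ag_layer WHERE state_fips IN ('08', '20');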

TIA,
- Bill Thoen
