splitting data into multiple tables

Hello,

          I am working on a project that will extract structured content from Wikipedia
and load it into our database. Before loading the data I wrote a script to
estimate the number of rows each table would have once the data is in, and I found
that one table will have approximately 5 crore (50 million) entries after data harvesting.
Is it advisable to keep so much data in one table?
          I have read about 'partitioning' a table. Another idea I have is to break the table into
different tables once the number of rows in a table has reached a certain limit, say 10 lacs (1 million).
For example, dividing a table 'datatable' into 'datatable_a', 'datatable_b', etc., each having 10 lac entries.
I need advice on whether I should go for partitioning or the approach I have thought of; a rough
sketch of the partitioning setup I have read about follows below.
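          From what I have read in the postgres docs, partitioning is done with table inheritance
plus CHECK constraints, roughly as in the sketch below. The table and column names here are just
made up for illustration:

    -- parent table that the application queries
    CREATE TABLE datatable (
        id    bigint NOT NULL,
        title text,
        body  text
    );

    -- child tables, each holding one range of ids
    CREATE TABLE datatable_a (
        CHECK (id >= 0 AND id < 1000000)
    ) INHERITS (datatable);

    CREATE TABLE datatable_b (
        CHECK (id >= 1000000 AND id < 2000000)
    ) INHERITS (datatable);

    -- trigger that routes each insert into the right child table
    CREATE OR REPLACE FUNCTION datatable_insert() RETURNS trigger AS $$
    BEGIN
        IF NEW.id < 1000000 THEN
            INSERT INTO datatable_a VALUES (NEW.*);
        ELSE
            INSERT INTO datatable_b VALUES (NEW.*);
        END IF;
        RETURN NULL;  -- the row has already been stored in a child table
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER datatable_insert_trg
        BEFORE INSERT ON datatable
        FOR EACH ROW EXECUTE PROCEDURE datatable_insert();

With constraint_exclusion = on, a SELECT with a WHERE clause on id should scan only the
matching child tables. Is that preferable to maintaining 'datatable_a', 'datatable_b' etc.
by hand in the application?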
          We have an HP server with 32 GB RAM and 16 processors, and 24 TB of disk space
(1 TB per disk) configured as RAID-5. It would be great to know which parameters in the
postgres configuration file we should change so that the database makes maximum use of
this server, e.g. parameters that would increase the speed of inserts and selects.
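          To be concrete, here are the settings I have seen mentioned most often for this kind
of hardware. The values below are only guesses from my reading, not numbers we have tested:

    # postgresql.conf -- guessed starting points for 32GB RAM, untested
    shared_buffers = 8GB            # often suggested as ~25% of RAM
    effective_cache_size = 24GB     # roughly what the OS file cache can hold
    work_mem = 64MB                 # per sort/hash operation, per backend
    maintenance_work_mem = 1GB      # speeds up index builds after the bulk load
    checkpoint_segments = 64        # fewer, larger checkpoints during heavy inserts
    wal_buffers = 16MB
    constraint_exclusion = on       # needed for partition pruning

Do these look sane, and is there anything that matters more for the initial bulk load
(for example using COPY instead of INSERT)?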


Thank you in advance
Rajiv Nair
