I have multiple tables with over 10,000,000 rows; the biggest one has 70,000,000 rows. Each table has several indexes, and almost all columns are varchar2. In my experience, with many indexes on a large table, data insertion is a pain. In my case I insert about 30,000 rows into each table every day, and it takes hours per table. If I drop the indexes first, insertion speeds up, but recreating those indexes takes 7 hours. Querying the data is usually not the problem, but you should consider performance carefully if, like me, you have to insert and update data frequently.
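The drop/load/recreate pattern I mean can be sketched like this. SQLite stands in here just to make the example self-contained and runnable; the table and index names are made up, and in PostgreSQL or Oracle the DDL steps are analogous:

```python
import sqlite3

# Sketch of the drop-indexes / bulk-insert / recreate-indexes pattern.
# Assumed schema for illustration only (hypothetical "orders" table).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, customer TEXT, status TEXT)")
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
cur.execute("CREATE INDEX idx_orders_status ON orders (status)")

# The daily batch to load (30,000 rows, as in my case).
batch = [(i, "cust%d" % (i % 100), "new") for i in range(30_000)]

# 1. Drop the indexes so each row inserted does not pay index maintenance.
cur.execute("DROP INDEX idx_orders_customer")
cur.execute("DROP INDEX idx_orders_status")

# 2. Bulk insert inside a single transaction.
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", batch)

# 3. Recreate the indexes once, after the load is done.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
cur.execute("CREATE INDEX idx_orders_status ON orders (status)")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```

Whether this wins depends on the ratio of batch size to table size: for me the index rebuild alone takes 7 hours, so it only pays off for very large loads.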
Hope it helps!
Anna Zhang
I have a set of data that will compose a table with 32 million rows. I currently run PostgreSQL with tables as large as 750,000 rows.
Does anyone have experience with tables this large? In addition, I have been reading about moving PostgreSQL tables to
another hard drive; can anyone advise me?
Thanks