Re: Streaming large data into postgres [WORM like applications]

From Lincoln Yeoh
Subject Re: Streaming large data into postgres [WORM like applications]
Date
Msg-id 200705121530.l4CFUJuq032048@smtp2.jaring.my
In reply to Streaming large data into postgres [WORM like applications]  ("Dhaval Shah" <dhaval.shah.m@gmail.com>)
Responses Re: Streaming large data into postgres [WORM like applications]  ("Dhaval Shah" <dhaval.shah.m@gmail.com>)
List pgsql-general
At 04:43 AM 5/12/2007, Dhaval Shah wrote:

>1. Large amount of streamed rows. In the order of @50-100k rows per
>second. I was thinking that the rows can be stored into a file and the
>file then copied into a temp table using copy and then appending those
>rows to the master table. And then dropping and recreating the index
>very lazily [during the first query hit or something like that]
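The staged approach described above (buffer rows, bulk-load them with COPY, then append to the master table) could be sketched roughly as follows. This is a hedged illustration, not code from the thread: the table names (`staging`, `master`) and the helper are hypothetical, and the serializer below skips the escaping of tabs, backslashes, and NULLs that a production COPY payload would need.

```python
import io

def rows_to_copy_buffer(rows):
    """Serialize rows into PostgreSQL COPY text format:
    tab-separated columns, one newline-terminated line per row.
    NOTE: a simplified sketch -- real data needs escaping of
    tabs, newlines, backslashes, and a NULL representation."""
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(col) for col in row) + "\n")
    buf.seek(0)
    return buf

# With a driver such as psycopg2 (not shown here), the staged load
# might then look like (table names are hypothetical):
#   cur.execute("CREATE TEMP TABLE staging (LIKE master)")
#   cur.copy_expert("COPY staging FROM STDIN", rows_to_copy_buffer(batch))
#   cur.execute("INSERT INTO master SELECT * FROM staging")
# with the master table's index dropped up front and rebuilt lazily,
# as the original poster suggests.
```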

Is it one process inserting or can it be many processes?

Is it just a relatively short, high-rate burst, or is that rate sustained
for a long time? If it's sustained, I don't see the point of doing so
many copies.

How many bytes per row? If the rate is sustained and the rows are big,
then you are going to need lots of disks (e.g. a large RAID10).

When do you need to do the reads, and how up to date do they need to be?

Regards,
Link.



