Re: best way to write large data-streams quickly?

From Steve Atkins
Subject Re: best way to write large data-streams quickly?
Date
Msg-id 1137E1BF-BEEE-4EB9-B284-2AC923C2A016@blighty.com
In response to best way to write large data-streams quickly?  (Mark Moellering <markmoellering@psyberation.com>)
Responses Re: best way to write large data-streams quickly?  (Mark Moellering <markmoellering@psyberation.com>)
List pgsql-general
> On Apr 9, 2018, at 8:49 AM, Mark Moellering <markmoellering@psyberation.com> wrote:
>
> Everyone,
>
> We are trying to architect a new system, which will have to take several large datastreams (total of ~200,000 parsed files per second) and place them in a database.  I am trying to figure out the best way to import that sort of data into Postgres.
>
> I keep thinking I can't be the first to have this problem and there are common solutions, but I can't find any.  Does anyone know of some sort of method, third-party program, etc., that can accept data from a number of different sources and push it into Postgres as fast as possible?

Take a look at http://ossc-db.github.io/pg_bulkload/index.html. Check the benchmarks for different situations compared to COPY.
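For a sense of what that looks like in practice: pg_bulkload is driven by a small control file. A minimal sketch, modeled on its sample CSV control file (the table name and paths here are hypothetical, and the exact keys are worth double-checking against the docs linked above):

    OUTPUT = public.events             # target table (hypothetical)
    INPUT = /data/incoming/batch.csv   # absolute path to the input file
    TYPE = CSV                         # input format
    DELIMITER = ","                    # field separator
    WRITER = DIRECT                    # bypass shared buffers for speed

You'd then run something along the lines of pg_bulkload -d yourdb batch.ctl. The DIRECT writer is where most of the speedup over plain COPY comes from.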

Depending on what you're doing, using custom code to parse your data and then doing multiple binary COPYs in parallel may be better.
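To make the parallel-COPY idea concrete, here's a minimal sketch in Python with psycopg2, assuming a hypothetical events(ts, payload) table and connection string. It streams CSV rather than binary for brevity; a binary COPY (FORMAT binary) skips text parsing on the server but requires emitting PostgreSQL's binary wire format from your parser:

    import io
    import multiprocessing

    import psycopg2

    DSN = "dbname=streams user=loader"   # hypothetical connection string
    COPY_SQL = "COPY events (ts, payload) FROM STDIN WITH (FORMAT csv)"

    def load_batches(batch_queue):
        # Each worker holds its own connection, so the COPYs run in parallel.
        conn = psycopg2.connect(DSN)
        try:
            with conn.cursor() as cur:
                while True:
                    batch = batch_queue.get()
                    if batch is None:        # sentinel: no more work
                        break
                    cur.copy_expert(COPY_SQL, io.StringIO("".join(batch)))
                    conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        queue = multiprocessing.Queue(maxsize=8)
        workers = [multiprocessing.Process(target=load_batches, args=(queue,))
                   for _ in range(4)]        # four parallel COPY streams
        for w in workers:
            w.start()
        # Stand-in for the real parser: enqueue CSV rows in 10k-row batches.
        batch = []
        for i in range(100_000):
            batch.append(f"2018-04-09 08:49:00,row-{i}\n")
            if len(batch) == 10_000:
                queue.put(batch)
                batch = []
        if batch:
            queue.put(batch)
        for _ in workers:
            queue.put(None)                  # one sentinel per worker
        for w in workers:
            w.join()

Batch size and worker count are the knobs to tune; past a handful of writers you'll usually bottleneck on WAL and disk rather than on parsing.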

Cheers,
  Steve


