Re: Best practice to load a huge table from ORACLE to PG

From: Dimitri Fontaine
Subject: Re: Best practice to load a huge table from ORACLE to PG
Date:
Msg-id: 200804280949.40101.dfontaine@hi-media.com
In response to: Re: Best practice to load a huge table from ORACLE to PG  (Greg Smith <gsmith@gregsmith.com>)
List: pgsql-performance
Hi,

On Sunday, April 27, 2008, Greg Smith wrote:
> than SQL*PLUS.  Then on the PostgreSQL side, you could run multiple COPY
> sessions importing at once to read this data all back in, because COPY
> will bottleneck at the CPU level before the disks will if you've got
> reasonable storage hardware.

The latest pgloader release was designed to handle exactly this case, so if you
want to take this route, please consider pgloader 2.3.0:
  http://pgloader.projects.postgresql.org/#_parallel_loading
  http://pgfoundry.org/projects/pgloader/

Another good reason to consider pgloader is when the data file contains
erroneous input lines and you don't want the COPY transaction to abort: the
bad lines will be rejected by pgloader into a reject file, while the correct
ones get COPYed in.
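For reference, pgloader 2.x is driven by an INI-style configuration file. A minimal sketch might look like the following; the parallelism and reject-file option names reflect my reading of the pgloader 2.3 parallel-loading documentation linked above, and all connection settings, table names, and paths are placeholders, not values from this thread:

```ini
; hypothetical pgloader 2.3 configuration sketch -- adjust names/paths
[pgsql]
host = localhost
port = 5432
base = mydb              ; target database (placeholder)
user = loader
pass = secret

[huge_table]
table = huge_table
filename = /tmp/huge_table.csv   ; data exported from Oracle
format = csv
field_sep = ,

; parallel loading: several threads each COPY a slice of the file
section_threads = 4
split_file_reading = True

; erroneous lines go to the reject files instead of aborting the COPY
reject_log = /tmp/huge_table.rej.log
reject_data = /tmp/huge_table.rej
```

You would then point pgloader at this file and let it run the parallel COPY sessions for you, rather than splitting the export and managing multiple psql sessions by hand.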

Regards,
--
dim

