Re: Best practice to load a huge table from ORACLE to PG

From: Adonias Malosso
Subject: Re: Best practice to load a huge table from ORACLE to PG
Date:
Msg-id: 8a5d3c890804281437u6bd655b3k9a38a77a49dc5552@mail.gmail.com
In reply to: Re: Best practice to load a huge table from ORACLE to PG ("Jonah H. Harris" <jonah.harris@gmail.com>)
Responses: Re: Best practice to load a huge table from ORACLE to PG
List: pgsql-performance
Jonah,

Thank you for the answer. Good to know about this EnterpriseDB feature.

I'll proceed with pgloader.

Regards.

Adonias Malosso

On Sat, Apr 26, 2008 at 10:14 PM, Jonah H. Harris <jonah.harris@gmail.com> wrote:
On Sat, Apr 26, 2008 at 9:25 AM, Adonias Malosso <malosso@gmail.com> wrote:
> I'd like to know what's the best practice to LOAD a 70 million row,
> 101 column table from ORACLE to PGSQL.

The fastest and easiest method would be to dump the data from Oracle
into CSV/delimited format using something like ociuldr
(http://www.anysql.net/en/ociuldr.html) and load it back into PG using
pg_bulkload (which is a helluva lot faster than COPY).  Of course, you
could try other things as well... such as setting up generic
connectivity to PG and inserting the data into a PG table over the
database link.
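
For what it's worth, here is a minimal sketch of that dump step in
Python, assuming the cx_Oracle driver; the connection details and
big_table are placeholders, not anything taken from this thread, and
ociuldr does the same job natively:

import csv
import cx_Oracle

ARRAY_SIZE = 10000  # fetch in large batches to cut client/server round trips

conn = cx_Oracle.connect("scott", "tiger", "orcl")  # user, password, DSN
cur = conn.cursor()
cur.arraysize = ARRAY_SIZE
cur.execute("SELECT * FROM big_table")

with open("big_table.csv", "w", newline="") as f:
    writer = csv.writer(f)
    while True:
        rows = cur.fetchmany(ARRAY_SIZE)
        if not rows:
            break
        writer.writerows(rows)  # the csv module handles quoting/escaping

cur.close()
conn.close()

# The resulting file can then be loaded with pg_bulkload, or with
# plain COPY:  COPY big_table FROM '/path/big_table.csv' WITH CSV;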

Similarly, while I hate to see shameless self-plugs in the community,
the *fastest* method you could use is dblink_ora_copy, contained in
EnterpriseDB's PG+ Advanced Server; it uses an optimized OCI
connection to COPY the data directly from Oracle into Postgres, which
also saves you the intermediate step of dumping the data.
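
Purely as a rough sketch of that same no-intermediate-file idea
(assuming the cx_Oracle and psycopg2 drivers; this is not the
EnterpriseDB implementation), one could stream an Oracle cursor
straight into a Postgres COPY:

import io
import cx_Oracle
import psycopg2

ora = cx_Oracle.connect("scott", "tiger", "orcl")
pg = psycopg2.connect("dbname=target user=postgres")

ora_cur = ora.cursor()
ora_cur.arraysize = 10000
ora_cur.execute("SELECT * FROM big_table")

class CopyStream(io.RawIOBase):
    """Present Oracle rows as a readable stream of COPY text lines."""
    def __init__(self, cursor):
        self.cursor = cursor
        self.buf = b""

    def readable(self):
        return True

    def read(self, size=-1):
        # Naive tab-separated formatting; real data containing tabs,
        # newlines, or backslashes would need proper COPY escaping.
        while size < 0 or len(self.buf) < size:
            rows = self.cursor.fetchmany()  # uses cursor.arraysize
            if not rows:
                break
            text = "".join(
                "\t".join("\\N" if v is None else str(v) for v in row) + "\n"
                for row in rows
            )
            self.buf += text.encode("utf-8")
        if size < 0:
            out, self.buf = self.buf, b""
        else:
            out, self.buf = self.buf[:size], self.buf[size:]
        return out

pg_cur = pg.cursor()
pg_cur.copy_expert("COPY big_table FROM STDIN", CopyStream(ora_cur))
pg.commit()
pg.close()
ora.close()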

--
Jonah H. Harris, Sr. Software Architect | phone: 732.331.1324
EnterpriseDB Corporation | fax: 732.331.1301
499 Thornall Street, 2nd Floor | jonah.harris@enterprisedb.com
Edison, NJ 08837 | http://www.enterprisedb.com/
