Re: Best practice to load a huge table from ORACLE to PG

From: Tino Wildenhain
Subject: Re: Best practice to load a huge table from ORACLE to PG
Date:
Msg-id: 48162C86.1040405@wildenhain.de
In reply to: Best practice to load a huge table from ORACLE to PG  ("Adonias Malosso" <malosso@gmail.com>)
List: pgsql-performance
Adonias Malosso wrote:
> Hi All,
>
> I'd like to know the best practice for LOADing a 70 million row,
> 101 column table
> from ORACLE to PGSQL.
>
> The current approach is to dump the data to CSV and then COPY it into
> PostgreSQL.
>
Uhm. 101 columns, you say? Sounds interesting. There are data loaders
like http://pgfoundry.org/projects/pgloader/ which could speed
up loading the data over a plain COPY from CSV. I wonder how much
normalizing could help.
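For readers following the same path, the dump-to-CSV-then-COPY approach discussed above can be sketched roughly as below; the table name, file path, and database name are placeholders of mine, not details from the thread:

```shell
# Sketch only -- "big_table", "/tmp/big_table.csv", and "targetdb"
# are hypothetical placeholders.

# 1. Export from Oracle to CSV, e.g. by spooling from sqlplus or
#    using a migration tool such as ora2pg.

# 2. Load into PostgreSQL with COPY; psql's \copy runs client-side,
#    so the CSV file does not have to live on the database server:
psql -d targetdb -c "\copy big_table FROM '/tmp/big_table.csv' WITH CSV"
```

For a load of this size, dropping indexes before the COPY and recreating them afterwards usually saves time; pgloader (linked above) layers parallelism and error handling on top of the same COPY mechanism.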

Tino

Messages in pgsql-performance, by date:

Previous
From: Shane Ambler
Date:
Message: Re: Very poor performance loading 100M of sql data using copy
Next
From: "Joshua D. Drake"
Date:
Message: Re: Benchmarks WAS: Sun Talks about MySQL