Re: Copying large tables with DBLink

From: Michael Fuhr
Subject: Re: Copying large tables with DBLink
Msg-id: 20050324195059.GA13146@winnie.fuhr.org
In reply to: Copying large tables with DBLink ("Chris Hoover" <revoohc@sermonaudio.com>)
List: pgsql-admin
On Thu, Mar 24, 2005 at 01:59:44PM -0500, Chris Hoover wrote:
>
> Has anyone had problems with memory exhaustion and dblink?  We were
> trying to use dblink to convert our databases to our new layout, and had
> our test server lock up several times when trying to copy a table that
> was significantly larger than our memory and swap.

Hmmm...doesn't dblink use libpq, and doesn't libpq fetch the entire
result set before doing anything with it?  If so, then that could
explain the memory exhaustion.

> Basically we were doing an insert into <table> select * from
> dblink('dbname=olddb','select * from large_table') as
> t_large_table(table column listing);
>
> Does anyone know of a way around this?

How about using pg_dump to dump the original table and restore it
into the new table?  If you just want the table's contents without
the table definition, you could use the -a (--data-only) option.
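
Something along these lines (untested sketch; substitute your
actual table and database names):

    pg_dump -a -t large_table olddb | psql newdb

Since pg_dump streams the data as COPY statements, it shouldn't
need to hold the whole table in memory at once.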

Another possibility would be to write a function that uses a cursor:
dblink_open() and a loop that calls dblink_fetch() until you reach
the end of the result set.  I think that wouldn't have a memory
exhaustion problem (but test it to be sure).
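
A rough sketch of what I mean, untested and with made-up column
names (id integer, data text) that you'd replace with your real
table definition; the batch size of 1000 is arbitrary:

    CREATE OR REPLACE FUNCTION copy_large_table() RETURNS void AS $$
    DECLARE
        rows_fetched integer;
    BEGIN
        -- open a cursor on the remote database
        PERFORM dblink_connect('dbname=olddb');
        PERFORM dblink_open('cur', 'SELECT * FROM large_table');

        -- fetch and insert in batches so we never hold the
        -- entire result set in memory at once
        LOOP
            INSERT INTO new_large_table
                SELECT * FROM dblink_fetch('cur', 1000)
                    AS t(id integer, data text);
            GET DIAGNOSTICS rows_fetched = ROW_COUNT;
            EXIT WHEN rows_fetched = 0;
        END LOOP;

        PERFORM dblink_close('cur');
        PERFORM dblink_disconnect();
    END;
    $$ LANGUAGE plpgsql;

Then just SELECT copy_large_table();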

--
Michael Fuhr
http://www.fuhr.org/~mfuhr/
