large numbers of inserts out of memory strategy

I'm writing a migration utility to move data from a non-RDBMS data
source to a Postgres db. Currently I'm generating SQL INSERT
statements involving 6 related tables for each 'thing'. With 100k or
more 'things' to migrate I'm generating a lot of statements, and when
I try to import using psql, Postgres fails with 'out of memory' when
running on a Linux VM with 4G of memory. If I break the script into
smaller chunks of ~50K statements, then the import succeeds. I can
change my migration utility to generate multiple files, each with a
limited number of INSERTs, to get around this issue, but maybe there's
another/better way?
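
For concreteness, here is a minimal sketch of the chunked-files
workaround described above. generate_statements() is a hypothetical
stand-in for the utility's actual SQL generator, and the default chunk
size comes from the ~50K figure that imported successfully:

    # Write generated INSERT statements into multiple .sql files,
    # each holding at most chunk_size statements, so psql never has
    # to process one huge script in a single run.

    CHUNK_SIZE = 50_000  # ~50K statements imported fine per the post

    def generate_statements():
        """Hypothetical stand-in for the migration utility's output."""
        for i in range(100_000):
            yield f"INSERT INTO things (id) VALUES ({i});"

    def write_chunks(statements, chunk_size=CHUNK_SIZE):
        chunk, count = 0, 0
        out = open(f"migration_{chunk:03d}.sql", "w")
        for stmt in statements:
            if count == chunk_size:
                # Current file is full; start the next chunk file.
                out.close()
                chunk, count = chunk + 1, 0
                out = open(f"migration_{chunk:03d}.sql", "w")
            out.write(stmt + "\n")
            count += 1
        out.close()

    if __name__ == "__main__":
        write_chunks(generate_statements())

Each resulting file can then be fed to psql separately, e.g.
psql -f migration_000.sql mydb, so no single psql invocation has to
hold the whole script.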

Ted

