Re: large numbers of inserts out of memory strategy

From: Rob Sargent
Subject: Re: large numbers of inserts out of memory strategy
Date:
Msg-id: 5570090c-885b-73ad-529d-f217fd197b2b@gmail.com
In response to: Re: large numbers of inserts out of memory strategy  (Ted Toth <txtoth@gmail.com>)
Responses: Re: large numbers of inserts out of memory strategy  (Rory Campbell-Lange <rory@campbell-lange.net>)
List: pgsql-general
On 11/28/2017 10:50 AM, Ted Toth wrote:
> On Tue, Nov 28, 2017 at 11:19 AM, Rob Sargent <robjsargent@gmail.com> wrote:
>>> On Nov 28, 2017, at 10:17 AM, Ted Toth <txtoth@gmail.com> wrote:
>>>
>>> I'm writing a migration utility to move data from non-rdbms data
>>> source to a postgres db. Currently I'm generating SQL INSERT
>>> statements involving 6 related tables for each 'thing'. With 100k or
>>> more 'things' to migrate I'm generating a lot of statements, and when I
>>> try to import them using psql, postgres fails with 'out of memory' when
>>> running on a Linux VM with 4G of memory. If I break it into smaller
>>> chunks, say ~50K statements, then the import succeeds. I can change my
>>> migration utility to generate multiple files each with a limited
>>> number of INSERTs to get around this issue but maybe there's
>>> another/better way?
>>>
>>> Ted
>>>
>> what tools / languages are you using?
> I'm using python to read the binary source files and create the text files
> containing the SQL. Then I'm running psql -f <file containing SQL>.
If you're going out to the file system, I would use COPY of CSV files
(if the number of records per table is non-trivial).  Any bulk-loading
python available?
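
For illustration, a minimal sketch of that COPY approach from Python with
psycopg2 (the connection string, table, columns, and CSV file name are
assumptions for the example, not details from this thread):

    import psycopg2

    # Bulk-load one CSV file produced by the migration script via COPY,
    # which streams rows instead of parsing thousands of INSERT statements.
    conn = psycopg2.connect("dbname=target")   # assumed connection string
    try:
        with conn.cursor() as cur:
            with open("things.csv") as f:      # hypothetical CSV for one table
                cur.copy_expert(
                    "COPY things (id, name, payload) FROM STDIN WITH (FORMAT csv)",
                    f,
                )
        conn.commit()
    finally:
        conn.close()

The same file can also be loaded from the shell with psql's \copy
metacommand, which runs COPY ... FROM STDIN on the client side and so
needs no server-side file access.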


