Re: large numbers of inserts out of memory strategy

From Ted Toth
Subject Re: large numbers of inserts out of memory strategy
Date
Msg-id CAFPpqQGJ9GkntRrjEqvHCRtkbHojmTy93LmCpfuMGJj5F2O7-A@mail.gmail.com
Whole thread Raw
In response to Re: large numbers of inserts out of memory strategy  (Tomas Vondra <tomas.vondra@2ndquadrant.com>)
Responses Re: large numbers of inserts out of memory strategy  (Tomas Vondra <tomas.vondra@2ndquadrant.com>)
List pgsql-general
On Tue, Nov 28, 2017 at 12:01 PM, Tomas Vondra
<tomas.vondra@2ndquadrant.com> wrote:
>
>
> On 11/28/2017 06:54 PM, Ted Toth wrote:
>> On Tue, Nov 28, 2017 at 11:22 AM, Tomas Vondra
>> <tomas.vondra@2ndquadrant.com> wrote:
>>> Hi,
>>>
>>> On 11/28/2017 06:17 PM, Ted Toth wrote:
>>>> I'm writing a migration utility to move data from non-rdbms data
>>>> source to a postgres db. Currently I'm generating SQL INSERT
>>>> statements involving 6 related tables for each 'thing'. With 100k or
>>>> more 'things' to migrate I'm generating a lot of statements and when I
>>>> try to import using psql postgres fails with 'out of memory' when
>>>> running on a Linux VM with 4G of memory. If I break into smaller
>>>> chunks, say ~50K statements, then the import succeeds. I can change my
>>>> migration utility to generate multiple files each with a limited
>>>> number of INSERTs to get around this issue but maybe there's
>>>> another/better way?
>>>>
>>>
>>> The question is what exactly runs out of memory, and how did you modify
>>> the configuration (particularly related to memory).
>>>
>>> regards
>>>
>>> --
>>> Tomas Vondra                  http://www.2ndQuadrant.com
>>> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
>>
>> I'm pretty new to postgres so I haven't changed any configuration
>> settings, and the log is a bit hard for me to make sense of :(
>>
>
> The most interesting part of the log is this:
>
>     SPI Proc: 2464408024 total in 279 blocks; 1672 free (1 chunks);
> 2464406352 used
>       PL/pgSQL function context: 537911352 total in 74 blocks; 2387536
> free (4 chunks); 535523816 used
>
>
> That is, most of the memory is allocated for SPI (2.4GB) and PL/pgSQL
> procedure (500MB). How do you do the load? What libraries/drivers?
>
> regards
>
> --
> Tomas Vondra                  http://www.2ndQuadrant.com
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

I'm doing the load with 'psql -f'. I'm using the 9.6 el6 RPMs on a
CentOS VM, downloaded from the postgres repo.
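The chunking workaround mentioned earlier in the thread (breaking the generated SQL into batches of ~50K statements) can be sketched roughly as below. This is only an illustration, not code from the thread: the file names, the batch size, and the one-statement-per-line assumption are all hypothetical, and each chunk is wrapped in an explicit transaction so a failed chunk can be re-run cleanly.

```python
from itertools import islice

def split_statements(lines, batch_size=50_000):
    """Yield lists of at most batch_size statement lines.

    Assumes one complete SQL statement per line, as a generator
    like the migration utility described above might emit.
    """
    it = iter(lines)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

def write_batches(src_path, batch_size=50_000):
    """Write numbered chunk files; each can be loaded with 'psql -f'."""
    paths = []
    with open(src_path) as src:
        for i, batch in enumerate(split_statements(src, batch_size)):
            path = f"{src_path}.part{i:04d}.sql"
            with open(path, "w") as out:
                # Wrap each chunk in a transaction so a failed chunk
                # rolls back in one piece and can simply be re-run.
                out.write("BEGIN;\n")
                out.writelines(batch)
                out.write("COMMIT;\n")
            paths.append(path)
    return paths
```

The resulting parts could then be loaded one at a time, e.g. `for f in big.sql.part*.sql; do psql -f "$f" mydb; done`, so no single psql invocation has to carry the whole script.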

