Re: large numbers of inserts out of memory strategy

From: Steven Lembark
Subject: Re: large numbers of inserts out of memory strategy
Date:
Msg-id: 20171201114942.177b394c@wrkhors.com
In reply to: Re: large numbers of inserts out of memory strategy (Ted Toth <txtoth@gmail.com>)
List: pgsql-general
On Thu, 30 Nov 2017 08:43:32 -0600
Ted Toth <txtoth@gmail.com> wrote:

> What is the downside of using a DO block? I'd have to do a nextval on
> each sequence before I could use currval, right? Or I could do 'select
> last_value from <sequence>'.

You are creating a piece of code that has to be parsed, tokenized,
and compiled prior to execution. What's biting you is that you've
created a function the size of your dataset.

If you like do-blocks then write a short block to insert one record
using placeholders and call it a few zillion times.
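In plain SQL the same idea can be sketched roughly as below (assuming a made-up
table yourtable( field1 int, field2 text )); a server-side prepared statement
stands in for the short reusable block, since it gets parsed once and then
re-executed with new values:

    -- parse and plan the insert once
    PREPARE insert_row ( int, text ) AS
        insert into yourtable ( field1, field2 ) values ( $1, $2 );

    -- re-run it once per row, only the values change
    EXECUTE insert_row ( 1, 'first row'  );
    EXECUTE insert_row ( 2, 'second row' );

    DEALLOCATE insert_row;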

That or (in DBI-speak):


    eval
    {
        $dbh->{ RaiseError } = 1;   # die on errors instead of returning undef
        $dbh->{ AutoCommit } = 0;   # batch all of the inserts into one transaction

        # parse the statement once, re-use it for every row
        my $sth = $dbh->prepare
        (
            'insert into yourtable ( field1, field2 ) values ( $1, $2 )'
        );

        $sth->execute( @$_ ) for @rows;

        $dbh->commit;
    }
    or die "Failed execution: $@";

which will be nearly as efficient in the long run.

That or just import the data from a CSV/TSV file (there are good
examples of data import available in the PG docs).
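For example, something along these lines (assuming a hypothetical
tab-separated file /tmp/yourtable.tsv with the same two columns):

    -- text format defaults to tab-delimited, one row per line
    COPY yourtable ( field1, field2 )
        FROM '/tmp/yourtable.tsv';

(run it through psql's \copy instead if the file lives on the client
rather than on the database server).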

-- 
Steven Lembark                                       1505 National Ave
Workhorse Computing                                 Rockford, IL 61103
lembark@wrkhors.com                                    +1 888 359 3508

