Re: My Experiment of PG crash when dealing with huge amount of data

From: Jeff Janes
Subject: Re: My Experiment of PG crash when dealing with huge amount of data
Date:
Msg-id: CAMkU=1xDZ-yaK+mzLweqHL1wNCf0MTgh+Vpe1SwmGzzgicBXBQ@mail.gmail.com
In response to: My Experiment of PG crash when dealing with huge amount of data  (高健 <luckyjackgao@gmail.com>)
Responses: Re: My Experiment of PG crash when dealing with huge amount of data  (高健 <luckyjackgao@gmail.com>)
List: pgsql-general
On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao@gmail.com> wrote:
>
>
> postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> chr(int4(random()*26)+65),1024));

The construct "values (srf1, srf2)" generates its entire result set
in memory up front; it does not "stream" its rows to the INSERT
statement on the fly.

To spare memory, you would want to use something like:

insert into test01
  select generate_series, repeat(chr(int4(random()*26)+65), 1024)
  from generate_series(1, 2457600);
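A minimal sketch of the streaming form at a smaller scale, assuming a
test01 schema of (int, text) inferred from the original statement; with
generate_series in the FROM clause the rows are produced one at a time,
so memory use stays flat regardless of the row count:

create table test01 (id int4, name text);

insert into test01
  select g, repeat(chr(int4(random()*26)+65), 1024)
  from generate_series(1, 1000) as g;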

Cheers,

Jeff

