Re: My Experiment of PG crash when dealing with huge amount of data

From: 高健
Subject: Re: My Experiment of PG crash when dealing with huge amount of data
Date:
Msg-id: CAL454F0wXbvXSmiV7qm0dvGtArbR0jp2yrgjMi1uzqm8AE0eig@mail.gmail.com
In reply to: Re: My Experiment of PG crash when dealing with huge amount of data  (Jeff Janes <jeff.janes@gmail.com>)
Responses: Re: My Experiment of PG crash when dealing with huge amount of data  (Tom Lane <tgl@sss.pgh.pa.us>)
           Re: My Experiment of PG crash when dealing with huge amount of data  (Jeff Janes <jeff.janes@gmail.com>)
List: pgsql-general
>To spare memory, you would want to use something like:

>insert into test01 select generate_series,
>repeat(chr(int4(random()*26)+65),1024) from
>generate_series(1,2457600);

Thanks a lot!

What I am worried about is this: if the data grows rapidly, our customers may end up using too much memory. Is the ulimit command a good idea for PG?
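
For concreteness, here is roughly what I have in mind (a sketch only; the 4 GB cap and the data directory path are just illustrative, not recommendations):

ulimit -v 4194304              # illustrative: cap each process's address space at 4 GB (value is in kB)
pg_ctl -D /path/to/data start  # backends inherit the limit from the postmaster's environment

Note that such a limit applies per backend process, not to the server as a whole.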

Best Regards



2013/9/1 Jeff Janes <jeff.janes@gmail.com>
On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao@gmail.com> wrote:
>
>
> postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> chr(int4(random()*26)+65),1024));

The construct "values (srf1, srf2)" generates its entire result set
in memory up front; it does not "stream" its rows to the INSERT
statement on the fly.

To spare memory, you would want to use something like:

insert into test01 select generate_series,
repeat(chr(int4(random()*26)+65),1024) from
generate_series(1,2457600);

Cheers,

Jeff
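
P.S. One way to see the difference is to EXPLAIN the rewritten statement (a sketch; it assumes the test01 table created earlier in the thread, and the exact plan wording varies across PostgreSQL versions):

explain insert into test01
select generate_series, repeat(chr(int4(random()*26)+65),1024)
from generate_series(1,2457600);
-- Expect a Function Scan on generate_series feeding the Insert node.
-- A function scan's result goes through a tuplestore that can spill to
-- disk, so it need not be held entirely in backend memory.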
