Yuri Budilov <yuri.budilov@hotmail.com> writes:
> Posted on Stack Overflow, sadly no replies, so trying here....
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS "order"
> FROM table_name
> WHERE blah;
> I get out of memory error.
> The JSON column is ~5 MB, and its 'orders' array holds ~150,000 elements.
I tried to reproduce that, and couldn't, given the available info.
I made a JSON value of more or less that size with
perl -e 'print "{\"orders\": [0"; for($i=1;$i<=150000;$i++){print ",$i"}; print "]}\n";' >jsonval
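For anyone without perl handy, a roughly equivalent generator (a sketch, assuming Python 3; it emits the same `{"orders": [0,1,...,150000]}` value, spaces matched to the perl output) is:

```python
import json

# Same value as the perl one-liner: an object whose "orders" key
# holds the integers 0 through 150000 inclusive (150,001 elements).
doc = {"orders": list(range(150001))}

# separators chosen to match the perl output: no space after commas,
# a space after the colon.
text = json.dumps(doc, separators=(",", ": "))

with open("jsonval", "w") as f:
    f.write(text + "\n")
```

The file can then be loaded with `\copy table_name from jsonval` exactly as in the transcript below.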
and then did
regression=# create table table_name(json_rmq json);
CREATE TABLE
regression=# \copy table_name from jsonval
COPY 1
regression=# insert into table_name select * from table_name;
INSERT 0 1
regression=# insert into table_name select * from table_name;
INSERT 0 2
regression=# insert into table_name select * from table_name;
INSERT 0 4
regression=# insert into table_name select * from table_name;
INSERT 0 8
regression=# insert into table_name select * from table_name;
INSERT 0 16
regression=# insert into table_name select * from table_name;
INSERT 0 32
regression=# insert into table_name select * from table_name;
INSERT 0 64
regression=# insert into table_name select * from table_name;
INSERT 0 128
regression=# CREATE TABLE X AS
SELECT json_array_elements(json_rmq -> 'orders'::text) AS "order"
FROM table_name;
SELECT 38400256
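That row count checks out: eight self-joining INSERTs double one row into 256 rows, and each row's array unnests into 150,001 elements (0 through 150000 inclusive). A quick arithmetic check:

```python
rows = 1
for _ in range(8):          # eight INSERT ... SELECT * FROM table_name doublings
    rows *= 2
elements_per_row = 150001   # the array holds 0 through 150000 inclusive
total = rows * elements_per_row
print(total)                # 38400256, matching "SELECT 38400256"
```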
Watching the process with "top", I saw its memory consumption stay
rock-steady. If there's a leak in there, this example doesn't
show it. There could be a leak related to some detail you
failed to mention, but ...
regards, tom lane