Discussion: out of memory for query result
I have a PostGIS table with about 2 million polyline records. The largest number of points in any geometry field is about 500. I have a simple DBD::Pg Perl program that selects most of these records and does some processing on them before writing them to a file. Unfortunately, I keep getting this error:

DBD::Pg::st execute failed: out of memory for query result
DBD::Pg::st fetchrow_array failed: no statement executing

The program works fine with my next-largest table, which has fewer than a million records. I believe it is failing at the first execute after the prepare on this table. Any ideas on how I can do this?

Thanks,
Tim
On Fri, Apr 29, 2005 at 10:47:36AM -0500, ttsai@pobox.com wrote:
> DBD::Pg::st execute failed: out of memory for query result

Have you considered using a cursor to fetch the query results? That should prevent the API from trying to load the entire result set into memory.

--
Michael Fuhr
http://www.fuhr.org/~mfuhr/
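[The cursor approach Michael suggests can be sketched in DBD::Pg roughly as follows. This is a minimal illustration, not code from the thread: the database name, credentials, table, and column names are all hypothetical, and it assumes a reachable PostgreSQL server. The key points are that cursors require a transaction (hence AutoCommit => 0) and that FETCH pulls the rows down in batches instead of materializing the whole result set client-side.]

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- replace with your own.
my ($dbname, $user, $pass) = ('gisdb', 'gisuser', 'secret');

my $dbh = DBI->connect("dbi:Pg:dbname=$dbname", $user, $pass,
                       { AutoCommit => 0, RaiseError => 1 });

# Declare a server-side cursor instead of executing the SELECT directly,
# so the server holds the result set and we pull it down in batches.
# Table/column names here are illustrative.
$dbh->do('DECLARE csr CURSOR FOR SELECT gid, the_geom FROM polylines');

while (1) {
    my $sth = $dbh->prepare('FETCH 1000 FROM csr');
    $sth->execute;
    last if $sth->rows == 0;    # cursor exhausted

    while (my @row = $sth->fetchrow_array) {
        # ... process each polyline and write it to the file ...
    }
}

$dbh->do('CLOSE csr');
$dbh->commit;
$dbh->disconnect;
```

Batch size is a memory/round-trip trade-off: FETCH 1000 keeps at most a thousand rows in client memory at a time, versus the full 2 million for a plain SELECT.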
On Fri, Apr 29, 2005 at 10:58:08AM -0600, Michael Fuhr wrote:
> Have you considered using a cursor to fetch the query results? That
> should prevent the API from trying to load the entire result set
> into memory.

I can do that. I didn't know the API would try to load the entire result set into memory first, though. Yikes!

Tim