It looks like with 1.8 GB already in use there is not much left for the dump to allocate the required chunk of memory. I'm not sure whether it will help, but try increasing the swap space...
For the logfile content, see http://www.rafb.net/paste/results/cvD7uk33.html

- cat /proc/sys/kernel/shmmax is 2013265920
- ulimit is unlimited
- kernel is 2.6.15-1-em64t-p4-smp
- pg version is 8.1.0 (32-bit)
- postmaster process usage is 1.8 GB RAM at the moment
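As a side note (my own back-of-the-envelope arithmetic, not taken from the logs): the failed request of 1073741823 bytes is just under 1 GiB. A 32-bit postmaster only has roughly 3 GB of usable address space on Linux, so with ~1.8 GB already in use a further ~1 GiB allocation can fail regardless of how much physical RAM or swap the 64-bit box has:

```shell
# Failed allocation size from the error message, in MiB (integer division).
echo $(( 1073741823 / 1024 / 1024 ))    # prints 1023

# Rough address-space requirement: current usage plus the failed request.
# 2824 MiB is close to the ~3 GB user-space limit of a 32-bit Linux process.
echo $(( 1800 + 1024 ))                 # prints 2824
```

If that is what is happening here, a 64-bit postmaster build would be the more direct fix than adding swap.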
Thanks,
Thomas
Shoaib Mir wrote:
> Can you please show the dbserver logs and syslog at the same time when
> it goes out of memory...
>
> Also, how much available RAM do you have, and what is SHMMAX set to?
>
> ------------
> Shoaib Mir
> EnterpriseDB (www.enterprisedb.com <http://www.enterprisedb.com>)
>
> On 12/15/06, *Thomas Markus* <t.markus@proventis.net
> <mailto:t.markus@proventis.net>> wrote:
>
>     Hi,
>
>     I'm running pg 8.1.0 on a Debian Linux (64-bit) box (dual Xeon, 8 GB RAM).
>     pg_dump reports an error when exporting a large table with blobs
>     (the largest blob is 180 MB).
>
>     The error is:
>     pg_dump: ERROR: out of memory
>     DETAIL: Failed on request of size 1073741823.
>     pg_dump: SQL command to dump the contents of table "downloads" failed:
>     PQendcopy() failed.
>     pg_dump: Error message from server: ERROR: out of memory
>     DETAIL: Failed on request of size 1073741823.
>     pg_dump: The command was: COPY public.downloads ... TO stdout;
>
>     If I try pg_dump with -d, the dump runs with all formats (c, t, p), but I
>     can't restore (out of memory error, or "corrupt tar header at ...").
>
>     How can I back up (and restore) such a database?
>
>     kr
>     Thomas
>
> ---------------------------(end of broadcast)---------------------------
> TIP 4: Have you searched our list archives?
>
> http://archives.postgresql.org
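For what it's worth, a rough estimate (my own sketch, not something stated in the thread) of why a 180 MB bytea value can trigger an allocation near 1 GiB during COPY: in text-format COPY output, a non-printable byte can be escaped to as many as five characters (a doubled backslash plus three octal digits, `\\ooo`), so the worst-case output buffer is about five times the stored size:

```shell
# Hypothetical worst case: every byte of the blob escaped as \\ooo (5 chars each).
blob_mb=180
worst_case_mb=$(( blob_mb * 5 ))
echo "$worst_case_mb MB"    # prints "900 MB" -- in the neighbourhood of the
                            # failed ~1 GiB request once internal buffer
                            # growth (doubling) is taken into account
```

The actual expansion factor depends on the data and the server version, but it illustrates why a blob far smaller than 1 GB can still exhaust a 32-bit process during COPY.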