Pg_dump very large database
From: Nikolay Mihaylov
Subject: Pg_dump very large database
Date:
Msg-id: 005b01c1c126$68490b20$97e309d9@protos
List: pgsql-general
Hi all, I had some problems posting this message to the admin group, but the problem is quite general.

-----Original Message-----
From: Nikolay Mihaylov [mailto:pg@nmmm.nu]
Sent: Wednesday, February 27, 2002 10:01 AM
To: 'pgsql-admin@postgresql.org'
Subject: FW: Pg_dump very large database

Hi all.

I have a database with very large tables, about 2+ GB per table. When I use the pg_dump tool, it consumes all available memory, and then Linux crashes (or the kernel kills most of the processes, including pg_dump).

For this reason I wrote a small PHP script that dumps the data using 'FETCH NEXT' in a cursor (PHP is a poor fit for this, but it works perfectly for me). I tried to patch pg_dump, but I can't understand most of the code (I have never worked with pg blobs from C).

I'm attaching the files I use in order to share them with all of you.

Nikolay.

P.S. The shell scripts are for calling the PHP scripts:
dump1.sh - creates a list of shell commands that need to be executed to get the backup.
dump.sh - backs up a single table.

P.P.S. Is there any interest in discussing very large PostgreSQL database programming and administration, or does such a list already exist?

-----------------------------------------------------------
The Reboots are for hardware upgrades.
Found more here: http://www.nmmm.nu
Nikolay Mihaylov nmmm@nmmm.nu
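The workaround described above (fetching rows in small batches through a cursor instead of materializing the whole table, as the author's PHP script does) can be sketched in Python. This is a minimal illustration of the batching pattern only: the `dump_in_batches` helper and the `FakeCursor` stand-in are hypothetical names, and `fetchmany` is the standard DB-API 2.0 batch-fetch call you would use on a real server-side (named) cursor.

```python
def dump_in_batches(cur, write_row, batch_size=1000):
    """Stream rows from a cursor in small batches, so memory use is
    bounded by batch_size instead of by the table size."""
    total = 0
    while True:
        rows = cur.fetchmany(batch_size)  # DB-API 2.0: returns [] when done
        if not rows:
            break
        for row in rows:
            write_row(row)
        total += len(rows)
    return total


class FakeCursor:
    """Stand-in cursor so this sketch runs without a database;
    with a real driver you would use a server-side (named) cursor,
    which is what makes fetchmany() memory-friendly."""
    def __init__(self, rows):
        self._rows = list(rows)
        self._pos = 0

    def fetchmany(self, n):
        batch = self._rows[self._pos:self._pos + n]
        self._pos += n
        return batch


out = []
n = dump_in_batches(FakeCursor(range(2500)), out.append, batch_size=1000)
```

With a real driver the same loop would run against a cursor opened inside a transaction, which is the key difference from a plain client-side fetch: the server holds the result set, and the client only ever sees one batch at a time.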
Attachments