Re: Problem w/ dumping huge table and no disk space

From: Andrew Gould
Subject: Re: Problem w/ dumping huge table and no disk space
Date:
Msg-id: 20010907215210.55121.qmail@web13403.mail.yahoo.com
In reply to: Problem w/ dumping huge table and no disk space  (David Ford <david@blue-labs.org>)
List: pgsql-general
Have you tried dumping individual tables separately
until it's all done?
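
For example, a per-table run piped through gzip might look like this (a minimal sketch; "db_name" and "table1" are placeholders for your own names, and you'd repeat it once per table):

pg_dump -t table1 db_name | gzip -c > table1.gz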

I've never used the -Z option, so I can't compare its
compression to piping a pg_dump through gzip.
However, this is how I've been doing it:

pg_dump db_name | gzip -c > db_name.gz

I have a 2.2 GB database that gets dumped and compressed to a 235 MB file.
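
Since the goal is to move the data to a larger machine anyway, another option (just a sketch, not something from this thread; the user, host, and output path are placeholders, and it assumes ssh access between the two machines) would be to stream the dump straight to the other box so nothing is written to the full disk:

pg_dump syslog | gzip -c | ssh user@bigger-host 'cat > /path/to/syslog.sql.gz'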

Andrew

--- David Ford <david@blue-labs.org> wrote:
> Help if you would please :)
>
> I have a 10 million+ row table and I've only got a couple hundred megs
> left. I can't delete any rows; pg runs out of disk space and crashes.
> I can't pg_dump with compression: the output file is started, has the
> schema and a bit of other info comprising about 650 bytes, runs for 30
> minutes, and then pg runs out of disk space and crashes. My pg_dump cmd is:
> "pg_dump -d -f syslog.tar.gz -F c -t syslog -Z 9 syslog".
>
> I want to dump this database (the entire pgsql dir is just over two gigs)
> and put it on another, larger machine.
>
> I can't afford to lose this information; are there any helpful hints?
>
> I'll be happy to provide more information if desired.
>
> David


