Thread: pgdump, large objects and 7.0->7.1


pgdump, large objects and 7.0->7.1

From: Philip Crotwell
Date:
Hi

I am having problems with large objects in 7.0.3: high disk usage, slow
access to and deletion of large objects, and occasional selects that hang
with the backend process going to 98% of the CPU and staying there. Having
read that there are a lot of large object improvements in 7.1, I was
thinking of trying the beta out to see if these problems would disappear.

But 7.0->7.1 needs a pg_dumpall/restore, which wouldn't be a problem except
that pg_dumpall in 7.0 doesn't dump large objects. :(

So, 3 questions that basically boil down to "What is the best way to move
large objects from 7.0 to 7.1."

1) Can I use the 7.1 pgdumpall to dump a 7.0.3 database? The docs say no,
but worth a try.

2) What does "large objects... must be handled manually" in the 7.0 pg_dump
docs mean? Does this mean that there is a way to manually copy the
xinvXXXX files? I have ~23000 of them at present.

3) Do I need to preserve OIDs with pg_dump when using large objects?

thanks,
Philip

PS It would be great if something about this could be added to the 7.1
docs. I would guess that others will have this same problem when 7.1 is
released.



Re: pgdump, large objects and 7.0->7.1

From: Tom Lane
Date:
Philip Crotwell <crotwell@seis.sc.edu> writes:
> So, 3 questions that basically boil down to "What is the best way to move
> large objects from 7.0 to 7.1."

> 1) Can I use the 7.1 pgdumpall to dump a 7.0.3 database? The docs say no,
> but worth a try.

The docs say no, and they mean no.

There is a new contrib utility that can be used to pull large objects
from a 7.0 (or older) database.  Unfortunately the version that's in
contrib right now will only talk to 7.1 :-(.  I've attached a tarfile
for a version that works with 7.0.*.

> 3) Do I need to preserve OIDs with pg_dump when using large objects?

No, pg_dumplo takes care of that.
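For reference, the dump/reload cycle with pg_dumplo looks roughly like the
sketch below. The flags follow the contrib/pg_dumplo README as I understand
it, so check your version's usage output before relying on them; "my_db" and
"/my_dump/dir" are placeholders:

```shell
# Export all large objects from the 7.0 database into a directory tree.
# pg_dumplo also writes an index file there mapping OIDs to dumped files.
pg_dumplo -a -d my_db -s /my_dump/dir

# After dumping and restoring the regular tables into the new 7.1 database,
# import the large objects; pg_dumplo fixes up the OID references itself,
# so you do not need to preserve the old OIDs.
pg_dumplo -i -d my_db -s /my_dump/dir
```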

            regards, tom lane



Attachments