vacuumdb failed

From: George Robinson II
Subject: vacuumdb failed
Date:
Msg-id: 39A6BF17.9D60C9E@eurekabroadband.com
Replies: Re: vacuumdb failed  (Tom Lane <tgl@sss.pgh.pa.us>)
List: pgsql-general
    Last night, while my Perl script was doing a huge insert operation, I
got this error...

DBD::Pg::st execute failed: ERROR:  copy: line 4857, pg_atoi: error
reading "2244904358": Result too large

    Now, I'm not sure if this is related, but while trying to do vacuumdb
<dbname>, I got...

NOTICE:  FlushRelationBuffers(all_flows, 500237): block 171439 is
referenced (private 0, global 1)
FATAL 1:  VACUUM (vc_repair_frag): FlushRelationBuffers returned -2
pqReadData() -- backend closed the channel unexpectedly.
        This probably means the backend terminated abnormally
        before or while processing the request.
connection to server was lost
vacuumdb: vacuum failed

    Any ideas?  I'm trying a couple of other things right now.  By the way,
this database has one table that is HUGE.  What is the limit on table
size in PostgreSQL 7?  The FAQ says unlimited.  If that's true, how do
you get around the 2G file size limit that (at least) I have in Solaris
2.6?
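
For context: PostgreSQL stores a large table as a chain of 1 GB segment
files on disk, so no single file ever has to exceed the operating system's
limit; that is how "unlimited" coexists with a 2 GB filesystem cap. A rough
size check from the system catalog, assuming all_flows is the big table
(relpages is only refreshed by VACUUM/ANALYZE, so the figure is
approximate):

-- Estimate table size; 8192 bytes is the default block size.
SELECT relname,
       relpages,
       relpages * 8192 AS approx_bytes
FROM pg_class
WHERE relname = 'all_flows';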

Thank you.

-g2
