Discussion: Vacuum - Out of memory

Vacuum - Out of memory

From: "Tomeh, Husam"
Date:

I'm getting an out-of-memory error when I try to vacuum a table (any
table, in fact).

=# vacuum analyze code;
ERROR:  out of memory
DETAIL:  Failed on request of size 1073741820.

I'm running Postgres 8.1.1 on a Red Hat 2.6 kernel (HP server). I have
never had this error before, and I run vacuum every night regularly.
When I run the vacuum on a different server with an identical (cloned)
database and settings, vacuum works fine with no error.

My maintenance work area has never been changed; it's set to 1 GB
(maintenance_work_mem = 1048576).
Physical memory: 32 GB. The top utility shows:

Mem:  32752736k total, 14948340k used, 17804396k free,   122732k buffers
Swap:  2144668k total,   165140k used,  1979528k free, 14265048k cached
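(As a side note: the request size in the DETAIL line is, to within a few
bytes, exactly the configured maintenance_work_mem of 1048576 kB. Quick
shell arithmetic confirms it:)

```shell
# 1048576 kB = 1073741824 bytes (1 GiB); the failed request is 4 bytes shy of that
echo $(( 1048576 * 1024 - 1073741820 ))
# prints: 4
```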

I bounced the database and am still getting the same error. I could try
decreasing maintenance_work_mem, but I have plenty of memory and I need
it to process my nightly jobs faster. Any idea why I'm getting the
out-of-memory error?  Thanks,
 __

Husam

**********************************************************************
This message contains confidential information intended only for the use of the addressee(s) named above and may
contain information that is legally privileged.  If you are not the addressee, or the person responsible for delivering
it to the addressee, you are hereby notified that reading, disseminating, distributing or copying this message is
strictly prohibited.  If you have received this message by mistake, please immediately notify us by replying to the
message and delete the original message immediately thereafter.

Thank you.

                                   FADLD Tag
**********************************************************************


Re: Vacuum - Out of memory

From: Tom Lane
Date:
"Tomeh, Husam" <htomeh@firstam.com> writes:
> Any idea why I'm getting out-of-memory error?

Looks like a corrupt-data problem, but it's hard to tell any more than
that with only this much info.  If it affects all tables then the
corruption is probably in a system catalog rather than every one of
the tables.  You might try setting a breakpoint at errfinish to get
a stack trace from the point of the error; that would give us at least
a bit of a clue where the problem is.
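(A sketch of how that might look, assuming gdb is available on the
server; `<backend_pid>` is whatever `SELECT pg_backend_pid();` reports
in the psql session that will run the VACUUM:)

```
$ gdb -p <backend_pid>
(gdb) break errfinish
(gdb) continue
   -- now, in the psql session:  VACUUM ANALYZE code;
(gdb) backtrace
(gdb) detach
```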

            regards, tom lane

Re: Vacuum - Out of memory

From: "Tomeh, Husam"
Date:
Thanks, Tom. I've asked the SA to bounce the server, thinking it might
be a memory fragmentation issue. After bouncing the server, I was able
to run vacuum successfully. (Before bouncing the server, I tried a lower
maintenance_work_mem value of 512 MB and vacuum worked fine, which made
me feel better about the data not being corrupted.)
Do you agree with me at this point that it's not a data corruption
issue? Thanks again.

 __

Husam Tomeh



Re: Vacuum - Out of memory

From: Tom Lane
Date:
"Tomeh, Husam" <htomeh@firstam.com> writes:
> Thanks, Tom. I've asked the SA to bounce the server, thinking it might
> be a memory fragmentation issue. After bouncing the server, I was able
> to run vacuum successfully. (Before bouncing the server, I tried a lower
> maintenance_work_mem value of 512 MB and vacuum worked fine, which made
> me feel better about the data not being corrupted.)
> Do you agree with me at this point that it's not a data corruption
> issue? Thanks again.

No, I don't, unless you have fields in the database that *should* be
1-billion-and-change bytes long.

            regards, tom lane