Thread: out of memory for query, partitioning & vacuuming

out of memory for query, partitioning & vacuuming

From: "Lee, Mija"
Date:
Hi -
I'm not a particularly experienced dba, so I'm hoping this isn't a
ridiculous question.
I have a 5 GB table with lots of churn in a 14 GB database. Querying
this one table without limits has just started throwing "out of memory
for query" from multiple clients (psql, java). I'm working with the
developers to look at it from the client side, but I'm wondering if
either partitioning the table or more aggressive vacuuming of this table
would have an effect on the out of memory problem.
I'm running PostgreSQL 8.3.5 on Solaris 10.
Thanks!
Mija



Re: out of memory for query, partitioning & vacuuming

From: Scott Marlowe
Date:
On Wed, Jul 15, 2009 at 12:59 PM, Lee, Mija <mija@scharp.org> wrote:
> Hi -
> I'm not a particularly experienced dba, so I'm hoping this isn't a
> ridiculous question.
> I have a 5 GB table with lots of churn in a 14 GB database. Querying
> this one table without limits has just started throwing "out of memory
> for query" from multiple clients (psql, java).

If you're getting this error on the client side (i.e. it's not a message
coming from the PostgreSQL backend, it's coming from the client
software), nothing you do on the server, configuration- or
partitioning-wise, is going to help.

Accessing the data via a cursor is usually the best way around that error.
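A minimal sketch of the explicit cursor route, usable from any client
(schema.table is a stand-in for the real table):

  -- Cursors only live inside a transaction block.
  BEGIN;
  DECLARE big_cur NO SCROLL CURSOR FOR SELECT * FROM schema.table;
  FETCH 1000 FROM big_cur;   -- returns the next 1000 rows
  FETCH 1000 FROM big_cur;   -- repeat until a FETCH comes back empty
  CLOSE big_cur;
  COMMIT;

On the Java side the JDBC driver can do the equivalent transparently:
set a batch size on the Statement with setFetchSize() and run with
autocommit off, and it streams the result in batches instead of
buffering it all.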

Re: out of memory for query, partitioning & vacuuming

From: Anj Adu
Date:
What are your work_mem settings? work_mem limits the amount of memory a single sort or hash operation can use before spilling to disk. If you have a large value set, a few concurrent sessions can end up using all the available memory.

Read this on work_mem

http://www.postgresql.org/docs/8.3/static/runtime-config-resource.html
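
As a quick sketch, work_mem can be inspected and adjusted per session
without touching postgresql.conf (the 32MB below is just an example
value):

  -- Show the current setting (the 8.3 default is 1MB):
  SHOW work_mem;

  -- Raise it for this session only, e.g. ahead of one big sort:
  SET work_mem = '32MB';

  -- Return to the server default:
  RESET work_mem;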




Re: out of memory for query, partitioning & vacuuming

From: "Lee, Mija"
Date:

My work_mem setting is the default (1MB), but I'm doing a simple "select * from schema.table" with no joins, order by, distinct, or hashes in the statement, so I don't think that's it.

I'm getting the error in psql, so based on what Scott said, it sounds like there's nothing to do on the server side.

Thanks!
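
One footnote for the psql case: there is still a client-side knob.
Setting psql's FETCH_COUNT variable (available since 8.2) makes it run
a plain SELECT through a cursor in batches instead of buffering the
whole result, for example:

  \set FETCH_COUNT 1000
  select * from schema.table;  -- psql now fetches 1000 rows at a time

Note that it only applies to queries psql can wrap in a cursor,
essentially a single SELECT, which this one is.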


