Thousands of schemas and ANALYZE goes out of memory

From: Hugo
Subject: Thousands of schemas and ANALYZE goes out of memory
Date:
Msg-id 1349121151898-5726198.post@n5.nabble.com
Replies: Re: Thousands of schemas and ANALYZE goes out of memory  (Tom Lane <tgl@sss.pgh.pa.us>)
Re: Thousands of schemas and ANALYZE goes out of memory  (Jeff Janes <jeff.janes@gmail.com>)
List: pgsql-general
Hi everyone,

We have two PostgreSQL 9.0 databases (32-bit) with more than 10,000
schemas. When we try to run ANALYZE on those databases, we get errors like
this after a few hours:

2012-09-14 01:46:24 PDT ERROR:  out of memory
2012-09-14 01:46:24 PDT DETAIL:  Failed on request of size 421.
2012-09-14 01:46:24 PDT STATEMENT:  analyze;

(Note that we do have plenty of memory available for postgresql:
shared_buffers=2048MB, work_mem=128MB, maintenance_work_mem=384MB,
effective_cache_size = 3072MB, etc.)

We have other similar databases with fewer than 10,000 schemas, and ANALYZE
works fine on them (they run on similar machines with similar configs). For
now, we had to create shell scripts that run ANALYZE per schema, table by
table. That works, so at least we have a workaround. But what exactly
causes the out-of-memory error? Is PostgreSQL trying to run everything in a
single transaction? Maybe this could be improved in future releases.
Please let me know what you guys think.
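For reference, a minimal sketch of the per-table workaround described above. The function name `generate_analyze` and the use of pg_stat_user_tables are my own illustration, not the exact script we use; it simply turns a list of "schema.table" names into individual ANALYZE statements, so each table is analyzed in its own statement rather than one database-wide "analyze;":

```shell
#!/bin/sh
# Hypothetical helper: read "schema.table" names (one per line) from stdin
# and emit one ANALYZE statement per table.
generate_analyze() {
  while read -r tbl; do
    # Skip blank lines, print an ANALYZE statement for each table name.
    [ -n "$tbl" ] && printf 'ANALYZE %s;\n' "$tbl"
  done
}

# In practice the table list would come from psql, e.g. (illustrative):
#   psql -At -c "SELECT quote_ident(schemaname) || '.' || quote_ident(relname)
#                FROM pg_stat_user_tables" mydb | generate_analyze | psql mydb
```

Since each statement is sent separately, no single transaction has to hold the analysis state for all 10,000+ schemas at once.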

Thanks in advance,
Hugo



--
View this message in context:
http://postgresql.1045698.n5.nabble.com/Thousands-of-schemas-and-ANALYZE-goes-out-of-memory-tp5726198.html
Sent from the PostgreSQL - general mailing list archive at Nabble.com.


In the pgsql-general list, by date sent:

Previous
From: yary
Date:
Message: Pg, Netezza, and... Sybase?
Next
From: Shaun Thomas
Date:
Message: Re: Securing .pgpass File?