Re: Postgres Connections Requiring Large Amounts of Memory

From: SZŰCS Gábor
Subject: Re: Postgres Connections Requiring Large Amounts of Memory
Date:
Msg-id 004a01c33568$d48235c0$0403a8c0@fejleszt4
In response to: Postgres Connections Requiring Large Amounts of Memory (Dawn Hollingsworth <dmh@airdefense.net>)
List: pgsql-performance
----- Original Message -----
From: "Dawn Hollingsworth" <dmh@airdefense.net>
Sent: Tuesday, June 17, 2003 11:42 AM


> I'm not starting any of my own transactions and I'm not calling stored
> procedures from within stored procedures. The stored procedures do have
> large parameter lists, up to 100. The tables are from 300 to 500

Geez! I don't think it'll help you find the memory leak (if any), but
couldn't you normalize the tables into smaller ones? That may be a pain when
updating (views and rules), but I think it'd be worth it in resources (time
and memory, though maybe not disk space). I wonder what the maximum number
of updated columns is, and how weakly correlated their semantics are within
a single transaction (i.e. one function call), since there are "only" 100
params for a proc.
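
For example, a minimal sketch of the kind of split I mean (the table and
column names, sensor_base, sensor_counters, pkts_in and so on, are made up
for illustration): keep the hot integer counters in a narrow table of their
own, and put a view plus an update rule on top so the application's wide
UPDATEs keep working unchanged.

-- Hypothetical wide table split into two narrower ones sharing a key.
CREATE TABLE sensor_base (
    sensor_id  int4 PRIMARY KEY,
    name       text
);

CREATE TABLE sensor_counters (
    sensor_id  int4 PRIMARY KEY REFERENCES sensor_base,
    pkts_in    int8,
    pkts_out   int8
);

-- A view presenting the old wide layout to the application.
CREATE VIEW sensor AS
    SELECT b.sensor_id, b.name, c.pkts_in, c.pkts_out
    FROM sensor_base b JOIN sensor_counters c USING (sensor_id);

-- A rule so existing UPDATEs against the wide layout still work.
CREATE RULE sensor_upd AS ON UPDATE TO sensor DO INSTEAD (
    UPDATE sensor_base SET name = NEW.name
        WHERE sensor_id = OLD.sensor_id;
    UPDATE sensor_counters
        SET pkts_in = NEW.pkts_in, pkts_out = NEW.pkts_out
        WHERE sensor_id = OLD.sensor_id;
);

Updates that only touch the counters then write only the narrow table, which
is where I'd expect the time and memory savings to come from.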

> columns. 90% of the columns are either INT4 or INT8.  Some of these
> tables are inherited. Could that be causing problems?

Huh. That still leaves 30-50 columns of other types (the size of a fairly
large table in itself, for me) :)
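
As for inheritance: a child table carries every column of its parent plus
its own, so a wide parent makes for even wider children. A minimal sketch
(hypothetical names again):

-- Parent with the shared columns.
CREATE TABLE event (
    event_id  int8,
    ts        timestamp,
    src_ip    inet
);

-- probe_event inherits all of event's columns and adds its own,
-- so a 400-column parent would yield a 401-column child here.
CREATE TABLE probe_event (
    probe_type int4
) INHERITS (event);

I wouldn't expect inheritance by itself to explain the memory use, but it
does multiply the column counts you're juggling.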

G.
------------------------------- cut here -------------------------------

