On Thu, Sep 17, 2009 at 1:31 PM, Bill Moran <wmoran@potentialtech.com> wrote:
> In response to Scott Marlowe <scott.marlowe@gmail.com>:
>
>> On Thu, Sep 17, 2009 at 12:56 PM, Alan McKay <alan.mckay@gmail.com> wrote:
>> > Is there any way to limit a query to a certain amount of RAM and / or
>> > certain runtime?
>> >
>> > i.e. automatically kill it if it exceeds either boundary?
>> >
>> > We've finally narrowed down our system crashes and have a smoking gun,
>> > but no way to fix it in the immediate term. This sort of limit would
>> > really help us.
>>
>> Generally speaking, work_mem limits the RAM used.  What are your
>> non-default postgresql.conf settings?
>
> work_mem limits memory usage _per sort_, not per query.
>
> A big query can easily have many sorts. Each sort will be limited to
> work_mem memory usage, but the total could be much higher.
>
> The only way I can think of is to set a per-process limit in the OS and
> allow the OS to kill a process when it gets out of hand.  Not ideal, though.
True, but with a work_mem of 2MB, I can't imagine having enough sorting
going on to need 4GB of RAM.  (2000 sorts?  That's a lot.)  I'm betting
the OP was looking at top and misunderstanding what the numbers mean,
which is pretty common, really.
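A quick sanity check of the arithmetic above (the 2MB work_mem and 4GB
figures are the values from the thread; the rest is just unit conversion):

```python
# With work_mem = 2 MB per sort, how many concurrent sorts would a
# single backend need before it reached 4 GB?
work_mem_mb = 2
target_gb = 4
sorts_needed = (target_gb * 1024) // work_mem_mb  # convert GB to MB, divide
print(sorts_needed)  # 2048 -- roughly the "2000 sorts" in the message
```

For the runtime half of the original question, PostgreSQL's
statement_timeout setting (e.g. SET statement_timeout = '300s') will abort
any statement that runs longer than the configured limit, which may be a
cleaner fix than having the OS kill the backend.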