Re: Rapidly finding maximal rows

From: Tom Lane
Subject: Re: Rapidly finding maximal rows
Date:
Msg-id: 13279.1318566353@sss.pgh.pa.us
In response to: Rapidly finding maximal rows  (James Cranch <jdc41@cam.ac.uk>)
List: pgsql-performance
James Cranch <jdc41@cam.ac.uk> writes:
> I have a slow query. I expect that there is either a more intelligent way
> to write it, or that one could make some indexes that would speed it up.
> I've tried various indexes, and am not getting anywhere.
> I'd be grateful for any suggestions. Reasonably full details are below.

Two bits of advice:

1. Avoid unnecessary use of WITH.  It acts as an optimization fence,
which you don't want here.  In particular, the only way to avoid
sorting/aggregating over the whole table is for the outer query's WHERE
conditions on competition_name and academic_year_beginning to get pushed
down into the scans on challenge_entries ... and that can't happen if
there's a WITH in between.
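
   For illustration, here is roughly what that rewrite looks like.  The
   original query isn't quoted in this message, so the entrant_id and
   score columns and the literal filter values below are invented; only
   challenge_entries, competition_name, and academic_year_beginning come
   from the thread.

       -- Hypothetical shape of the problem: the CTE is an optimization
       -- fence, so the outer WHERE can't reach the aggregate's scan.
       WITH per_entrant AS (
           SELECT competition_name,
                  academic_year_beginning,
                  entrant_id,
                  max(score) AS best_score
           FROM challenge_entries
           GROUP BY 1, 2, 3
       )
       SELECT *
       FROM per_entrant
       WHERE competition_name = 'Junior Challenge'
         AND academic_year_beginning = 2010;

       -- Inlining the aggregate lets the planner apply the filter
       -- before sorting/aggregating, so only the relevant rows are read.
       SELECT competition_name,
              academic_year_beginning,
              entrant_id,
              max(score) AS best_score
       FROM challenge_entries
       WHERE competition_name = 'Junior Challenge'
         AND academic_year_beginning = 2010
       GROUP BY 1, 2, 3;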

2. Try increasing work_mem.  I think that your first view would work all
right if it had enough work_mem to go for a HashAgg plan instead of
sort-and-group, even without pushdown of the outer WHERE.  It'd
definitely be faster than what you've got, anyway.
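
   A quick way to test this (the value is illustrative; tune it to your
   RAM and connection count, and the query reuses the hypothetical
   columns from the sketch above):

       SET work_mem = '256MB';

       EXPLAIN ANALYZE
       SELECT competition_name, academic_year_beginning,
              entrant_id, max(score)
       FROM challenge_entries
       GROUP BY 1, 2, 3;
       -- Look for "HashAggregate" instead of "Sort" + "GroupAggregate".

       RESET work_mem;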

The other approach with a window function is probably a lost cause.
Postgres hasn't got a lot of intelligence about optimizing
window-function queries ...

            regards, tom lane
