Re: Optimizing sum() operations
From | Sean Davis
---|---
Subject | Re: Optimizing sum() operations
Date |
Msg-id | 264855a00810030451o66171b72uc96e7f302dfd7052@mail.gmail.com
In reply to | Optimizing sum() operations ("Dobes Vandermeer" <dobesv@gmail.com>)
Responses | Re: Optimizing sum() operations
List | pgsql-novice
On Fri, Oct 3, 2008 at 4:51 AM, Dobes Vandermeer <dobesv@gmail.com> wrote:
> I'm currently using sum() to compute historical values in reports;
> basically select sum(amount) on records where date <= '...' and date
> >= '...' and who = X.
>
> Of course, I have an index on the table for who and date, but that
> still leaves potentially thousands of rows to scan.
>
> First, should I be worried about the performance of this, or will
> postgres sum a few thousand rows in a few milliseconds on a decent
> system anyway?
>
> Second, if this is a concern, is there a best practice for optimizing
> these kinds of queries?

You'll need to test to see what performance you get. That said,
indexing is a good place to start. You can always run EXPLAIN and
EXPLAIN ANALYZE on the queries to double-check the planner.

Sean
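As a concrete illustration of the advice above, here is a minimal sketch of the kind of index and EXPLAIN ANALYZE check being suggested. The column names (who, date, amount) come from the query in the original question; the table name transactions and the literal values are assumptions made up for the example:

    -- Hypothetical table name; columns taken from the original query.
    -- A composite index on (who, date) lets the planner satisfy the
    -- "who = X AND date between ..." filter with an index scan.
    CREATE INDEX transactions_who_date_idx ON transactions (who, date);

    -- EXPLAIN ANALYZE executes the query and reports the chosen plan along
    -- with actual row counts and timings, so you can confirm the index is used.
    EXPLAIN ANALYZE
    SELECT sum(amount)
    FROM transactions
    WHERE who = 42
      AND date >= '2008-01-01'
      AND date <= '2008-09-30';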