Re: Optimizing sum() operations
From:        Mark Roberts
Subject:     Re: Optimizing sum() operations
Date:        
Msg-id:      1223075203.12105.432.camel@localhost
In reply to: Re: Optimizing sum() operations ("Dobes Vandermeer" <dobesv@gmail.com>)
Responses:   Re: Optimizing sum() operations
List:        pgsql-novice
> This is just a general theory that occurred to me; it's probably a
> pretty specialized kind of indexing that isn't supported by any RDBMS,
> but it's possible there could be a postgres plugin which did this,
> though. Has anyone heard of something like that?

The problem with this is that there's generally more than one metadata field for each key set, and frequently you might want to sum or max each value. I think the value of having this kind of structure would quickly be offset by the larger index nodes and slower node retrieval times.

> I think that if there are a lot of rows that match the query, it'll
> take a long time, so I thought I'd start inquiring about whether
> anyone has a good algorithm for accelerating these kinds of queries.

The best solution that I've found for things like this is to borrow from data warehousing: if you have a frequently used aggregation of facts, preaggregate (summarize) it and pull from there instead.

-Mark
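
[Editorial note: a minimal sketch of the preaggregation approach described above. The table and column names (sales, sales_daily_summary, etc.) are hypothetical and not from the original thread; the point is only to show a rolled-up table keyed by the grouping columns that is queried instead of the raw facts.]

  -- Hypothetical fact table: one row per individual sale.
  CREATE TABLE sales (
      sale_id     serial  PRIMARY KEY,
      product_id  integer NOT NULL,
      sold_on     date    NOT NULL,
      amount      numeric NOT NULL
  );

  -- Preaggregated summary: one row per (product, day).
  CREATE TABLE sales_daily_summary (
      product_id   integer NOT NULL,
      sold_on      date    NOT NULL,
      total_amount numeric NOT NULL,
      sale_count   bigint  NOT NULL,
      PRIMARY KEY (product_id, sold_on)
  );

  -- Periodic refresh (e.g. nightly): rebuild the summary from the facts.
  TRUNCATE sales_daily_summary;
  INSERT INTO sales_daily_summary (product_id, sold_on, total_amount, sale_count)
  SELECT product_id, sold_on, sum(amount), count(*)
  FROM sales
  GROUP BY product_id, sold_on;

  -- A query that used to scan and sum the fact table, e.g.
  --   SELECT sum(amount) FROM sales WHERE product_id = 42;
  -- can read the much smaller summary instead:
  SELECT sum(total_amount) FROM sales_daily_summary WHERE product_id = 42;

The trade-off is freshness: the summary is only as current as its last refresh, so this suits reporting-style workloads better than queries that need up-to-the-second totals.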