Tom Lane wrote in message <16225.937319132@sss.pgh.pa.us>...
>although we still need to do some rejiggering of the cost estimation
>rules; current sources are probably *too* eager to use an indexscan.
>
I did some testing today on a 1.6 million row table of random integers in
the range 0..32767. Using EXPLAIN, I could get the estimated cost of a
"select max(f1) ..." down to about 30000 by adding a where clause of "f1 > 0".
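A roughly equivalent table can be built along these lines (these are not the
exact commands used for the timings below; the table/column names t1/f1 match
the post, but the index name, the population method, and generate_series() --
a newer convenience -- are just one way to set it up):

    -- hypothetical reconstruction of the test setup, not the poster's actual commands
    CREATE TABLE t1 (f1 integer);

    -- 1.6 million random integers in the range 0..32767
    INSERT INTO t1
    SELECT (random() * 32767)::integer
    FROM generate_series(1, 1600000);

    -- an index on f1 is what lets the planner consider an indexscan below
    CREATE INDEX t1_f1_idx ON t1 (f1);

    VACUUM ANALYZE t1;   -- refresh planner statistics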
After running the queries I achieved the following results (dual P133,
w/ 128 megs ram, IDE)...
select max(f1) from t1                      [68 seconds]    [explain cost 60644.00]
select max(f1) from t1 where f1 > 0         [148 seconds]   [explain cost 30416.67]
Note that although the "f1 > 0" form gets a lower cost estimate, it actually
runs much slower, which fits the point above about the indexscan being chosen
too eagerly. Knowing my data has at least one value above 30000, I can apply a
better heuristic than "f1 > 0":
select max(f1) from t1 where f1 > 30000     [12.43 seconds] [explain cost 30416.67]
Until more aggregate-function optimizations are implemented, programmers
might have to use the old melon to speed things up.
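One such trick that is often suggested (not something timed above, and it
depends on the planner being willing to walk the index from the high end) is
to rewrite the aggregate as an ordered, limited select so the f1 index can
answer it directly:

    -- hypothetical rewrite, not benchmarked above: read the max straight
    -- off the high end of the f1 index instead of scanning the table
    select f1 from t1 order by f1 desc limit 1;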
Damond