Tom,
Thanks for the reply. You are correct, I don't really care about
the behavior at less than the maximum number of rows. I guess I was
just a bit mystified by the difference in behavior. As of now, I'll
stop worrying about it.
-=beeky
Tom Lane wrote:
> "Wm.A.Stafford" <stafford@marine.rutgers.edu> writes:
>
>> I looked at the plan for both versions of the query and, as Albe
>> suggested, the plans were different. I don't know how to analyze the
>> plans, but I'm guessing that as the number of records returned gets
>> larger, setup time becomes a smaller fraction of the whole process,
>> so more time can be spent on setup. This would explain the apparent
>> inactivity of the application running the query when the limit went
>> from 850,000 to 1,000,000.
>>
>
> Yeah, it's supposed to do that. The question you really need to be
> asking is what the total elapsed time is to do the whole query. If
> there's a serious slowdown, then the planner is choosing to cut over
> too soon, which could probably be improved by adjusting the cost
> parameters.
>
> But, do you really care about the behavior with intermediate LIMIT
> values? I would think that you might have a small limit on this query
> while doing testing, but when you go to do the real data processing
> you won't have a LIMIT at all.
>
> regards, tom lane
>