On Tue, Jan 07, 2003 at 11:14:46AM -0200, Felipe Schnack wrote:
> So, generally selecting all rows from a table and fetching only the
> first one is probably faster than limiting the query to its first row?
>
> On Tue, 2003-01-07 at 11:14, terry@ashtonwoodshomes.com wrote:
> > Further, I think if your query has an order by clause then the whole query
> > is executed, sorted, then all but the limit'd rows are truncated. Hence
> > there is no performance improvement.
> >
> > A very VERY smart database engine could perhaps in some cases use an index
> > to determine in advance the sort and get the rows in the correct order, and
> > hence stop when the limit was reached. But that would be a rare case at
> > best, and I doubt anyone has gone to the brain damage of implementing such
> > complexity considering the very limited payback.
Well, I guess that makes postgresql a very VERY smart database engine. If
you give no limit, postgresql will base its planning on retrieving all
rows. If you specify a LIMIT, it will plan on only calculating those rows.
Play with EXPLAIN and LIMIT on very large tables with indexes. It's fairly
easy to demonstrate.
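
For example (using a hypothetical table big_table with a btree index on
its id column), compare the plans the two queries produce:

```sql
-- No LIMIT: the planner costs the query as if every row will be
-- fetched, so it may well pick a sequential scan plus sort.
EXPLAIN SELECT * FROM big_table ORDER BY id;

-- With LIMIT: the planner only costs producing the first 10 rows,
-- so an index scan that stops early becomes the cheap plan.
EXPLAIN SELECT * FROM big_table ORDER BY id LIMIT 10;
```

The second plan typically shows a Limit node sitting on top of an index
scan, rather than a full sort of the table.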
--
Martijn van Oosterhout <kleptog@svana.org> http://svana.org/kleptog/
> Support bacteria! They're the only culture some people have.