* Robert Haas (robertmhaas@gmail.com) wrote:
> I'm interested, but I need maybe a 1GB data set, or smaller. The
> thing that we are benchmarking is the planner, and planning times are
> related to the complexity of the database and the accompanying
> queries, not the raw volume of data. (It's not size that matters,
> it's how you use it?) In fact, in a large database, one could argue
> that there is less reason to care about the planner, because the
> execution time will dominate anyway. I'm interested in complex
> queries in web/OLTP type applications, where you need the query to be
> planned and executed in 400 ms at the outside (and preferably less
> than half of that).
We prefer that our geocoding be fast... :)  Loading one state's worth of
data should give you about the right size (half a GB to 1GB).  I'll try
to put together a good test set this week.
Stephen