Balkrishna Sharma <> wrote:
> I wish to do performance testing of 1000 simultaneous read/write
> to the database.
You should definitely be using a connection pool of some sort. Both
your throughput and response time will be better that way. You'll
want to test with different pool sizes, but I've found that a pool
size which keeps the number of active queries in PostgreSQL
somewhere around (number_of_cores * 2) + effective_spindle_count
is near optimal.
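As a rough sketch of that heuristic (the core count defaults to the local machine's; the spindle count is something you have to supply yourself, since it can't be auto-detected):

```python
import os

def suggested_pool_size(cores=None, spindles=1):
    """Heuristic pool size: (number_of_cores * 2) + effective_spindle_count."""
    if cores is None:
        # Fall back to the local CPU count if none is given.
        cores = os.cpu_count() or 1
    return cores * 2 + spindles

# For example, an 8-core server with 2 data disks:
print(suggested_pool_size(cores=8, spindles=2))  # -> 18
```

Treat the result as a starting point for your testing, not a fixed answer; with SSDs or heavily cached data the effective spindle count is debatable, which is exactly why you benchmark several sizes.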
> My question is: Am I losing something by firing these queries
> directly off the server and should I look at firing the queries
> from different IP address (as it would happen in a web application).
If you run the client side of your test on the database server, the
CPU time used by the client will probably distort your results. I
would try using one separate machine to generate the requests, but
monitor to make sure that the client machine isn't hitting some
bottleneck (like CPU time). If the client is the limiting factor,
you may need to use more than one client machine. No need to use
1000 different client machines. :-)
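One way to simulate 1000 logical requests from a single client machine while still capping in-flight concurrency at the pool size is a bounded worker pool. This is only a sketch: `run_query` is a hypothetical stand-in for a real database round trip, and `POOL_SIZE` assumes the sizing heuristic above.

```python
from concurrent.futures import ThreadPoolExecutor

POOL_SIZE = 18  # e.g. (8 cores * 2) + 2 spindles, per the heuristic

def run_query(i):
    # Stand-in for a real database round trip; swap in your
    # driver's connect/execute call here.
    return i * i

# 1000 logical requests, but at most POOL_SIZE in flight at once.
with ThreadPoolExecutor(max_workers=POOL_SIZE) as pool:
    results = list(pool.map(run_query, range(1000)))

print(len(results))  # -> 1000
```

While a run like this is going, watch CPU, memory, and network on the client box; if any of them saturate, split the same load across a second client machine rather than trusting the distorted numbers.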