Re: Dealing with big tables

From: Mindaugas
Subject: Re: Dealing with big tables
Date:
Msg-id: E1Iyo2z-00055w-E1@fenris.runbox.com
In reply to: Re: Dealing with big tables (Gregory Stark <stark@enterprisedb.com>)
Responses: Re: Dealing with big tables
List: pgsql-performance
> What exactly is your goal? Do you need this query to respond in under a
> specific limit? What limit? Do you need to be able to execute many instances
> of this query in less than 5s * the number of executions? Or do you have more
> complex queries that you're really worried about?

  I'd like this query to respond within a specific time limit. 5s is acceptable now, but 50s later for 10000 rows would be too slow.

> Both Greenplum and EnterpriseDB have products in this space which let you
> break the query up over several servers but at least in EnterpriseDB's case
> it's targeted towards running complex queries which take longer than this to
> run. I doubt you would see much benefit for a 5s query after the overhead of
> sending parts of the query out to different machines and then reassembling the
> results. If your real concern is with more complex queries they may make sense
> though. It's also possible that paying someone to come look at your database
> will find other ways to speed it up.

  I see. This query should also benefit a lot even when run in parallel on a single server, since most of its time is spent waiting for storage to respond.

  Also, someone off-list pointed me to covering indexes in MySQL. They are not supported in PostgreSQL, are they?
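  For context, a covering index is one that includes every column a query references, so the database can answer the query from the index alone without visiting the table. A minimal sketch (the table and index names are hypothetical, for illustration only):

    -- Hypothetical table for illustration.
    CREATE TABLE events (user_id integer, created timestamp, payload text);

    -- A "covering" index for the query below contains every column
    -- the query reads, so the engine can satisfy it from the index
    -- without touching the table itself.
    CREATE INDEX idx_events_user_created ON events (user_id, created);

    -- Reads only indexed columns; no table (heap) access needed
    -- on engines that support index-only access.
    SELECT user_id, created FROM events WHERE user_id = 42;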

  Mindaugas
