Re: Using Postgres to store high volume streams of sensor readings

From: Scott Marlowe
Subject: Re: Using Postgres to store high volume streams of sensor readings
Date:
Msg-id: dcc563d10811221709s38d4e65ex8c821ef65491cf5e@mail.gmail.com
In reply to: Re: Using Postgres to store high volume streams of sensor readings  (Scara Maccai <m_lists@yahoo.it>)
Responses: Re: Using Postgres to store high volume streams of sensor readings  ("Ciprian Dorin Craciun" <ciprian.craciun@gmail.com>)
List: pgsql-general
On Sat, Nov 22, 2008 at 5:54 PM, Scara Maccai <m_lists@yahoo.it> wrote:
> Since you always need the timestamp in your selects, have you tried indexing only the timestamp field?
> Your selects would be slower, but since client and sensor don't have that many distinct values compared to the number
> of rows you are inserting maybe the difference in selects would not be that huge.

Even better might be partitioning on the timestamp.  If all access is
in a certain timestamp range it's usually a big win, especially
because he can move to a new table every hour / day / week or whatever
and merge the old one into a big "old data" table.
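
Roughly, that kind of setup is done with table inheritance plus CHECK
constraints, so the planner can exclude children that can't match the
queried timestamp range (constraint_exclusion has to be enabled).  A
minimal sketch, with made-up table and column names:

  CREATE TABLE readings (
      client_id  integer     NOT NULL,
      sensor_id  integer     NOT NULL,
      ts         timestamptz NOT NULL,
      value      double precision
  );

  -- one child table per day; the CHECK constraint is what lets the
  -- planner skip children outside the queried timestamp range
  CREATE TABLE readings_2008_11_22 (
      CHECK (ts >= '2008-11-22' AND ts < '2008-11-23')
  ) INHERITS (readings);

  CREATE INDEX readings_2008_11_22_ts_idx
      ON readings_2008_11_22 (ts);

  -- inserts go straight into the current child; when the day rolls
  -- over, create the next child, and old children can later be merged
  -- into one big "old data" table and dropped

Queries against the parent that constrain ts then only touch the
relevant children.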
