Re: Streaming large data into postgres [WORM like applications]

From: Kevin Hunter
Subject: Re: Streaming large data into postgres [WORM like applications]
Msg-id: 464689F8.4090608@earlham.edu
In response to: Re: Streaming large data into postgres [WORM like applications]  ("Dhaval Shah" <dhaval.shah.m@gmail.com>)
List: pgsql-general
At 8:49p on 12 May 2007, Dhaval Shah wrote:
> That leads to the question, can the data be compressed? Since the data
> is very similar, any compression would result in some 6x-10x
> compression. Is there a way to identify which partitions are in which
> data files and compress them until they are actually read?

There was a very interesting article in ;login: magazine in April of
this year discussing how the authors dealt with an exorbitant amount
of largely similar data.  The article claimed that, through aggregation
and gzip compression, they were able to reduce what they needed to
store by roughly 350x, to about 0.7 bytes per 'event'.  The article is

The Secret Lives of Computers Exposed: Flight Data Recorder for Windows
by Chad Verbowski

You might try to get your mitts on that article for some ideas.  I'm not
sure you could apply any of their ideas directly to the Postgres backend
data files, but perhaps somewhere in your pipeline.
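
For what it's worth, here's a minimal Python sketch of that general
idea: batch up similar, line-oriented rows and gzip them before they
ever reach the database.  The row format and names are made up for
illustration, not taken from the article.

    import gzip
    import io

    def compress_batch(rows):
        # Similar, line-oriented rows compress very well; the thread
        # above guesses 6x-10x for this kind of data.
        raw = "\n".join(rows).encode("utf-8")
        buf = io.BytesIO()
        with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
            gz.write(raw)
        return buf.getvalue()

    # 10,000 near-identical events, as a stand-in for WORM-style data
    rows = ["2007-05-12 20:49:00 sensor=42 status=OK reading=%d" % (i % 7)
            for i in range(10000)]
    blob = compress_batch(rows)
    print("raw: %d bytes, gzipped: %d bytes, ratio: %.1fx"
          % (len("\n".join(rows)), len(blob),
             float(len("\n".join(rows))) / len(blob)))

You could then store the compressed blob (in a bytea column, or outside
the database entirely) and only decompress a batch when it is actually
read, which is roughly what you were asking about for the partitions.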

Kevin
