Re: best practice in archiving CDR data

From	Edgardo Portal
Subject	Re: best practice in archiving CDR data
Date
Msg-id	hoqc8n$ieb$1@news.eternal-september.org
In response to	best practice in archiving CDR data  (Juan Backson <juanbackson@gmail.com>)
Responses	Re: best practice in archiving CDR data  (David Fetter <david@fetter.org>)
List	pgsql-general
On 2010-03-29, Juan Backson <juanbackson@gmail.com> wrote:
>
> Hi,
>
> I am using Postgres to store CDR data for VoIP switches.  The data size
> quickly grows to a few TB.
>
> What I would like to do is to be able to regularly archive the oldest data
> so only the most recent 6 months of data is available.
>
> All that old data should be stored in a format that can be loaded back
> either into a DB table or flat files.
>
> Does anyone know how I should go about doing this?  Is there an existing
> tool that can already do it?
>
> thanks,
> jb

FWIW, I partition by ISO week, use INSERT RULEs to route CDRs to the correct
partition (keeping about 3 partitions "open" to new CDRs at any one time),
use pg_dump to archive partition tables to off-line storage, and
DROP TABLE to keep the main DBs to about 40 weeks of data. I used
to use monthly partitioning, but the file sizes got a bit awkward
to deal with.
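A rough sketch of the weekly-partition setup, using the inheritance-plus-rule approach current in PostgreSQL at the time (table and column names here are illustrative, not from my actual schema):

```sql
-- Hypothetical parent table; children hold one ISO week each.
CREATE TABLE cdr (
    call_id     bigint,
    started_at  timestamptz NOT NULL,
    duration    interval,
    src         text,
    dst         text
);

-- Child partition for ISO week 13 of 2010, with a CHECK constraint
-- so constraint exclusion can skip it on range queries.
CREATE TABLE cdr_2010w13 (
    CHECK (started_at >= '2010-03-29' AND started_at < '2010-04-05')
) INHERITS (cdr);

-- INSERT rule that routes matching rows from the parent to the child.
CREATE RULE cdr_2010w13_insert AS
    ON INSERT TO cdr
    WHERE (NEW.started_at >= '2010-03-29' AND NEW.started_at < '2010-04-05')
    DO INSTEAD
    INSERT INTO cdr_2010w13 VALUES (NEW.*);
```

You'd keep a rule like that for each of the ~3 "open" partitions and drop the rule once a partition is closed to new CDRs.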

When I need to restore old CDRs (e.g. to service a subpoena) I
use pg_restore to load the needed CDRs to a throwaway database
and process as necessary.
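The archive/restore cycle looks roughly like this (database and partition names are again made up for illustration):

```shell
# Archive a closed weekly partition to a compressed custom-format dump:
pg_dump -Fc -t cdr_2009w40 mydb > cdr_2009w40.dump

# After verifying the dump and copying it off-line, reclaim the space:
psql -d mydb -c 'DROP TABLE cdr_2009w40;'

# Later, to service a subpoena: restore into a throwaway database.
createdb cdr_scratch
pg_restore -d cdr_scratch cdr_2009w40.dump
```

One caveat: since the child was created with INHERITS, the dump may reference the parent table, so you may need the parent's definition in the scratch database before the restore will succeed.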

In the pgsql-general list, by date sent:

Previous
From: Pavel Stehule
Date:
Message: Re: Splitting text column to multiple rows
Next
From: Rick Casey
Date:
Message: Re: optimizing import of large CSV file into partitioned table?