Re: Strategies/Best Practises Handling Large Tables

From: Igor Romanchenko
Subject: Re: Strategies/Best Practises Handling Large Tables
Date:
Msg-id: CAP95Gq=ypN-mapDnYBnXeGfJUMgXtyonfLQ8j=yC3r_tGNiPDQ@mail.gmail.com
In response to: Re: Strategies/Best Practises Handling Large Tables (Chitra Creta <chitracreta@gmail.com>)
List: pgsql-general


On Thu, Nov 15, 2012 at 1:34 PM, Chitra Creta <chitracreta@gmail.com> wrote:
> Thanks for your example Chris. I will look into it as a long-term solution.
>
> Partitioning tables as a strategy worked very well indeed. This will be my short/medium term solution.
>
> Another strategy that I would like to evaluate as a short/medium term solution is archiving old records in a table before purging them.
>
> I am aware that Oracle has a tool that allows records to be exported into a file / archive table before purging them. They also provide a tool to import these records.
>
> Does PostgreSQL have similar tools to export to a file and re-import?
>
> If PostgreSQL does not have a tool to do this, does anyone have any ideas on what file format (e.g. text file containing a table of headers being column names and rows being records) would be ideal for easy re-importing into a PostgreSQL table?
>
> Thank you for your ideas.

PostgreSQL has COPY TO to export records to a file, and COPY FROM to re-import them (http://wiki.postgresql.org/wiki/COPY).
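As a sketch of that archive-then-purge approach (the table, column, and file names below are hypothetical, not from the original thread):

```sql
-- Export rows older than one year to a CSV file with a header row.
-- Note: server-side COPY writes to the database server's filesystem,
-- so the path must be writable by the postgres server process.
COPY (SELECT *
      FROM my_large_table
      WHERE created_at < now() - interval '1 year')
TO '/var/archives/my_large_table_old.csv' WITH (FORMAT csv, HEADER);

-- Purge the archived rows.
DELETE FROM my_large_table
WHERE created_at < now() - interval '1 year';

-- Later, re-import the archived rows into the same table
-- (or into a dedicated archive table with the same columns).
COPY my_large_table
FROM '/var/archives/my_large_table_old.csv' WITH (FORMAT csv, HEADER);
```

For a client-side file (no access to the server's filesystem), psql's \copy meta-command accepts the same syntax but reads and writes files on the client machine.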
