Backup Large Tables

From: Charles Ambrose
Subject: Backup Large Tables
Date:
Msg-id: 61ca079e0609211954g572d4ef8hd6ed8bb597eab99e@mail.gmail.com
Replies: Re: Backup Large Tables  ("Michael Nolan" <htfoot@gmail.com>)
         Re: Backup Large Tables  (Vivek Khera <vivek@khera.org>)
List: pgsql-general
Hi!

I have fairly large database tables (say, an average of 3 to 4 million records each). Using the pg_dump utility takes forever to dump these tables. As an alternative, I wrote a program that fetches all the data from a table and writes it to a text file, but that attempt to dump the database was also unsuccessful.

As I see it, I can dump the tables by fetching the data gradually and writing it out in pieces. I plan to select a batch of records from the table, dump it to a text file, and repeat until all records in the table have been written. With this approach I need a primary key that uniquely identifies each record, so that each pass over the table does not fetch data that has already been processed.
The problem with this approach, though, is that my dumping utility will not be generic.
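A minimal sketch of that chunked, key-based export, assuming a Python/psycopg2 client, a hypothetical table big_table with an integer primary key id, and made-up column and connection names (none of these are from the original post):

    import psycopg2

    # Hypothetical connection string and table/column names; adjust to your schema.
    conn = psycopg2.connect("dbname=mydb user=postgres")
    cur = conn.cursor()

    BATCH = 100000
    last_id = 0

    with open("big_table.dump.txt", "w") as out:
        while True:
            # Fetch the next batch strictly after the last key already written,
            # so no row is read twice.
            cur.execute(
                "SELECT id, col1, col2 FROM big_table "
                "WHERE id > %s ORDER BY id LIMIT %s",
                (last_id, BATCH),
            )
            rows = cur.fetchall()
            if not rows:
                break
            for row in rows:
                out.write("\t".join(str(v) for v in row) + "\n")
            last_id = rows[-1][0]

    cur.close()
    conn.close()

The WHERE id > last_id ORDER BY id LIMIT n condition (keyset pagination) is what keeps each pass from re-reading rows that were already dumped, but it is also why the utility is tied to knowing the table's primary key.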

Are there any alternatives?

Thanks for your help in advance.

Thanks!

