Backup/Restore of single table in multi TB database

Hi,

I have a large database (multiple TBs) where I'd like to be able to do
a backup/restore of just a particular table (call it foo).  Because
the database is large, the time for a full backup would be
prohibitive.  Also, whatever backup mechanism we do use needs to keep
the system online (i.e., users must still be allowed to update table
foo while we're taking the backup).

After reading the documentation, it seems like the following might
work.  Suppose the database has two tables foo and bar, and we're only
interested in backing up table foo:

1. Call pg_start_backup()

2. Use the pg_class table in the catalog to get the data file names
for tables foo and bar.

3. Copy the system files and the data file for foo.  Skip the data file for bar.

4. Call pg_stop_backup()

5. Copy WAL files generated between 1. and 4. to another location.
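The steps above might be sketched roughly as follows. This is only a hedged sketch of the proposal, not a tested procedure: the backup label `foo_only`, the directory paths, and the `relfilenode` lookup against pg_class are assumptions, and by default the script only prints each command instead of running it (set `DRY_RUN=0` to execute against a real cluster):

```shell
#!/bin/sh
# Sketch of the proposed partial base backup. Whether recovery will
# accept a base backup with bar's data file missing is exactly the
# open question of this post -- this script just mirrors steps 1-5.
# DRY_RUN=1 (the default) prints each command rather than running it.
DRY_RUN=${DRY_RUN:-1}
run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

PGDATA=${PGDATA:-/var/lib/postgresql/data}     # assumed cluster location
BACKUP_DIR=${BACKUP_DIR:-/backup/foo_only}     # assumed backup target

# 1. Tell the server a base backup is starting.
run psql -c "SELECT pg_start_backup('foo_only');"

# 2. Look up foo's relfilenode in pg_class; its data file lives at
#    $PGDATA/base/<database oid>/<relfilenode>.
run psql -Atc "SELECT relfilenode FROM pg_class WHERE relname = 'foo';"

# 3. Copy the cluster-wide files and foo's data file; skip bar's.
run mkdir -p "$BACKUP_DIR"
run cp -r "$PGDATA/global" "$BACKUP_DIR/"
# (also copy the data file found in step 2 under $BACKUP_DIR)

# 4. End the backup; the server records the stop WAL location.
run psql -c "SELECT pg_stop_backup();"

# 5. Archive the WAL segments generated between steps 1 and 4.
run mkdir -p "$BACKUP_DIR/wal"
run cp "$PGDATA/pg_xlog"/* "$BACKUP_DIR/wal/"
```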

Later, if we want to restore the database somewhere with just table
foo, we just use postgres's normal recovery mechanism and point it at
the files we copied in step 3 and the WAL files from step 5.
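A hypothetical restore under this scheme would go through the standard continuous-archiving recovery mechanism: unpack the backed-up files as a new data directory and drop in a recovery.conf whose restore_command fetches the saved WAL segments. The paths below are placeholders matching nothing in particular:

```
# recovery.conf, placed in the restored data directory
restore_command = 'cp /backup/foo_only/wal/%f "%p"'
```

On startup the server would then replay the archived WAL over the partial base backup, which is where any missing-file problem with table bar would presumably surface.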

Does anyone see a problem with this approach (e.g., correctness,
performance, etc.)?  Or is there perhaps an alternative approach using
some other postgresql mechanism that I'm not aware of?

Thanks!
- John

In the pgsql-general list, by date sent:

Previous
From: "Pau Marc Munoz Torres"
Subject: Problems with memory
Next
From: "David Wilson"
Subject: Re: Backup/Restore of single table in multi TB database