Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects

From: Giuseppe Broccolo
Subject: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Date:
Msg-id: 524AAB93.7070308@2ndquadrant.it
In response to: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects  (Sergey Klochkov <klochkov@iqbuzz.ru>)
Responses: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects  (Sergey Klochkov <klochkov@iqbuzz.ru>)
Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects  (bricklen <bricklen@gmail.com>)
List: pgsql-admin
Maybe you can improve your database's performance by tuning some parameters:
>
> PostgreSQL configuration:
>
> listen_addresses = '*'          # what IP address(es) to listen on;
> port = 5432                             # (change requires restart)
> max_connections = 500                   # (change requires restart)
Consider lowering this to 100, the default value; each connection reserves per-backend memory, so 500 is excessive here.
> shared_buffers = 16GB                  # min 128kB
This value should not be higher than 8GB
> temp_buffers = 64MB                     # min 800kB
> work_mem = 512MB                        # min 64kB
> maintenance_work_mem = 30000MB          # min 1MB
Given 96GB of RAM, you could set it to around 4800MB (roughly 5% of RAM); 30000MB is far more than maintenance operations need.
> checkpoint_segments = 70                # in logfile segments, min 1,
> 16MB each
> effective_cache_size = 50000MB
Given 96GB of RAM, you could set it up to 80GB; this parameter only informs the planner about available OS cache, so a higher estimate is safe.
>
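
Putting the suggestions above together, a revised postgresql.conf fragment might look like the sketch below. The exact numbers are illustrative for a 96GB machine, not measured recommendations; tune and benchmark for your own workload:

```
# postgresql.conf fragment (sketch, assuming 96GB RAM)
listen_addresses = '*'
port = 5432
max_connections = 100            # default; 500 wastes per-backend memory
shared_buffers = 8GB             # capped at 8GB as suggested above
temp_buffers = 64MB
work_mem = 512MB                 # per sort/hash operation, per backend
maintenance_work_mem = 4800MB    # ~5% of RAM instead of 30000MB
checkpoint_segments = 70         # 16MB each
effective_cache_size = 80GB      # planner hint, ~80% of RAM
```

Most of these changes (all except the planner hints) require a server restart to take effect.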

Hope it can help.

Giuseppe.

--
Giuseppe Broccolo - 2ndQuadrant Italy
PostgreSQL Training, Services and Support
giuseppe.broccolo@2ndQuadrant.it | www.2ndQuadrant.it


