Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects

From: bricklen
Subject: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Date:
Msg-id: CAGrpgQ_safsytHcJyBwo2fT6Eu01=hJwjiZ2juac1vJQRqCjfg@mail.gmail.com
In response to: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects  (Giuseppe Broccolo <giuseppe.broccolo@2ndquadrant.it>)
List: pgsql-admin

On Tue, Oct 1, 2013 at 4:01 AM, Giuseppe Broccolo <giuseppe.broccolo@2ndquadrant.it> wrote:
Maybe you can improve the performance of your database by changing some parameters appropriately:

max_connections = 500                   # (change requires restart)
Set it to 100, the highest value supported by PostgreSQL

Surely you mean that max_connections = 100 is the *default*?
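For reference, 100 is indeed the default for max_connections, not an upper limit; higher values are allowed (at the cost of more shared memory and, on older releases, a server restart to change). A minimal sketch of how to check this, assuming a psql session against the server in question:

```sql
-- Currently active value (500 in the original poster's configuration).
SHOW max_connections;

-- Compare the active value with the compiled-in default:
-- boot_val is 100 on a stock build.
SELECT name, setting, boot_val
FROM pg_settings
WHERE name = 'max_connections';
```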

In the pgsql-admin list, by date sent:

Previous
From: Sergey Klochkov
Date:
Message: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Next
From: Magnus Hagander
Date:
Message: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects