Discussion: pg_dump out of memory

pg_dump out of memory

From: Andy Colson
Date:
Hi All,

I moved a physical box to a VM, and set its memory to 1Gig.  Everything
runs fine except one backup:


/pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire

pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
pg_dump: Error message from server: ERROR:  out of memory
DETAIL:  Failed on request of size 1073741823.
pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;

I've been reducing my memory settings:

maintenance_work_mem = 80MB
work_mem = 5MB
shared_buffers = 200MB

But it doesn't seem to make a difference.
The table looks like:

wildfire=# \dt+ ofrrds
                    List of relations
  Schema |  Name  | Type  | Owner | Size  | Description
--------+--------+-------+-------+-------+-------------
  public | ofrrds | table | andy  | 15 MB |


wildfire=# \d ofrrds
               Table "public.ofrrds"
    Column    |          Type          | Modifiers
-------------+------------------------+-----------
  id          | character varying(100) | not null
  updateddate | bigint                 | not null
  bytes       | bytea                  |
Indexes:
     "ofrrds_pk" PRIMARY KEY, btree (id)


wildfire=# select id, length(bytes) from ofrrds;
         id         |  length
-------------------+-----------
  muc_rooms         | 152330241
  packet_count      |  76165121
  muc_users         |  76165121
  sessions          |  76165121
  muc_occupants     |   9520641
  muc_traffic       |
  server_bytes      | 301850625
  proxyTransferRate |  76165121
  server_sessions   | 152330241
  conversations     | 304660481
(10 rows)


I'm not sure how to get this backup to run.  Any hints would be appreciated.

-Andy




Re: pg_dump out of memory

From: Adrian Klaver
Date:
On 07/03/2018 07:43 PM, Andy Colson wrote:
> Hi All,
> 
> I moved a physical box to a VM, and set its memory to 1Gig.  Everything
> runs fine except one backup:
> 
> 
> /pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
> 
> pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
> pg_dump: Error message from server: ERROR:  out of memory
> DETAIL:  Failed on request of size 1073741823.
> pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;
> 
> I've been reducing my memory settings:
> 
> maintenance_work_mem = 80MB
> work_mem = 5MB
> shared_buffers = 200MB
> 
> But it doesn't seem to make a difference.
> The table looks like:
> 
> wildfire=# \dt+ ofrrds
>                     List of relations
>   Schema |  Name  | Type  | Owner | Size  | Description
> --------+--------+-------+-------+-------+-------------
>   public | ofrrds | table | andy  | 15 MB |
> 
> 
> wildfire=# \d ofrrds
>                Table "public.ofrrds"
>     Column    |          Type          | Modifiers
> -------------+------------------------+-----------
>   id          | character varying(100) | not null
>   updateddate | bigint                 | not null
>   bytes       | bytea                  |
> Indexes:
>      "ofrrds_pk" PRIMARY KEY, btree (id)
> 
> 
> wildfire=# select id, length(bytes) from ofrrds;
>          id         |  length
> -------------------+-----------
>   muc_rooms         | 152330241
>   packet_count      |  76165121
>   muc_users         |  76165121
>   sessions          |  76165121
>   muc_occupants     |   9520641
>   muc_traffic       |
>   server_bytes      | 301850625
>   proxyTransferRate |  76165121
>   server_sessions   | 152330241
>   conversations     | 304660481
> (10 rows)
> 
> 
> I'm not sure how to get this backup to run.  Any hints would be 
> appreciated.

Maybe:

1) Try:
    pg_dump -t ofrrds
    to dump only that table.

2) If that works then:

    pg_dump -T ofrrds
    to dump everything but that table.
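
For example, as two separate runs (just a sketch; the output file names here are made up, and it assumes the same -Fc/-U options and the "wildfire" database from your original command):

    pg_dump -Fc -U postgres -t ofrrds -f ofrrds_only.backup wildfire
    pg_dump -Fc -U postgres -T ofrrds -f wildfire_rest.backup wildfire

If the -t run fails with the same error, that at least confirms the problem is isolated to ofrrds.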

> 
> -Andy
> 
> 
> 
> 


-- 
Adrian Klaver
adrian.klaver@aklaver.com


Re: pg_dump out of memory

From: Andy Colson
Date:
On 07/03/2018 10:21 PM, Adrian Klaver wrote:
> On 07/03/2018 07:43 PM, Andy Colson wrote:
>> Hi All,
>>
>> I moved a physical box to a VM, and set its memory to 1Gig.  Everything
>> runs fine except one backup:
>>
>>
>> /pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>>
>> pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>> pg_dump: Error message from server: ERROR:  out of memory
>> DETAIL:  Failed on request of size 1073741823.
>> pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;
>>
>>
>>
>> I'm not sure how to get this backup to run.  Any hints would be appreciated.
> 
> Maybe:
> 
> 1) Try:
>      pg_dump -t ofrrds
>     to dump only that table.
> 

It didn't work.  I get the same error.

Also, I'm running Slackware 14.2, and PG 9.5.11

-Andy




Re: pg_dump out of memory

From: Adrian Klaver
Date:
On 07/03/2018 08:28 PM, Andy Colson wrote:
> On 07/03/2018 10:21 PM, Adrian Klaver wrote:
>> On 07/03/2018 07:43 PM, Andy Colson wrote:
>>> Hi All,
>>>
>>> I moved a physical box to a VM, and set its memory to 1Gig.  Everything
>>> runs fine except one backup:
>>>
>>>
>>> /pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>>>
>>> pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>>> pg_dump: Error message from server: ERROR:  out of memory
>>> DETAIL:  Failed on request of size 1073741823.
>>> pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;
>>>
>>>
>>>
>>> I'm not sure how to get this backup to run.  Any hints would be 
>>> appreciated.
>>
>> Maybe:
>>
>> 1) Try:
>>      pg_dump -t ofrrds
>>     to dump only that table.
>>
> 
> It didn't work.  I get the same error.

Well, all I can think of is to give the VM more memory.

> 
> Also, I'm running Slackware 14.2, and PG 9.5.11
> 
> -Andy
> 
> 
> 


-- 
Adrian Klaver
adrian.klaver@aklaver.com


Re: pg_dump out of memory

From: George Neuner
Date:
On Tue, 3 Jul 2018 21:43:38 -0500, Andy Colson <andy@squeakycode.net>
wrote:

>Hi All,
>
>I moved a physical box to a VM, and set its memory to 1Gig.  Everything
>runs fine except one backup:
>
>
>/pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>
>pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>pg_dump: Error message from server: ERROR:  out of memory
>DETAIL:  Failed on request of size 1073741823.
                                    ^^^^^^^^^^

pg_dump is trying to allocate 1GB.  Obviously it can't if 1GB is all
you have.


>pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;
>
>wildfire=# \dt+ ofrrds
>                    List of relations
>  Schema |  Name  | Type  | Owner | Size  | Description
>--------+--------+-------+-------+-------+-------------
>  public | ofrrds | table | andy  | 15 MB |
>
>
>wildfire=# \d ofrrds
>               Table "public.ofrrds"
>    Column    |          Type          | Modifiers
>-------------+------------------------+-----------
>  id          | character varying(100) | not null
>  updateddate | bigint                 | not null
>  bytes       | bytea                  |
>Indexes:
>     "ofrrds_pk" PRIMARY KEY, btree (id)
>

There must be a heck of a lot of data in that bytea column.


>I'm not sure how to get this backup to run.  Any hints would be appreciated.

As Adrian mentioned already, you're going to have to give it more
memory somehow.  Either more RAM or a big swap file.
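
If more RAM isn't convenient, a swap file is quick to add. A rough sketch on Linux (the 2 GB size and the /swapfile path are arbitrary choices, not something from your setup):

    fallocate -l 2G /swapfile     # or: dd if=/dev/zero of=/swapfile bs=1M count=2048
    chmod 600 /swapfile
    mkswap /swapfile
    swapon /swapfile
    # add "/swapfile none swap sw 0 0" to /etc/fstab to keep it across reboots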

George



Re: pg_dump out of memory

From: David Rowley
Date:
On 4 July 2018 at 14:43, Andy Colson <andy@squeakycode.net> wrote:
> I moved a physical box to a VM, and set its memory to 1Gig.  Everything
> runs fine except one backup:
>
>
> /pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>
> pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
> pg_dump: Error message from server: ERROR:  out of memory
> DETAIL:  Failed on request of size 1073741823.
> pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;

There will be less memory pressure on the server if the pg_dump was
performed from another host. When running pg_dump locally the 290MB
bytea value will be allocated in both the backend process pg_dump is
using and pg_dump itself. Running the backup remotely won't require
the latter to be allocated on the server.
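
Something like this, run from any other box that can reach the server (a sketch; the host placeholder, and any pg_hba.conf / listen_addresses changes needed to allow the remote connection, aren't from this thread):

    pg_dump -h <server-address> -Fc -U postgres -f wildfire.backup wildfire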

> I've been reducing my memory settings:
>
> maintenance_work_mem = 80MB
> work_mem = 5MB
> shared_buffers = 200MB

You may also get it to work by reducing shared_buffers further.
work_mem won't have any effect, and neither will maintenance_work_mem.
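
For example, in postgresql.conf (the 64MB figure is just a guess at something small enough to leave headroom for the COPY; shared_buffers needs a server restart to change):

    shared_buffers = 64MB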

Failing that, the suggestions of more RAM and/or swap look good.

-- 
 David Rowley                   http://www.2ndQuadrant.com/
 PostgreSQL Development, 24x7 Support, Training & Services


Re: pg_dump out of memory

From: Andy Colson
Date:
On 07/04/2018 12:31 AM, David Rowley wrote:
> On 4 July 2018 at 14:43, Andy Colson <andy@squeakycode.net> wrote:
>> I moved a physical box to a VM, and set its memory to 1Gig.  Everything
>> runs fine except one backup:
>>
>>
>> /pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>>
>> pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>> pg_dump: Error message from server: ERROR:  out of memory
>> DETAIL:  Failed on request of size 1073741823.
>> pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;
> 
> There will be less memory pressure on the server if the pg_dump was
> performed from another host. When running pg_dump locally the 290MB
> bytea value will be allocated in both the backend process pg_dump is
> using and pg_dump itself. Running the backup remotely won't require
> the latter to be allocated on the server.
> 
>> I've been reducing my memory settings:
>>
>> maintenance_work_mem = 80MB
>> work_mem = 5MB
>> shared_buffers = 200MB
> 
> You may also get it to work by reducing shared_buffers further.
> work_mem won't have any effect, and neither will maintenance_work_mem.
> 
> Failing that, the suggestions of more RAM and/or swap look good.
> 

Adding more RAM to the VM is the simplest option.  It just seems a waste because of one backup.

Thanks all.

-Andy