BUG #13446: pg_dump fails with large tuples
| From | cpt@novozymes.com |
|---|---|
| Subject | BUG #13446: pg_dump fails with large tuples |
| Date | |
| Msg-id | 20150616112041.2735.95092@wrigleys.postgresql.org |
| Replies | Re: BUG #13446: pg_dump fails with large tuples |
| List | pgsql-bugs |
The following bug has been logged on the website:
Bug reference: 13446
Logged by: CPT
Email address: cpt@novozymes.com
PostgreSQL version: 9.3.5
Operating system: Linux, Ubuntu 12, 64-bit
Description:
It looks to me like pg_dump is limited to 1GB of textual representation per row.
# create table stringtest (test text);
CREATE TABLE
# insert into stringtest select repeat('A', (1024*2014*510));
INSERT 0 1
# alter table stringtest add test2 text;
ALTER TABLE
# update stringtest set test2 = test;
UPDATE 1
# \q
$
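For reference, a quick check of how big each column actually is (a minimal sketch against the table created above; octet_length() gives the text size COPY has to serialize, pg_column_size() the compressed on-disk size):
# select octet_length(test), octet_length(test2),
         pg_column_size(test), pg_column_size(test2)
  from stringtest;
Each value is 1024*2014*510 = 1,051,791,360 bytes of text, so a single COPY line containing both columns needs over 2 GB.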
So far, so good. Now let's try to back this up using pg_dump:
$ pg_dump ... -t stringtest
...
pg_dump: Dumping the contents of table "stringtest" failed: PQgetResult() failed.
pg_dump: Error message from server: ERROR: out of memory
DETAIL: Cannot enlarge string buffer containing 1051791361 bytes by 1051791360 more bytes.
pg_dump: The command was: COPY public.stringtest (test, test2) TO stdout;
The same error then shows up in the server logs. It looks like maybe pg_dump is limited to exactly 1GB of textual representation per row?
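That matches the sizes above: the server's string buffer evidently cannot grow past roughly 1 GB, one column plus its delimiter already occupies 1,051,791,361 bytes, and appending the second column's 1,051,791,360 bytes pushes it over. As a possible workaround sketch (assuming, as here, that each individual value stays under that limit; "mydb" stands in for the actual database name), the columns can be copied out one at a time instead of letting pg_dump COPY the whole row:
$ psql -d mydb -c "COPY public.stringtest (test) TO STDOUT" > stringtest_test.txt
$ psql -d mydb -c "COPY public.stringtest (test2) TO STDOUT" > stringtest_test2.txt
Restoring is then manual as well: COPY ... FROM for the first column and an UPDATE for the second, since pg_dump itself offers no per-column option.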