bytea columns and large values

From: David North
Subject: bytea columns and large values
Date:
Msg-id: 4E82014C.6070501@corefiling.co.uk
Replies: Re: bytea columns and large values
         Re: bytea columns and large values
List: pgsql-general
My application uses a bytea column to store some fairly large binary
values (hundreds of megabytes).

Recently I've run into a problem as my values start to approach the 1GB
limit on field size:

When I write a 955MB byte array into my table from Java via JDBC, the
write succeeds and the numbers look about right:

testdb=# select count(*) from problem_table;
  count
-------
      1
(1 row)

testdb=# select pg_size_pretty(pg_total_relation_size('problem_table'));
  pg_size_pretty
----------------
  991 MB
(1 row)

However, any attempt to read this row back fails:

testdb=# select * from problem_table;
ERROR:  invalid memory alloc request size 2003676411

The same error occurs when reading from JDBC (even using getBinaryStream).

Is there some reason why my data can be stored in <1GB but triggers the
allocation of 2GB of memory when I try to read it back? Is there any
setting I can change or any alternate method of reading I can use to get
around this?
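
One workaround I'm considering (a sketch only, not yet tested at this
scale; it assumes the bytea column is called "data", which is a guess at
the schema) is to read the value back in chunks with substring(), so the
server never has to build the whole escaped representation in one
allocation:

testdb=# select substring(data from 1 for 104857600) from problem_table;
testdb=# select substring(data from 104857601 for 104857600) from problem_table;

Repeating this with increasing offsets (here 100 MB per chunk) and
concatenating the pieces on the client side should reconstruct the full
value, at the cost of one query per chunk.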

Thanks,

--
David North, Software Developer, CoreFiling Limited
http://www.corefiling.com
Phone: +44-1865-203192

