Re: inserting huge file into bytea cause out of memory

From: Chris Travers
Subject: Re: inserting huge file into bytea cause out of memory
Date:
Msg-id: CAKt_Zfs5kGJ85KMb-iAYwsv2z-OZN-WYikCn_AefUYpU5Rmb-Q@mail.gmail.com
In response to: Re: inserting huge file into bytea cause out of memory  (liuyuanyuan <liuyuanyuan@highgo.com.cn>)
List: pgsql-general


On Wed, Aug 7, 2013 at 6:41 PM, liuyuanyuan <liuyuanyuan@highgo.com.cn> wrote:

     
      Thanks for your last reply!
      I've tested Large Object (the oid type), and it seems to work better with regard to out of memory.
      But for the out-of-memory problem with bytea, is there really no way to solve it? Why is there no way to solve it? Is this a problem with JDBC, or with the type itself?

      Yours,
      Liu Yuanyuan

I think the big difficulty efficiency-wise is the fact that everything is exchanged in a textual representation.  This means you likely have at least two representations in memory on the client and the server, and maybe more depending on the client framework, and the textual representation is around twice as large as the binary one.  Add to this the fact that it must all be handled at once, and you have difficulties that are inherent to the implementation.  In general, I do not recommend bytea for large amounts of binary data for that reason.  If your files are big, use large objects (lobs).
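For reference, a minimal sketch of the large-object route with the PostgreSQL JDBC driver's LargeObjectManager API.  The table images(name text, data oid) and the file path are made up for illustration; the point is that the file is streamed into the large object in chunks, instead of being bound as one huge bytea parameter that has to sit fully in memory on both ends:

import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.postgresql.PGConnection;
import org.postgresql.largeobject.LargeObject;
import org.postgresql.largeobject.LargeObjectManager;

public class LobInsertSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/test", "user", "password")) {
            // Large object operations must run inside a transaction.
            conn.setAutoCommit(false);

            LargeObjectManager lom =
                    conn.unwrap(PGConnection.class).getLargeObjectAPI();

            // Create a new large object and stream the file into it in
            // fixed-size chunks, so the whole file is never in memory at once.
            long oid = lom.createLO(LargeObjectManager.READWRITE);
            LargeObject lob = lom.open(oid, LargeObjectManager.WRITE);
            try (InputStream in = new FileInputStream("/tmp/huge.bin")) {
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = in.read(buf)) > 0) {
                    lob.write(buf, 0, n);
                }
            } finally {
                lob.close();
            }

            // The row only stores the large object's oid, not the data itself.
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO images (name, data) VALUES (?, ?)")) {
                ps.setString(1, "huge.bin");
                ps.setLong(2, oid);
                ps.executeUpdate();
            }

            conn.commit();
        }
    }
}

Reading the data back is symmetric (open the oid in READ mode and stream out), and keep in mind that large objects need explicit cleanup (lo_unlink) when the referencing row goes away.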




--
Best Wishes,
Chris Travers

Efficito:  Hosted Accounting and ERP.  Robust and Flexible.  No vendor lock-in.

In the pgsql-general list, by message date:

Previous
From: Sergey Konoplev
Date:
Message: Re: Self referencing composite datatype
Next
From: Tom Lane
Date:
Message: Re: Adding ip4r to Postgresql core?