Re: array size exceeds the maximum allowed (1073741823) when building a json

From: Nicolas Paris
Subject: Re: array size exceeds the maximum allowed (1073741823) when building a json
Date:
Msg-id: CA+ssMOSji1OQg3Hby01T8mUkSMcvCOSJtfFC3jjV36MGpq83pg@mail.gmail.com
In response to: Re: array size exceeds the maximum allowed (1073741823) when building a json  (Josh Berkus <josh@agliodbs.com>)
List: pgsql-performance


2016-06-07 15:03 GMT+02:00 Josh Berkus <josh@agliodbs.com>:
> On 06/07/2016 08:42 AM, Nicolas Paris wrote:
>> You have to do something different.  Using multiple columns and/or
>> multiple rows might be workable.

Getting a single document out of multiple rows coming from PostgreSQL is not that easy... The external tools either treat each PostgreSQL JSON field as a string or have to parse it again. Parsing them would add overhead on the external tool, and I'd say it would then be better to build the entire JSON in the external tool. That leads to not using the PostgreSQL JSON builder at all, and delegating the job to a tool that can deal with documents larger than 1 GB.
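
For illustration, a minimal sketch of the "multiple rows" idea, assuming a hypothetical table big_table whose rows should become the elements of one JSON array. Each row is emitted as its own fragment, so no single field comes near the 1 GB limit, and the external tool only concatenates text instead of re-parsing JSON:

    -- hypothetical table and column names, for illustration only
    SELECT row_to_json(t)::text AS fragment
    FROM big_table AS t
    ORDER BY t.id;

    -- versus aggregating everything server-side into one field, which is
    -- what can blow past the 1 GB limit:
    --   SELECT json_agg(t) FROM big_table AS t;

The client would then write '[', the fragments joined by commas, and ']'; the trade-off is the one described above, namely that the final assembly happens outside PostgreSQL.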

 
>> Certainly. Kind of disappointing, because I won't find any json builder
>> as performant as postgresql.

> That's nice to hear.

>> Is this 1 GB restriction supposed to increase in the near future?

> Not planned, no.  Thing is, that's the limit for a field in general, not
> just JSON; changing it would be a fairly large patch.  It's desirable,
> but AFAIK nobody is working on it.

Compared to MongoDB's 16 MB document limitation, 1 GB is great (http://tech.tulentsev.com/2014/02/limitations-of-mongodb/). But for my use case this is not sufficient.
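
As a rough illustration of the per-field limit discussed above (not taken from this thread, and using made-up sizes): any single value is capped at 1 GB, whereas the same volume of data spread over many rows is accepted.

    -- fails: a single text value cannot exceed 1 GB
    SELECT length(repeat('x', 1200 * 1024 * 1024));

    -- works: ~1.2 GB in total, but no individual value larger than 1 MB
    SELECT sum(length(s.chunk))
    FROM (SELECT repeat('x', 1024 * 1024) AS chunk
          FROM generate_series(1, 1200)) AS s;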



> --
> Josh Berkus
> Red Hat OSAS
> (any opinions are my own)

Browse pgsql-performance by date:

Previous
From: Gerardo Herzig
Date:
Message: Re: Combination of partial and full indexes
Next
From: Антон Бушмелев
Date:
Message: Re: 9.4 -> 9.5 regression with queries through pgbouncer on RHEL 6