Re: JSON for PG 9.2

From: Andrew Dunstan
Subject: Re: JSON for PG 9.2
Date:
Msg-id: 4F11E043.1040107@dunslane.net
In response to: Re: JSON for PG 9.2  (Andrew Dunstan <andrew@dunslane.net>)
Responses: Re: JSON for PG 9.2  (Pavel Stehule <pavel.stehule@gmail.com>)
Re: JSON for PG 9.2  (Joey Adams <joeyadams3.14159@gmail.com>)
Re: JSON for PG 9.2  (Andrew Dunstan <andrew@dunslane.net>)
Re: JSON for PG 9.2  (Robert Haas <robertmhaas@gmail.com>)
List: pgsql-hackers

On 01/12/2012 10:51 AM, Andrew Dunstan wrote:
>
>
> On 01/12/2012 10:44 AM, Pavel Stehule wrote:
>> 2012/1/12 Andrew Dunstan<andrew@dunslane.net>:
>>>
>>> On 01/12/2012 09:00 AM, Joey Adams wrote:
>>>> I wrote an array_to_json function during GSoC 2010:
>>>>
>>>>
>>>> http://git.postgresql.org/gitweb/?p=json-datatype.git;a=blob;f=json_io.c#l289
>>>>
>>>>
>>>> It's not exposed as a procedure called array_to_json: it's part of the
>>>> to_json function, which decides what to do based on the argument type.
>>>>
>>>
>>> Excellent, this is just the point at which I stopped work last
>>> night, so
>>> with your permission I'll steal this and it will save me a good
>>> chunk of
>>> time.
>>>
>> this should be little bit more enhanced to support a row arrays - it
>> can be merged with some routines from pst tool
>> http://okbob.blogspot.com/2010/11/new-version-of-pst-collection-is.html
>>
>>
>
> I will be covering composites.
>
>

OK, here's a patch that does both query_to_json and array_to_json, along
with docs and regression tests. It includes Robert's original patch,
although I can produce a differential patch if required. It can also be
pulled from <https://bitbucket.org/adunstan/pgdevel>
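
For query_to_json, a trivial call looks roughly like this (sketched from
memory, so treat it as illustrative only - see the docs in the patch for the
exact signature; the second argument here is assumed to be the pretty-print
flag, and whitespace in the output may differ):

    andrew=# select query_to_json('select 1 as x, $$foo$$ as y', false);
        query_to_json
    ---------------------
     [{"x":1,"y":"foo"}]
    (1 row)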

A couple of things to note. First, this exercises the problem of losing
column names that I noted a couple of months ago and that Tom did a bit of
work on. We really need to fix it. Example:

    andrew=#  select array_to_json(array_agg(row(z.*)))
        from (select $$a$$ || x as b,
                 y as c,
                 array[row(x.*,array[1,2,3]),
                       row(y.*,array[4,5,6])] as z
              from generate_series(1,1) x,
                   generate_series(4,4) y) z;
                                   array_to_json
    -------------------------------------------------------------------------
      [{"f1":"a1","f2":4,"f3":[{"f1":1,"f2":[1,2,3]},{"f1":4,"f2":[4,5,6]}]}]
    (1 row)


Here we've lost b, c and z as column names.
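
For comparison, what I'd want to see at the top level is something like this
(hand-edited, not actual output from the patch; the inner row() constructs
suffer from the same problem):

     [{"b":"a1","c":4,"z":[{"f1":1,"f2":[1,2,3]},{"f1":4,"f2":[4,5,6]}]}]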

Second, what should we do when the database encoding isn't UTF8? I'm
inclined to emit a \unnnn escape for any non-ASCII character (assuming
it has a unicode code point - are there any code points in the
non-unicode encodings that don't have unicode equivalents?). The
alternative would be to fail on non-ASCII characters, which might be
ugly. Of course, anyone wanting to deal with JSON should be using UTF8
anyway, but we still have to deal with these things. What about
SQL_ASCII? If there's a non-ASCII sequence there we really have no way
of telling what it should be. There at least I think we should probably
error out.
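
To illustrate the escaping I have in mind, in (say) a LATIN1 database I'd
expect something along these lines (hand-written, showing what the proposed
rule would produce, not output from the patch as it stands):

    -- 'é' is LATIN1 0xE9, i.e. U+00E9, so it would be emitted escaped
    select array_to_json(array['é'::text]);
      array_to_json
    ----------------
     ["\u00e9"]
    (1 row)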

cheers

andrew




