Re: JSON in 9.2 - Could we have just one to_json() function instead of two separate versions ?

From: Hannu Krosing
Subject: Re: JSON in 9.2 - Could we have just one to_json() function instead of two separate versions ?
Msg-id: 1335888731.3106.137.camel@hvost
In reply to: Re: JSON in 9.2 - Could we have just one to_json() function instead of two separate versions ?  (Joey Adams <joeyadams3.14159@gmail.com>)
List: pgsql-hackers
On Tue, 2012-05-01 at 11:49 -0400, Joey Adams wrote:
> On Tue, May 1, 2012 at 8:02 AM, Hannu Krosing <hannu@2ndquadrant.com> wrote:
> > Hi hackers
> >
> > After playing around with array_to_json() and row_to_json() functions a
> > bit I have a question - why do we even have 2 variants *_to_json()
> 
> Here's the discussion where that decision was made:
> 
> http://archives.postgresql.org/pgsql-hackers/2012-01/msg01339.php

Thanks, will read it!

> To quote:
> 
> >>> why not call all these functions 'to_json' and overload them?
> >>
> >> I don't honestly feel that advances clarity much. And we might want
> >> to overload each at some stage with options that are specific to the
> >> datum type. We have various foo_to_xml() functions now.
> >
> > -1
> >
> > older proposal is more consistent with xml functions
> 
> The most compelling argument I see here is the one about options
> specific to the datum type.
> 
> Two other reasons I can think of:
> 
>  * If someone tries to google for how to convert an array to JSON,
> having a function named 'array_to_json' will make that easier.

Well, if you want to know how to convert an integer to a string, you don't
use an integer_to_text() function; you just use a working cast.
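
For comparison, a quick psql sketch (this is 9.2 here; the exact output
formatting of course depends on psql settings):

hannu=# select 1::text;
 text
------
 1
(1 row)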

And here the error message is an outright lie:

hannu=# select 1::json;
ERROR:  cannot cast type integer to json
LINE 1: select 1::json;

The error should really say "won't cast type integer to json" :)

It very well _can_ convert it, as it does so without a problem when such
an integer is inside an array or a record type.
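
For example, both of these already work just fine:

hannu=# select array_to_json(array[1]);
 array_to_json
---------------
 [1]
(1 row)

hannu=# select row_to_json(row(1));
 row_to_json
-------------
 {"f1":1}
(1 row)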

>  * If the JSON type does not yet support, say, converting from a
> number, it will be apparent from the names and types of the functions,
> rather than being a hidden surprise.  On the other hand, array_to_json
> and composite_to_json already convert ANY values to JSON, so this
> doesn't matter, anyway.

By this logic, all non-working casts are "hidden surprises".

> On Tue, May 1, 2012 at 11:02 AM, Hannu Krosing <hannu@2ndquadrant.com> wrote:
> > What we currently lack is direct conversion for simple types, though
> > they are easily achieved by converting to a single-element array and
> > then stripping outer [] from the result
> 
> I agree that this function ought to be exposed.  Note that such a
> function (perhaps called datum_to_json) is indeed the same as the
> proposed to_json function, which tries to convert a value of any type
> to JSON.
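
For the record, the bracket-stripping workaround I had in mind is roughly
this crude sketch (it obviously misbehaves if the value's own JSON text
starts or ends with a bracket):

hannu=# select trim(both '[]' from array_to_json(array[1])::text);
 btrim
-------
 1
(1 row)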

Hmm, I have just found the answer to my question of how to add to_json()
capability to extension types.

If we had to_json as a cast, it would probably be straightforward for
extensions like hstore to provide their own to_json casts - especially
now that the json type is in core - and we could get a working hstore
--> json conversion just by running the

CREATE EXTENSION hstore;

command.
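
Roughly something like this sketch is what I have in mind (hstore_to_json()
here is hypothetical - the extension would have to ship such a function):

CREATE CAST (hstore AS json)
    WITH FUNCTION hstore_to_json(hstore);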

> > It would be really nice to also have the casts from json to any type,
> > including records though.
> 
> What the casts currently do (primarily) is convert between the TEXT
> and JSON types.  So if you have JSON-encoded TEXT, use a cast to
> convert it to the JSON type (this will perform validation, ensuring
> that no invalid JSON gets in).  Any escape/unescape operations need to
> be explicit.
> 
> -Joey
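
For reference, the existing text -> json cast on 9.2 behaves roughly like
this (error wording from memory):

hannu=# select '{"a":1}'::json;
  json
---------
 {"a":1}
(1 row)

hannu=# select '{"a":}'::json;
ERROR:  invalid input syntax for type json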

-- 
-------
Hannu Krosing
PostgreSQL Unlimited Scalability and Performance Consultant
2ndQuadrant Nordic
PG Admin Book: http://www.2ndQuadrant.com/books/


