Re: proposal: auxiliary functions for record type

From: Pavel Stehule
Subject: Re: proposal: auxiliary functions for record type
Date:
Msg-id: AANLkTi=E-pVhGDpTLxzJU1V4tXjDa47=0UAEotTV3Jyv@mail.gmail.com
In response to: Re: proposal: auxiliary functions for record type  (Florian Pflug <fgp@phlo.org>)
List: pgsql-hackers
2010/12/12 Florian Pflug <fgp@phlo.org>:
> On Dec12, 2010, at 00:19 , Pavel Stehule wrote:
>> I prefer a table-based
>> solution, because then I don't need an extra "unnest", but other preferences
>> are valid too.
> That's fine with me.
>
>> I'm dissatisfied with your design of an explicit target type
>> via an unused value. I think we don't have the infrastructure for it now
>> - from my view it is better to use a common type, which is text for now. It's
>> nothing new - plpgsql uses it too.
> Sorry, I can't follow you here. Where does plpgsql use text as "common" type?

plpgsql uses only IO casts. The target type and the real type are
checked inside the assignment statement, but this check happens late. I
did a patch for early conversion to the target type (in the plan), but
that patch was rejected. So currently there is no information about the
target type available inside the expression - and probably never will
be, for compatibility reasons. For example: when the target variable is
int, but you used a numeric constant, then every assignment does an IO
cast from numeric to int.
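
A minimal sketch of that behavior (this example is mine, not from the
original mail): the right-hand side below is evaluated as numeric with
no knowledge of the target variable's type, and the conversion to int
happens only at assignment time.

  DO $$
  DECLARE
    v int;                    -- target type is int
  BEGIN
    -- the expression is evaluated as numeric; plpgsql converts the
    -- result to the target type only when assigning (the late check
    -- described above)
    v := numeric '42';
    RAISE NOTICE 'v = %', v;  -- prints: v = 42
  END;
  $$;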

>
>> I see one good design of an explicit target type based on polymorphic
>> types that respects PostgreSQL fmgr practice:
>>
>> We have to allow polymorphic functions without polymorphic
>> parameters. These functions should be designed to return the value in
>> the "unknown" type format when the function has no outer information.
> I don't think "unknown" is the right type for that. As far as I know, "unknown" is still a textual type, used to
> have some type to assign to string literals during parsing when no better type can be inferred.
>
>> This information can be passed in the function context. When the function
>> context isn't null, then the function has to read the target type and
>> return the value in the target type. Who can fill the function context? It is
>> a task for the executor. And when a CAST contains just a function call, then we
>> can check whether the function is polymorphic, and if it is, we can set the
>> function context to the target type; then we don't need to call a
>> conversion function, because the polymorphic function must return data in
>> the correct format.
> The main difficulty is that currently types are assigned in a bottom-up fashion as far as I know. To make functions
> with a polymorphic return value, but without polymorphic arguments, work, you need to assign the return type in a
> top-down fashion (it depends on where the value *goes*, not where it *comes from*). That seems like a rather huge change
> and has the potential to complicate quite a few other parts, most notably function lookup/resolution.

I don't think so:
a) the place where we don't know the target type is not limited only to
the first outer cast,
b) I didn't define a polymorphic function without polymorphic
parameters (PFWPP) as completely undescribed - it returns "unknown"
or "text" by default. There is no problem with finding such a function,
and no problem with the later work when it returns "text", because the
first outer cast ensures the transformation to the correct type,
c) when a function is called without an outer cast it still runs - there
is just one more IO cast (see the sketch below).

Some alchemy with the function descriptor is used now too - when
default parameters are used.
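
To make the two call styles concrete, here is a sketch using a
hypothetical accessor record_get(record, text) and a hypothetical table
"goods" - both names are my assumption, not an agreed API. With the
text-returning design the outer cast does the final conversion; with
the dummy-parameter design a typed NULL carries the target type:

  -- text-returning variant: the function returns text and the outer
  -- cast converts it to the wanted type
  SELECT record_get(g, 'unit_price')::numeric FROM goods g;

  -- without an outer cast it still works, with one extra IO cast
  -- (the value stays text)
  SELECT record_get(g, 'unit_price') FROM goods g;

  -- dummy-parameter variant: the typed NULL only carries the target
  -- type, its value is never used
  SELECT record_get(g, 'unit_price', NULL::numeric) FROM goods g;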

>
> Plus, the general case where type information must bubble up more than one level seems pretty much intractable, as
> it'd require a full-blown type inference algorithm like ML or Haskell. Not a place where we want to go, I believe.
>
> The restricted case, on the other hand, brings very little benefit compared to the dummy-parameter approach. Yeah,
> "<polymorphic function>()::type" may look a bit cleaner than "<polymorphic function>(NULL::type)", but that's about it.
> It's only assignments in pl/pgsql which really benefit, since you'd be able to leave out the type completely, writing
> simply "v_value := <polymorphic_function>()". Does that really warrant the effort that'd be involved?
>
>> Without the described functionality we can design a non-polymorphic
>> function that returns the unknown type. When similar functionality
>> is implemented later, this function can be changed to
>> polymorphic, but from the user's perspective there is no change.

> I don't really understand why you resist the idea of a dummy parameter so much. It might not be pretty, but is it bad
> enough to rectify putting in all this work? Plus, the whole record-manipulation stuff isn't going to win a beauty
> contest anytime soon. But it's better than nothing, so as long as it's reasonably efficient I think one can live with a
> few warts on the API.

I wrote it already. In this case you don't need to know a value, you
only work with a type. So using a typed NULL isn't intuitive and it
isn't nice - for me it is too ugly for a general module. I know that
PFWPP functions need a lot of coding without a sure result, and that is
the reason why I didn't use them and why I use the "text" type. And I
have another reason: I expect that it is more likely that people will
iterate over fields of different types, so coercion to one target type
isn't possible in a single path. Using more paths (like you showed in
your code) is a relatively "high" technique - there are only a few
developers who can use and understand it. So my design is oriented more
towards typical programmers who don't know the plpgsql implementation
details. These people can use some strange solution based on dynamic
SQL and be happy that they have working code (see the sketch below).
It's a question which design is more useful - I don't know. But I am
strongly against adding some strange API to pg.
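
As a rough illustration of the "iterate over fields of different types"
case, here is a sketch based on a hypothetical table-returning accessor
record_to_fields(record) that yields (name text, value text) pairs. The
function name and signature are only my assumption about the proposed
table-based design, not an agreed API:

  -- assumed accessor:
  --   record_to_fields(record) RETURNS TABLE (name text, value text)
  CREATE OR REPLACE FUNCTION dump_fields(r record) RETURNS void AS $$
  DECLARE
    f record;
  BEGIN
    -- every field arrives as text, whatever its original type; the
    -- caller casts individual values back only where it needs them
    FOR f IN SELECT name, value FROM record_to_fields(r) LOOP
      RAISE NOTICE '% = %', f.name, f.value;
    END LOOP;
  END;
  $$ LANGUAGE plpgsql;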

Regards

Pavel


>
> best regards,
> Florian Pflug
>
>

