I know that custom types account for a portion of overhead, and I'm
not by any means advocating their removal. I also know that the
efficiency of postgres has improved greatly since the early days, and
I'm wondering what more can be done.
For instance, would it be possible to cache the results of the
input/output functions for the types? i.e. if we've already called
foobar_out for a piece of data, why call it again? We could store the
previous result in a hash table and reuse it.
Note that I know next to nothing about how the query node tree gets
executed (I'm reading up on it now), so this may not be possible or
could even introduce extra overhead.
I'd like to get postgres up to speed. I know it is a great database,
and I tell all my friends this, but there is too much pg bashing
because of the early days. People think mysql rocks because it is so
fast, but in reality, well... it's all IMHO, and it comes down to the
right tool for the right job.
So my real question is: have we hit the limit on optimization and
reduction of overhead, or is there more work to be done? Or should we
concentrate on other aspects such as inheritance issues? I'm not
quite as interested in ANSI compliance.
--brett