I can confirm that the bug exists in the `master` branch as well and
is not platform-specific.
Although the bug is easy to fix for this particular case (see the
patch), I'm not sure whether this solution is general enough. E.g., is
there anything that generally prevents pg_mblen() from doing
out-of-bounds reads in cases similar to this one? Should we instead
prevent such an INSERT from happening?
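
To illustrate the hazard in general terms, here is a standalone sketch
(not PostgreSQL's actual code; utf8_mblen() below merely mimics what
pg_utf_mblen() reports): the byte length of a multibyte character is
judged from its first byte alone, so any caller that trusts that length
can read past the end of a buffer holding a truncated sequence, which is
exactly what a 1-byte "char" datum is.

#include <stdio.h>

/* Byte length of a UTF-8 character, judged from its first byte alone.
 * This mimics what pg_utf_mblen() reports; the name is illustrative. */
static int utf8_mblen(const unsigned char *s)
{
    if ((*s & 0x80) == 0x00) return 1;  /* 0xxxxxxx: ASCII           */
    if ((*s & 0xE0) == 0xC0) return 2;  /* 110xxxxx: 2-byte sequence */
    if ((*s & 0xF0) == 0xE0) return 3;  /* 1110xxxx: 3-byte sequence */
    if ((*s & 0xF8) == 0xF0) return 4;  /* 11110xxx: 4-byte sequence */
    return 1;                           /* invalid leading byte      */
}

int main(void)
{
    /* A 1-byte buffer holding only the first byte of U+1F006
     * (0xF0 0x9F 0x80 0x86), which appears to be all that survives a
     * cast of that character to "char". */
    unsigned char truncated[1] = { 0xF0 };

    /* The leading byte promises 4 bytes, but only 1 is actually there;
     * a caller that now copies or decodes 4 bytes reads out of bounds. */
    printf("claimed length: %d, buffer size: 1\n", utf8_mblen(truncated));
    return 0;
}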
Not just INSERTs, I would think: the implicit cast is already invalid, since the "char" type can only hold characters that can be represented in 1 byte. A comparable example in the numeric types might be:
odyssey=> select (2.0 ^ 80)::double precision::integer;
ERROR: integer out of range
By comparison:
odyssey=> select '🀆'::"char";
 char
──────
 
(1 row)
I think this should give an error, perhaps 'ERROR: "char" out of range'.
Incidentally, if I apply ascii() to the result, I sometimes get 0 and sometimes 90112, neither of which should be a possible value of ascii() for a "char" value, and neither of which is 126982, the actual code point of that character.
odyssey=> select ascii ('🀆'::"char");
ascii
───────
90112
(1 row)
odyssey=> select ascii ('🀆'::"char");
ascii
───────
0
(1 row)
odyssey=> select ascii ('🀆');
ascii
────────
126982
(1 row)
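
For what it's worth, here is a possible mechanism behind those stray
ascii() results (a guess based on the symptoms above, not a reading of
the actual code): the "char" datum retains only the first byte of the
character, 0xF0, and converting it back to text trusts pg_mblen()'s
claim of 4 bytes, decoding the lone 0xF0 together with whatever three
bytes happen to follow it in memory. The trailing bytes in the sketch
below are invented to reproduce the observed numbers: 0xF0 0x96 0x80
0x80 decodes to exactly 90112, and 0xF0 followed by zero bytes decodes
to 0.

#include <stdio.h>

/* Decode a 4-byte UTF-8 sequence into a code point, with no validation,
 * like code that already trusts a length of 4 reported by pg_mblen(). */
static unsigned decode4(const unsigned char *s)
{
    return ((unsigned)(s[0] & 0x07) << 18) |
           ((unsigned)(s[1] & 0x3F) << 12) |
           ((unsigned)(s[2] & 0x3F) <<  6) |
            (unsigned)(s[3] & 0x3F);
}

int main(void)
{
    /* U+1F006 as its four valid UTF-8 bytes. */
    unsigned char full[4]   = { 0xF0, 0x9F, 0x80, 0x86 };
    /* The "char" datum keeps only 0xF0; these trailing bytes are
     * invented to reproduce the two observed results. */
    unsigned char stray[4]  = { 0xF0, 0x96, 0x80, 0x80 };
    unsigned char zeroed[4] = { 0xF0, 0x00, 0x00, 0x00 };

    printf("%u\n", decode4(full));   /* 126982 */
    printf("%u\n", decode4(stray));  /* 90112  */
    printf("%u\n", decode4(zeroed)); /* 0      */
    return 0;
}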