Thread: Should cast to CHAR or NUMERIC enforce default length limit?
Now that I've modified the code so that casting to a specific length
actually works --- ie you can do
	x :: char(7)
	CAST (y AS numeric(40,6))
and get the expected results --- I am starting to worry that there
may be unwanted side-effects.  The reason is that the system by default
interprets "char" as "char(1)" and "numeric" as "numeric(30,6)".
So if you just write "x::char" you will now get truncation to one
character, which did not use to happen.  Another distressing example
is
	regression=# select '123456789012345678901234567890.12'::numeric;
	ERROR:  overflow on numeric ABS(value) >= 10^29 for field with precision 30 scale 6
which I think is arguably a violation of the SQL standard --- it says
pretty clearly that the precision and scale of a numeric constant are
whatever is implicit in the number of digits.

I am inclined to think that in the context of a cast, we shouldn't
enforce a coercion to default length, but should only coerce if a length
is explicitly specified.  This would change the behavior of "x::char"
back to what it was.

I think this could be done by having gram.y insert -1 as the default
typmod for a "char" or "numeric" Typename.  The rest of the system
already interprets such a typmod as specifying no particular length
constraint.  Then, to preserve the rule that
	create table foo (bar char);
creates a char(1) field, analyze.c would have to be responsible for
inserting the appropriate default length in place of -1 when processing
a column definition.

Comments?  Better ideas?

			regards, tom lane
> I think this could be done by having gram.y insert -1 as the default
> typmod for a "char" or "numeric" Typename.  The rest of the system
> already interprets such a typmod as specifying no particular length
> constraint.  Then, to preserve the rule that
> 	create table foo (bar char);
> creates a char(1) field, analyze.c would have to be responsible for
> inserting the appropriate default length in place of -1 when processing
> a column definition.

Sounds good. My first inclination was to work this out in gram.y,
which you could do pretty easily for TYPECAST rules, but perhaps it
would be better to isolate *all* of the default size settings to a
separate routine called from the appropriate place. Eventually, this
could even be a tunable parameter settable during the session (e.g.
"SET DEFAULT PRECISION NUMERIC ...").

                      - Thomas

-- 
Thomas Lockhart				lockhart@alumni.caltech.edu
South Pasadena, California
> Now that I've modified the code so that casting to a specific length
> actually works --- ie you can do
> 	x :: char(7)
> 	CAST (y AS numeric(40,6))
> and get the expected results --- I am starting to worry that there
> may be unwanted side-effects.  The reason is that the system by default
> interprets "char" as "char(1)" and "numeric" as "numeric(30,6)".
> So if you just write "x::char" you will now get truncation to one
> character, which did not use to happen.  Another distressing example
> is
> 	regression=# select '123456789012345678901234567890.12'::numeric;
> 	ERROR:  overflow on numeric ABS(value) >= 10^29 for field with precision 30 scale 6
> which I think is arguably a violation of the SQL standard --- it says
> pretty clearly that the precision and scale of a numeric constant are
> whatever is implicit in the number of digits.

Yes, this is distressing.

> I am inclined to think that in the context of a cast, we shouldn't
> enforce a coercion to default length, but should only coerce if a length
> is explicitly specified.  This would change the behavior of "x::char"
> back to what it was.
> 
> I think this could be done by having gram.y insert -1 as the default
> typmod for a "char" or "numeric" Typename.  The rest of the system
> already interprets such a typmod as specifying no particular length
> constraint.  Then, to preserve the rule that
> 	create table foo (bar char);
> creates a char(1) field, analyze.c would have to be responsible for
> inserting the appropriate default length in place of -1 when processing
> a column definition.

Sounds good to me.

-- 
  Bruce Momjian                        |  http://www.op.net/~candle
  pgman@candle.pha.pa.us               |  (610) 853-3000
  +  If your life is a hard drive,     |  830 Blythe Avenue
  +  Christ can be your backup.        |  Drexel Hill, Pennsylvania 19026
Thomas Lockhart <lockhart@alumni.caltech.edu> writes:
>> analyze.c would have to be responsible for
>> inserting the appropriate default length in place of -1 when processing
>> a column definition.

> Sounds good. My first inclination was to work this out in gram.y,
> which you could do pretty easily for TYPECAST rules,

I thought about that, but since the Typename production itself can't
do the right thing (it doesn't know its context), you'd have to patch
up after it in either TYPECAST or column definition rules.  On the
whole, fixing things that the grammar can't easily get right seems
like it belongs in analyze.c.

> Eventually, this could even be a tunable parameter settable during
> the session (e.g. "SET DEFAULT PRECISION NUMERIC ...").

As I'm envisioning it, the only thing the default will be used for is
substituting a default precision into a "CREATE TABLE foo (bar numeric)"
command.  So adjusting the default on-the-fly doesn't seem all that
useful --- it wouldn't affect previously-created tables.

I did speculate about the idea of not enforcing a default precision
at all.  If we allowed the table to be created with atttypmod = -1,
then the effect would be that numeric values would get stored with
whatever precision appeared in the source value or came out of a
calculation.  Such a numeric column would have no specific
precision/scale, so it'd be more like variable-width "text" than
fixed-width "char".  I'm not sure if this is useful, and I am sure
that it wouldn't be very standard...

			regards, tom lane