On Wed, 2009-09-02 at 08:57 -0500, Kevin Grittner wrote:
> >   (a) leaving a literal as "unknown" until you've finished
> >       inferring types (current behavior)
> >   (b) casting every unknown to text immediately, and then trying to 
> >       infer the types
>  
> No, that's not it.  I'm wondering why it isn't treated as text. 
> Period.  Full stop.  Nothing to infer.  Anywhere that we have implicit
> casts defined from text to something else could, of course, still
> operate; but it would be text.  No guessing.

If you define very many implicit casts from text, I think you lose the
predictability and safety you're looking for, or else you end up with so
many ambiguity errors that the convenience of implicit casting is lost.
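A minimal sketch of the tradeoff as I see it (toy queries only, nothing
from the original report):

  -- Today the unknown literal takes its type from context:
  SELECT '42' + 1;                  -- '42' is resolved as integer
  SELECT '2009-09-02'::date + 1;    -- explicit cast, unambiguous

  -- If string literals were text with no implicit cast to integer, the
  -- first query would simply fail; if text instead had implicit casts to
  -- integer, date, and so on, overloaded operators and functions could
  -- become ambiguous or resolve in surprising ways.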
 
> It often seems to have the opposite effect.  See the original post.

The original problem has more to do with the fact that interpreting an
unknown value as char silently discards a lot of information. I assume
that's required by the standard, but silently discarding data seems like
a bad idea in any case (which is why we stopped varchar(n) from silently
truncating a while ago).
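For example (a rough illustration; I'm assuming the case in the report
involved the single-byte "char" type, and the table below is only for
demonstration):

  -- Casting to "char" keeps only the first character; the rest is
  -- silently discarded:
  SELECT 'hello'::"char";           -- 'h'

  -- varchar(n) used to truncate in the same silent way on assignment;
  -- now it raises an error instead:
  CREATE TEMP TABLE trunc_demo (v varchar(3));
  INSERT INTO trunc_demo VALUES ('hello');
  -- ERROR:  value too long for type character varying(3)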

> Here I think you have answered my question.  It is seen as a feature,
> since it allows people to avoid the extra keystrokes of coding
> type-specific literal values, and allows them the entertainment of
> seeing how the values get interpreted.  :-)
>  
> > But you can't have both of those desirable behaviors
>  
> Whether they are desirable is the point of disagreement.  At least I
> now understand the reasoning.

They are desirable for a system that infers types from context. I agree
that explicitly declaring the type of every literal is safer; but I
disagree that using implicit casts to make up for the lack of an
"unknown" type would improve matters, either for convenience or for
safety.
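Just to spell out what I mean by declaring the type of a literal, the
usual forms are:

  SELECT DATE '2009-09-02';         -- SQL-standard typed literal
  SELECT CAST('42' AS integer);     -- SQL-standard cast
  SELECT '42'::integer;             -- PostgreSQL cast syntax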

Regards,
        Jeff Davis

