On 9/11/2009 9:42 PM, Steven D'Aprano wrote:
However, I must admit I'm perplexed why the original example is calling
__unicode__() in the first place! Given the line:

raise self.severe('Problems with "%s" directive path:\n%s: %s.'
     % (self.name, error.__class__.__name__, error))

it looks to me like it should be calling error.__str__(), not
error.__unicode__(). Making the suggested edit:

raise self.severe('Problems with "%s" directive path:\n%s: %s.'
     % (self.name, error.__class__.__name__, str(error)))

should have no effect. But it (apparently) does. This brings us back to
Alan's original question:

"MYSTERY: how can "%s"%error be different from "%s"%str(error) in Python
2.6?"


Georg Brandl explained it to me this way:

        It's probably best explained with a bit of code:

        >>> class C(object):
        ...     def __str__(self): return '[str]'
        ...     def __unicode__(self): return '[unicode]'
        ...
        >>> "%s %s" % ('foo', C())
        'foo [str]'
        >>> "%s %s" % (u'foo', C())
        u'foo [unicode]'

        I.e., as soon as a Unicode element is interpolated into a string,
        further interpolations automatically request Unicode via
        __unicode__, if it exists.
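For contrast, here is a minimal sketch of the Python 3 behavior (my
addition, not from the quoted session): since Python 3 has only one
string type, %-interpolation always calls __str__ and __unicode__ is
never consulted, so the mystery cannot arise there.

```python
# Python 3: all strings are Unicode, so "%s" always uses __str__;
# a __unicode__ method is just an ordinary, ignored attribute.
class C:
    def __str__(self):
        return '[str]'

    def __unicode__(self):  # never called by Python 3's %-formatting
        return '[unicode]'

print("%s %s" % ('foo', C()))    # prints: foo [str]
print("%s %s" % (u'foo', C()))   # u'' is plain str in Py3; prints: foo [str]
```

In other words, the Python 2 subtlety comes entirely from the str/unicode
split, where the left-hand operand's type decides which method is asked for.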

Pretty subtle ...

Cheers,
Alan Isaac

--
http://mail.python.org/mailman/listinfo/python-list