Hongyi Zhao wrote:
> Hi,
>
> I have some code comes from python 2 like the following:
>
> str('a', encoding='utf-8')
This fails in Python 2, too:

>>> str("a", encoding="utf-8")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: str() takes at most 1 argument (2 given)

...unless you have redefined str, e. g. with

>>> str = unicode
>>> str("a", encoding="utf-8")
u'a'

> But for python 3, this will fail as follows:
>
>>>> str('a', encoding='utf-8')
> Traceback (most recent call last):
>   File "<input>", line 1, in <module>
> TypeError: decoding str is not supported
>
> How to fix it?

Don't try to decode an already decoded string; use it directly:

"a"

--
https://mail.python.org/mailman/listinfo/python-list
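To make the distinction concrete, here is a minimal Python 3 sketch: str() with an encoding argument decodes a bytes-like object, while a value that is already a str needs no conversion at all.

```python
# In Python 3, str() with an encoding only accepts bytes-like input.
decoded = str(b"a", encoding="utf-8")   # decodes bytes -> 'a'

# Equivalent, and usually clearer:
also_decoded = b"a".decode("utf-8")

# An already-decoded str is used as-is; calling str() on it with an
# encoding raises "TypeError: decoding str is not supported".
plain = "a"

assert decoded == also_decoded == plain == "a"
```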