Mark Dickinson <[EMAIL PROTECTED]> added the comment:

Is it worth keeping generate_tokens as an alias for tokenize, just to avoid gratuitous 2-to-3 breakage? Maybe not---I guess they're different beasts, in that one wants a string-valued iterator and the other wants a bytes-valued iterator.
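(A minimal sketch of the distinction above, using the Python 3 tokenize API: generate_tokens consumes a readline that yields str, while tokenize consumes one that yields bytes.)

```python
import io
import tokenize

source = "x = 1\n"

# generate_tokens wants a readline that yields str...
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(tok.type, tok.string)

# ...while tokenize.tokenize wants a readline that yields bytes
# (it also emits an initial ENCODING token).
for tok in tokenize.tokenize(io.BytesIO(source.encode("utf-8")).readline):
    print(tok.type, tok.string)
```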
So if I understand correctly, the readline argument to tokenize would have to return bytes instances. Would it be worth adding a check for this, to catch possible misuse? You could put the check in detect_encoding, so that it just checks that the first one or two yields from readline have the correct type, and assumes that the rest are okay.

____________________________________
Tracker <[EMAIL PROTECTED]>
<http://bugs.python.org/issue719888>
____________________________________
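(The suggested check could be sketched as a wrapper like the one below. checked_detect_encoding is a hypothetical name, not part of the stdlib; it verifies that the first yield from readline is bytes and then hands off to the real detect_encoding, assuming the rest are okay.)

```python
import io
import tokenize

def checked_detect_encoding(readline):
    """Hypothetical wrapper: check that readline yields bytes before
    delegating to tokenize.detect_encoding."""
    first = readline()
    if not isinstance(first, bytes):
        raise TypeError("readline() should return bytes, not %s"
                        % type(first).__name__)
    saved = iter([first])

    def replay():
        # Replay the consumed first line, then fall back to readline.
        try:
            return next(saved)
        except StopIteration:
            return readline()

    return tokenize.detect_encoding(replay)

encoding, consumed = checked_detect_encoding(
    io.BytesIO(b"# -*- coding: latin-1 -*-\nx = 1\n").readline)
```

Misuse (a str-yielding readline) would then fail immediately with a TypeError instead of producing confusing errors deeper in tokenize.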