Gabriel Genellina <[EMAIL PROTECTED]> added the comment:

Are numbers so special that they get to break the rules? Why stop here? What about other types that might want to accept ASCII bytes instead of characters? Isn't this like going back to the 2.x world?
The protocol with embedded ASCII numbers isn't a very convincing case for me. One can read a binary integer in C with a single function call. In Python 2.x this can't be done in a single call; one has to use struct.unpack to decode the bytes read, and there were no complaints about that as far as I know. In 3.0 the same happens for ASCII numbers too: one has to decode them first. The conversion may look like a stupid step, but it's no more stupid than having to use struct.unpack to convert some bits into the *same* bits inside the integer object. Writing int(str(value, 'ascii')) doesn't look so terrible.

And one may argue that int(b'1234') should return 0x34333231 instead of 1234: b'1234' is the binary representation of 0x34333231 in little-endian format.

----------
nosy: +gagenellina

__________________________________
Tracker <[EMAIL PROTECTED]>
<http://bugs.python.org/issue2483>
__________________________________
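[Editor's note: a minimal Python 3 sketch of the two readings of b'1234' discussed in the message above; the variable name "raw" is illustrative only, while struct and int are standard library.]

    import struct

    raw = b'1234'

    # The decode step the message defends: bytes must become str
    # before int() will accept them.
    assert int(str(raw, 'ascii')) == 1234

    # The alternative reading suggested at the end of the message:
    # the same four bytes (0x31 0x32 0x33 0x34) interpreted as a
    # little-endian unsigned binary integer.
    assert struct.unpack('<I', raw)[0] == 0x34333231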