Serhiy Storchaka <storchaka+cpyt...@gmail.com> added the comment:

I can't reproduce.

>>> import tokenize
>>> list(tokenize.generate_tokens(iter(['(\n', r'"\(")']).__next__))
[TokenInfo(type=53 (OP), string='(', start=(1, 0), end=(1, 1), line='(\n'),
 TokenInfo(type=56 (NL), string='\n', start=(1, 1), end=(1, 2), line='(\n'),
 TokenInfo(type=3 (STRING), string='"\\("', start=(2, 0), end=(2, 4), line='"\\(")'),
 TokenInfo(type=53 (OP), string=')', start=(2, 4), end=(2, 5), line='"\\(")'),
 TokenInfo(type=4 (NEWLINE), string='', start=(2, 5), end=(2, 6), line=''),
 TokenInfo(type=0 (ENDMARKER), string='', start=(3, 0), end=(3, 0), line='')]
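
The same check as a standalone script, for convenience (a minimal sketch; the input lines mirror the session above, so adjust them to match your failing case):

import tokenize

# Feed the two source lines to the tokenizer through a readline
# callable; iter(lines).__next__ raises StopIteration at the end,
# which tokenize treats as end of input.
lines = ['(\n', r'"\(")']
for tok in tokenize.generate_tokens(iter(lines).__next__):
    print(tok)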

Could you please provide a minimal script that reproduces your issue?

----------
nosy: +serhiy.storchaka

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue34428>
_______________________________________