Łukasz Langa <luk...@langa.pl> added the comment:

### Diff between files

The unified diff between tokenize implementations is here:
https://gist.github.com/ambv/679018041d85dd1a7497e6d89c45fb86

It clocks in at 275 lines, but that is because it includes context. The actual
diff is 175 lines long.
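
As a side note, the split between context lines and actual change lines can be
counted mechanically. The snippet below is only an illustrative sketch, not part
of the issue itself: it uses `difflib.unified_diff` from the standard library,
and the file paths are placeholders for whichever two tokenize implementations
you want to compare.

```python
import difflib

def diff_stats(path_a, path_b, context=3):
    """Return (total diff lines, actual changed lines) for a unified diff."""
    with open(path_a) as f:
        a = f.readlines()
    with open(path_b) as f:
        b = f.readlines()
    diff = list(difflib.unified_diff(a, b, fromfile=path_a, tofile=path_b,
                                     n=context))
    # Lines starting with '+' or '-' are real changes; '+++'/'---' are the
    # file headers, and lines starting with ' ' or '@@' are context/hunk info.
    changed = [line for line in diff
               if line.startswith(("+", "-"))
               and not line.startswith(("+++", "---"))]
    return len(diff), len(changed)

# Placeholder paths -- substitute the two files you are actually comparing.
total, changed = diff_stats("Lib/lib2to3/pgen2/tokenize.py", "Lib/tokenize.py")
print(f"{total} lines with context, {changed} lines of actual change")
```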

To keep the diff that small, I needed to move some insignificant bits around in
Lib/tokenize.py; that is what the other PR on this issue is about.

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue33338>
_______________________________________