New submission from Windson Yang <wiwind...@outlook.com>:

> The tokenize() generator requires one argument, readline, which must be a 
> callable object which provides the same interface as the io.IOBase.readline() 
> method of file objects. Each call to the function should return one line of 
> input as bytes.

Adding an example like this should make it easier to understand:

# example.py
class Foo:
    pass

# tokenize_example.py
import tokenize

with open('example.py', 'rb') as f:
    token_gen = tokenize.tokenize(f.readline)
    for token in token_gen:
        # Each token looks something like this:
        # TokenInfo(type=1 (NAME), string='class', start=(1, 0), end=(1, 5), line='class Foo:\n')
        # TokenInfo(type=1 (NAME), string='Foo', start=(1, 6), end=(1, 9), line='class Foo:\n')
        # TokenInfo(type=53 (OP), string=':', start=(1, 9), end=(1, 10), line='class Foo:\n')
        print(token)
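
The docs could perhaps also show the str-based variant for sources that are already in memory: tokenize.generate_tokens() accepts a readline callable that returns str instead of bytes. A rough sketch of what such a companion example might look like (the io.StringIO source here is just illustrative):

import io
import tokenize

source = "class Foo:\n    pass\n"
# generate_tokens() takes a readline callable returning str,
# so io.StringIO provides one without touching the filesystem.
for token in tokenize.generate_tokens(io.StringIO(source).readline):
    print(token)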

----------
assignee: docs@python
components: Documentation
messages: 340467
nosy: Windson Yang, docs@python
priority: normal
severity: normal
status: open
title: Add example to tokenize.tokenize
type: enhancement
versions: Python 3.5, Python 3.6, Python 3.7, Python 3.8

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue36654>
_______________________________________