Robert Lehmann wrote:

You don't have to introduce a `next` method to your Lexer class. You could just turn your `tokenize` method into a generator by replacing ``self.result.append`` with `yield`. That gives you the just-in-time part for free, without picking your algorithm apart into tiny unrelated pieces.
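A minimal sketch of the idea, assuming a simple regex-based Lexer (the token pattern and token names here are illustrative, not from the original code):

```python
import re

class Lexer:
    # Illustrative token pattern: integers, identifiers, or any other char.
    token_re = re.compile(r"\s*(?:(\d+)|(\w+)|(\S))")

    def __init__(self, text):
        self.text = text

    def tokenize(self):
        # Instead of self.result.append(token), just yield it:
        # the method becomes a generator, producing tokens lazily.
        for number, name, other in self.token_re.findall(self.text):
            if number:
                yield ("NUMBER", number)
            elif name:
                yield ("NAME", name)
            else:
                yield ("OP", other)

tokens = Lexer("x = 42").tokenize()
print(next(tokens))  # tokens are produced one at a time, on demand
```

No token is computed until the caller asks for it, and the matching loop stays in one piece.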

Python generators recently (2.5) grew a `send` method. You could use `next` for unconditional tokenization and ``mytokenizer.send("expected token")`` whenever you expect a special token.

> See http://www.python.org/dev/peps/pep-0342/ for details.
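For example, the `yield` expression evaluates to whatever the caller passed to `send()` (or `None` after a plain `next()`), so the lexer can react to a hint from the parser. The hint name and the word-splitting tokenizer below are illustrative assumptions, not a real API:

```python
def tokenize(text):
    hint = None
    for word in text.split():
        if hint == "NUMBER":
            # The parser announced it expects a number here,
            # so apply the special-case rule.
            token = ("NUMBER", int(word))
        else:
            token = ("WORD", word)
        # Yield the token; resume with whatever the caller send()s,
        # or None if the caller used plain next().
        hint = yield token

gen = tokenize("width = 80")
print(next(gen))           # ('WORD', 'width') -- unconditional tokenization
print(next(gen))           # ('WORD', '=')
print(gen.send("NUMBER"))  # ('NUMBER', 80) -- a special token was expected
```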

I will try this. Thank you for the suggestion.

Greetings,
Thomas

--
Just because many of them are wrong doesn't make them right!
(Coluche)
--
http://mail.python.org/mailman/listinfo/python-list