Greetings,

I have a class that implements the iterator protocol and tokenises a
string into a series of tokens. Along with each token, it keeps track of
some information such as line number, source file, etc.

for token in Tokeniser():
  do_stuff(token)

What I want is to be able to wrap the tokeniser to add functionality to the
base parser without subclassing, e.g.

for token in processor(Tokeniser()):
  do_stuff(token)

It's a sort of Decorator pattern, so that I can chain further processors, but
I cannot think how to implement it. Any clues for me?
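One way this is commonly done is to write each processor as a generator function that takes any iterable of tokens and yields (possibly transformed) tokens. Because each processor both consumes and produces an iterable, they chain naturally. A minimal sketch, where `Tokeniser` is a toy stand-in and `uppercase`/`exclaim` are hypothetical example processors:

```python
class Tokeniser:
    """Toy tokeniser: yields whitespace-separated tokens from a string."""
    def __init__(self, source="one two three"):
        self.source = source

    def __iter__(self):
        for token in self.source.split():
            yield token

def uppercase(tokens):
    """A processor: wraps any token iterable, transforming each token."""
    for token in tokens:
        yield token.upper()

def exclaim(tokens):
    """Another processor; chains because it takes and returns an iterable."""
    for token in tokens:
        yield token + "!"

# Processors compose like decorators, innermost first:
for token in exclaim(uppercase(Tokeniser())):
    print(token)   # ONE!  TWO!  THREE!
```

A processor can also filter (yield fewer tokens), merge, or split tokens, since it is free to yield any number of items per input token. If the extra bookkeeping (line number, source file) travels with each token, the processor just passes it through or updates it as needed.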

Thanks

TomH