My previous post was most unclear. My apologies.

I'm trying to structure a tokenizer. The stupid concatenations are
just placeholders for the actual tokenizing work. By rebuilding the
input, they demonstrate that the framework correctly consumes all of
the input.
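
For illustration only (invented names, not my real code), the
placeholder version looks roughly like this:

    def tokenize(text):
        # Stand-in for the real tokenizing work: for now each
        # "token" is just the next character.
        for ch in text:
            yield ch

    source = "print('hello')"
    rebuilt = ""
    for tok in tokenize(source):
        rebuilt += tok            # the "stupid concatenation"
    assert rebuilt == source      # proof the framework saw all the input

Once the real tokenizing logic replaces the placeholder, the rebuilt
string will of course differ from the input, but for now the round
trip is my sanity check.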

I'm currently using a C-style design (my own pointers into an array of
strings), but I suspect I've ended up with the solution an old C hand
would write. As a Python newbie I don't yet think in Python.
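
To make that concrete, here is a rough sketch of the kind of design I
mean (again invented names, not my actual code); the "pointers" are
just indices that I advance by hand through a list of strings:

    class Tokenizer:
        def __init__(self, lines):
            self.lines = lines   # the array of strings
            self.row = 0         # which string we're currently in
            self.col = 0         # position within that string

        def next_char(self):
            # Skip past any exhausted (or empty) strings.
            while (self.row < len(self.lines)
                   and self.col >= len(self.lines[self.row])):
                self.row += 1
                self.col = 0
            if self.row >= len(self.lines):
                return None      # ran off the end of the input
            ch = self.lines[self.row][self.col]
            self.col += 1
            return ch

I gather the more Pythonic habit is to let a generator or the
iterator protocol do that bookkeeping for me, which is the sort of
advice I'm after.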
