Re: Newbie design problem

2007-12-17 Thread Jonathan Gardner
On Dec 14, 8:02 am, [EMAIL PROTECTED] wrote:
>
> Lex is very crude. I've found that it takes about half a day to
> organize your token definitions and another half day to write a
> tokenizer by hand. What's the point of the second half-day's work?
> As someone who has earned a BS in Physics, I hav

Re: Newbie design problem

2007-12-14 Thread Bruno Desthuilliers
[EMAIL PROTECTED] wrote:
> Bruno Desthuilliers wrote:
>
>> Then the first move is to carefully eval existing solutions:
>> http://wiki.python.org/moin/LanguageParsing
>
> Always good advice, Bruno. How did you come by that list address?
> Google or is there something special known to Python experts?

Re: Newbie design problem

2007-12-14 Thread MartinRinehart
Bruno Desthuilliers wrote:
> Then the first move is to carefully eval existing solutions:
> http://wiki.python.org/moin/LanguageParsing

Always good advice, Bruno. How did you come by that list address?
Google or is there something special known to Python experts?

Re: Newbie design problem

2007-12-14 Thread Bruno Desthuilliers
[EMAIL PROTECTED] wrote:
> Most unclear. My apologies.
>
> I'm trying to structure a tokenizer. The stupid concatenations are
> just placeholders for the actual tokenizing work. By rebuilding the
> input they demonstrate that the framework correctly processes all the
> input.
>
> I'm currently

Re: Newbie design problem

2007-12-14 Thread Bruno Desthuilliers
[EMAIL PROTECTED] wrote:
> Jonathan Gardner said:
>
>> Well, if using something like PLY ( http://www.dabeaz.com/ply/ ) is
>> considered more Pythonic than writing your own parser and lexer...
>
> Lex is very crude.

Possibly. Anyway, there are quite a few other parser generators:
http://wiki.python.org/moin/LanguageParsing

Re: Newbie design problem

2007-12-14 Thread MartinRinehart
Jonathan Gardner said:
> Well, if using something like PLY ( http://www.dabeaz.com/ply/ ) is
> considered more Pythonic than writing your own parser and lexer...

Lex is very crude. I've found that it takes about half a day to
organize your token definitions and another half day to write a
tokenizer by hand. What's the point of the second half-day's work?
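A hand-written tokenizer of the kind described above can indeed be sketched in a few dozen lines. The following is a generic illustration of the approach, not MartinRinehart's actual code; the token categories (numbers, names, single-character operators) are assumed for the example:

```python
def tokenize(text):
    """Scan text character by character, returning (kind, lexeme) pairs."""
    tokens = []
    i = 0
    while i < len(text):
        ch = text[i]
        if ch.isspace():
            i += 1                       # skip whitespace
        elif ch.isdigit():
            j = i
            while j < len(text) and text[j].isdigit():
                j += 1
            tokens.append(('NUMBER', text[i:j]))
            i = j
        elif ch.isalpha() or ch == '_':
            j = i
            while j < len(text) and (text[j].isalnum() or text[j] == '_'):
                j += 1
            tokens.append(('NAME', text[i:j]))
            i = j
        else:
            tokens.append(('OP', ch))    # everything else: one-char operator
            i += 1
    return tokens
```

The "second half-day" buys you exactly this kind of explicit control over each character class, at the cost of writing the scanning loops yourself.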

Re: Newbie design problem

2007-12-14 Thread MartinRinehart
Most unclear. My apologies.

I'm trying to structure a tokenizer. The stupid concatenations are
just placeholders for the actual tokenizing work. By rebuilding the
input they demonstrate that the framework correctly processes all
the input.

I'm currently using a C-style design (my own pointers int
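One Pythonic alternative to keeping raw C-style pointers in the caller is to wrap the cursor state in a small object, so no code outside it ever touches an index. This is a minimal sketch of that idea; the class and method names are invented for illustration:

```python
class Scanner(object):
    """Encapsulates the 'pointer into the buffer' so callers never
    manipulate indices directly (illustrative sketch)."""

    def __init__(self, text):
        self.text = text
        self.pos = 0

    def peek(self):
        """Return the current character without consuming it ('' at end)."""
        if self.pos < len(self.text):
            return self.text[self.pos]
        return ''

    def advance(self):
        """Consume and return the current character."""
        ch = self.peek()
        self.pos += 1
        return ch

    def take_while(self, pred):
        """Consume and return the run of characters satisfying pred."""
        start = self.pos
        while self.pos < len(self.text) and pred(self.text[self.pos]):
            self.pos += 1
        return self.text[start:self.pos]
```

A tokenizer built on this calls `take_while(str.isdigit)`, `take_while(str.isalpha)`, etc., and the position bookkeeping stays in one place.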

Re: Newbie design problem

2007-12-13 Thread John Machin
On Dec 14, 6:32 am, [EMAIL PROTECTED] wrote:
> Thanks to a lot of help, I've got the outer framework for my tokenizer
> down to this:
>
> for line_number, line in enumerate(text):
>     output = ''
>     for char_number, char in enumerate(line):
>         output += char
>
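As a general Python note on the quoted framework (Machin's actual reply is cut off in the archive): building a string with `output += char` can recopy the accumulated string on every iteration, so it can degrade to quadratic time; collecting pieces in a list and joining once is the standard linear-time idiom:

```python
def rebuild(line):
    # Concatenation in a loop: each += may copy the whole accumulated string.
    output = ''
    for char in line:
        output += char
    return output

def rebuild_fast(line):
    # Idiomatic version: accumulate in a list, join once at the end.
    pieces = []
    for char in line:
        pieces.append(char)
    return ''.join(pieces)
```

Both return the same result; only the asymptotic cost differs.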

Re: Newbie design problem

2007-12-13 Thread Jonathan Gardner
On Dec 13, 11:32 am, [EMAIL PROTECTED] wrote:
> Is there a pythonic design I'm overlooking?

Well, if using something like PLY ( http://www.dabeaz.com/ply/ ) is
considered more Pythonic than writing your own parser and lexer...
Python doesn't have all of life's answers, unfortunately.
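For context, PLY's lexer is driven by regular-expression token rules (`t_NAME` definitions compiled by `lex.lex()`). The underlying idea can be approximated with nothing but the stdlib `re` module; the token set below is hypothetical and purely illustrative, not PLY's actual API:

```python
import re

# Each pair is (token kind, regex) -- PLY would express these as t_NAME rules.
TOKEN_SPEC = [
    ('NUMBER', r'\d+'),
    ('NAME',   r'[A-Za-z_]\w*'),
    ('OP',     r'[+\-*/=()]'),
    ('SKIP',   r'\s+'),
]
MASTER_RE = re.compile('|'.join('(?P<%s>%s)' % pair for pair in TOKEN_SPEC))

def lex(text):
    """Yield (kind, lexeme) pairs; whitespace is matched but discarded."""
    for match in MASTER_RE.finditer(text):
        if match.lastgroup != 'SKIP':
            yield (match.lastgroup, match.group())
```

The alternation is tried in order, so longer or more specific rules should come first, which is essentially what a lexer generator automates for you.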

Newbie design problem

2007-12-13 Thread MartinRinehart
Thanks to a lot of help, I've got the outer framework for my tokenizer
down to this:

for line_number, line in enumerate(text):
    output = ''
    for char_number, char in enumerate(line):
        output += char
        print 'At ' + str(line_number) + ', ' + str(char_number) + ':
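For reference, here is a runnable sketch of that outer framework, recast to collect (line, column, char) positions instead of printing them, since the original `print` line is truncated in the archive; it also checks the rebuilt line against the input, which is what the concatenation is there to demonstrate:

```python
def walk(text):
    """Visit every character of every line, recording its position.

    `text` is assumed to be a list of lines, e.g. from str.splitlines().
    """
    positions = []
    for line_number, line in enumerate(text):
        output = ''
        for char_number, char in enumerate(line):
            output += char
            positions.append((line_number, char_number, char))
        # Rebuilding the line proves the framework saw every character.
        assert output == line
    return positions
```

In the real tokenizer, the body of the inner loop would do the actual token recognition rather than just accumulating characters.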