Thanks for your comments. However, I did not get your point about table-driven dispatch versus specialized jump statements. If you look at the code that quex creates, you see that the jumps go to fixed labels, and the conditions for those jumps are based on constants. With a table-driven approach, you always pay at least an address computation of the form _table_base + (index * sizeof(element)) per transition, which is generally slower than simple comparisons against constants. Maybe I am mistaken about your argument. Could you elaborate on that issue?
Anyway, is there any type of document that specifies the 'pattern-action' pairs that form the basis of 'c-lex.c'?

Best Regards,
Frank

-------- Original Message --------
Date: 31 May 2007 17:15:03 +0200
From: Andi Kleen <[EMAIL PROTECTED]>
To: "Frank Schaefer" <[EMAIL PROTECTED]>
CC: gcc@gcc.gnu.org
Subject: Re: Very Fast: Directly Coded Lexical Analyzer

> "Frank Schaefer" <[EMAIL PROTECTED]> writes:
> >
> > Is there any interest in using such an engine in the GCC toolset?
>
> Right now gcc doesn't use flex, so it would probably be non-trivial
> to implement support. You would need to rewrite c-lex.c
>
> > It would be an honor for me to provide any adaptations you require.
> > Anyway, quex's syntax is mostly conformant to flex/lex, so there is
> > not much 'getting used to' with this generator. I am also positive
> > that it is very hard to program a hand-written analyzer that is faster,
> > since the engine does not do any house-keeping and profits from Hopcroft
> > optimization and binary search for code generation. These things are
> > hard to do by hand.
>
> Normally switch() uses reasonable algorithms, which gcc's c-lex.c relies on.
> I suspect that for common symbols a pure table lookup vs. calling special
> code would be faster on many CPUs, though (computed jumps are often slow
> because they tend to cause branch mispredicts).
>
> -Andi

--
// Dr.-Ing. Frank-René Schäfer, Bodelschwinghstr. 28
// D-50170 Kerpen
// Tel.: 49+176/22 02 58 59