On 3 Aug 2013, at 21:03, Jason Dagit wrote:

> Another con of using parsec that I forgot to mention in my previous
> email is that with Parsec you need to be explicit about backtracking
> (use of try). Reasoning about the correct places to put try is not
> always easy and parsec doesn't help you with the task. In my
> experience, this is the main bug that people run into when using
> parsec.

Although the original question did not mention parsec explicitly, I find it 
disappointing that many people immediately think of it as the epitome of 
monadic combinator parsing.  The power of good marketing, eh?  There are so 
many other good parsing libraries out there.  Parsec happened to cure some 
known space leaks in rival libraries around the time of its release (2000 or 
so), but the main reason it is popular is simply that it was distributed 
alongside GHC for a long time.
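
For concreteness, the pitfall Jason describes usually shows up when 
alternatives share a prefix.  A minimal sketch (the grammar here is invented 
purely for illustration):

    import Text.Parsec
    import Text.Parsec.String (Parser)

    -- Both alternatives share the prefix "le".  Without 'try', the first
    -- branch consumes that prefix before failing, and (<|>) never attempts
    -- the second branch, so "lexical" is rejected.
    bad, good :: Parser String
    bad  = string "let" <|> string "lexical"
    good = try (string "let") <|> string "lexical"

    main :: IO ()
    main = do
      print (parse bad  "" "lexical")  -- Left ...: first branch already ate "le"
      print (parse good "" "lexical")  -- Right "lexical"

The annoyance is that whether you need `try` here depends on what the *other* 
alternatives look like, which is exactly what makes its placement hard to 
reason about locally.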

Curiously enough, the complaint you make about parsec is exactly the 
observation that drove the development of my own set of combinators, 
polyparse.  But the concept of commitment to a partial parse (to prevent 
backtracking) was already somewhat present in Röjemo's applicative parsers 
way back in 1994.  He had both `ap` and `apCut` (the name "cut" borrowed from 
Prolog, I suppose).  Space performance and the elimination of space leaks 
were his overriding focus: he mentions being able to recompile the compiler 
itself in just 3Mb of RAM.
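
Roughly, the mechanism is that failure comes in two flavours, and choice only 
recovers from the uncommitted kind.  A toy sketch of the idea (invented names, 
not polyparse's or Röjemo's actual API):

    -- Failure carries a "committed" flag; choice retries only when it is False.
    newtype P a = P { runP :: String -> Either (Bool, String) (a, String) }

    instance Functor P where
      fmap f (P p) = P $ \s -> fmap (\(a, rest) -> (f a, rest)) (p s)

    instance Applicative P where
      pure a = P $ \s -> Right (a, s)
      P pf <*> P pa = P $ \s -> do
        (f, s')  <- pf s
        (a, s'') <- pa s'
        return (f a, s'')

    char :: Char -> P Char
    char c = P $ \s -> case s of
      (x:xs) | x == c -> Right (x, xs)
      _               -> Left (False, "expected " ++ show c)

    -- The "cut": any failure inside p becomes a committed one.
    commit :: P a -> P a
    commit (P p) = P $ \s -> either (\(_, msg) -> Left (True, msg)) Right (p s)

    -- Choice: backtracking is the default; commit switches it off.
    infixr 2 <||>
    (<||>) :: P a -> P a -> P a
    P p <||> P q = P $ \s -> case p s of
      Left (False, _) -> q s    -- uncommitted failure: try the alternative
      r               -> r      -- success or committed failure: stop here

    -- For instance: once the keyword has been recognised, commit to the rest,
    -- so a malformed tail is reported there instead of being backtracked over.
    ifExpr, fallback :: P Char
    ifExpr   = char 'i' *> char 'f' *> commit (char '(')
    fallback = char 'x'
    -- runP (ifExpr <||> fallback) "if!"  ==>  Left (True, "expected '('")
    -- runP (ifExpr <||> fallback) "x"    ==>  Right ('x', "")

So the default is to backtrack, and you say explicitly where you want to 
commit, rather than (as in parsec) the other way around.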

Regards,
    Malcolm