Can you give a practical example of this? Maybe by taking use
cases as design roots we can get a better idea of how to implement
this, or how to solve the problem you are facing.
--pancake
Maurício wrote:
(2/2 - I believe these messages didn't go to
the list. Sorry if they actually did.)
(...) If you had such a "shell yacc", how would you like it to be
or behave?
(...) So the important thing is being able to whip something up
quickly; these aren't parser "specs" that are going to be carefully
developed and then used for a very long time.
Sure. I want something that helps with testing and can deal with
complex input, or even input of unknown structure against which
you want to check whether a guessed grammar works, even if only
temporarily. Example: someone gives you some unorganized data and
you just want to transform it into something you can deal with.
A general point: one of the most important things to think
about, particularly with parsers, is what would be most
effective in tracking down the inevitable problems when there's
a bug in the user input and/or mismatched input, especially if
it happens in the middle of a pipe: how are you going to report
which part of the input stream was wrong, given that it may not
exist on its own, in a way that is effective for a human trying
to track down the problem? (...)
The exact answer will probably depend on the chosen grammar type
and parsing algorithm. By allowing specified limits on match size
or depth of analysis we could keep error logs readable.
However, "errors" in these tools should not be errors in a strict
sense. I do want to write a tool that you can use to check grammar
hypotheses against text, and that means that even if you don't get
fatal errors you still want to know how well your grammar did with
some input, and to get a meaningful report on, say, how long a
match had to be to resolve an ambiguity, how deep an analysis had
to go to find a match, which false matches were most common, etc.,
and you want this report to be suitable for automated analysis.
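As a very rough illustration of the kind of report I mean (just a
sketch: the grammar is faked as a list of named regexes with
longest-match disambiguation, the statistic names are invented for
the example, and depth of analysis is not measured at all, since
plain regexes have no notion of it):

#!/usr/bin/env python3
"""Non-fatal, machine-readable report on how a set of grammar
hypotheses behaved on the input."""
import json
import re
import sys
from collections import Counter

# Hypothesised token shapes; they overlap, so longest match decides.
CANDIDATES = [
    ("decimal", re.compile(r"-?\d+\.\d+")),
    ("integer", re.compile(r"-?\d+")),
    ("word",    re.compile(r"[A-Za-z]+")),
]

report = {
    "lines": 0,
    "winners": Counter(),         # which hypothesis won on each line
    "false_matches": Counter(),   # matched a prefix but lost
    "ambiguous_lines": 0,         # more than one hypothesis matched
    "match_length_to_win": [],    # how long the winning match was
    "unmatched_lines": [],        # line numbers nothing matched
}

for lineno, line in enumerate(sys.stdin, start=1):
    line = line.rstrip("\n")
    report["lines"] += 1
    hits = [(name, m.end()) for name, pat in CANDIDATES
            if (m := pat.match(line))]
    if not hits:
        # Not fatal: just remember where the hypotheses failed.
        report["unmatched_lines"].append(lineno)
        continue
    if len(hits) > 1:
        report["ambiguous_lines"] += 1
    winner, length = max(hits, key=lambda h: h[1])  # longest-match rule
    report["winners"][winner] += 1
    report["match_length_to_win"].append(length)
    for name, _ in hits:
        if name != winner:
            report["false_matches"][name] += 1

# Emit the whole report in a form another program can analyse.
json.dump(report, sys.stdout, indent=2)
print()

Nothing there is treated as fatal, and since the summary comes out
as plain JSON you can diff it between two versions of a hypothesis
or feed it to whatever does the automated analysis.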
Best,
Maurício