Hadley Wickham wrote:
At the moment I am concentrating my efforts deep down in the parser code, but
there are other challenges:
- once the expressions are parsed, we will need something that inspects
function calls to work out where each function is defined (by the user, in a
package, ...). This is tricky, and unless you actually evaluate the code,
some errors will be made.

Are you aware of Luke Tierney's codetools package?  That would seem to
be the place to start.
Yep. The plan is to combine the more verbose information from the modified
parser with the same guessing machinery that checkUsage uses. Another side
effect is that we could imagine linking the error patterns identified by
checkUsage (no visible binding for global variable "y", ...) to actual
locations in the file, for example the place where the variable y is used.
At the moment that is not possible, because the parser only locates entire
expressions (semantic groupings), not individual tokens.

> f <- function( x = 2) {
+ y + 2
+ }
> checkUsage( f )
<anonymous>: no visible binding for global variable ‘y’
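(codetools also exposes findGlobals(), a lower-level view of the same kind of
analysis, which already separates unbound variables from unbound functions;
a sketch on the same example:)

```r
library(codetools)

f <- function(x = 2) {
  y + 2
}

# findGlobals() walks the function body and collects names with no
# local binding, split into functions and variables when merge = FALSE.
findGlobals(f, merge = FALSE)
# $variables is "y" -- the same "y" that token-level source locations
# could pin to an exact position in the file.
```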

Hadley



--
Romain Francois
Independent R Consultant
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
