Hi Duncan and list,

I hope it is clear from my previous mails, but to be sure: I am talking about R on Windows. Here is what I did. I installed Rtools on my home machine last night, then copied the files over to my work machine today. I set up the PATH environment variable to include the Rtools directory, Perl and MinGW. I did not install any of the additional tools (TeX, the CHM compiler, Inno Setup).

I successfully (but with some warnings) ran R CMD build, R CMD check and R CMD INSTALL on the package "mypkg" as made by the example in the 'package.skeleton' help page. After R CMD INSTALL the package was in R/library/mypkg and could be loaded with library(mypkg). I zipped the directory structure in R/library/mypkg and removed the package with remove.packages("mypkg"). I could then reinstall the package from the "binary" zip file via the GUI menu. Although the CHM files are missing, the library integrated very well with the rest of the help system. I can see that many of the files in the "compiled" directory structure, even the R script files, have been modified with comments added from the help files, and there are MD5 checksums and so on. All very good and useful stuff; I really do appreciate it. Thanks!
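For reference, this is roughly the sequence I used. The dummy function and the zip file name are just placeholders from my setup (the package.skeleton help page actually creates several example objects), so adjust as needed:

    ## in R: create the package skeleton from a placeholder function
    myfun <- function(x) x^2
    package.skeleton(name = "mypkg", list = "myfun")

    ## from the Windows command prompt (Rtools, Perl and MinGW on PATH):
    R CMD build mypkg
    R CMD check mypkg
    R CMD INSTALL mypkg

    ## back in R: remove the installed package, then reinstall it from
    ## the zip I made of R/library/mypkg (or use the Packages menu in the GUI)
    remove.packages("mypkg")
    install.packages("mypkg.zip", repos = NULL)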
Now, may I lay out a couple of suggestions, or rather personal wish-list items, for R.

First, about Rtools. Would it be possible to make R CMD INSTALL etc. read the path where Rtools is installed from an ini file instead of the environment variable? Then the whole of R and Rtools could easily be moved around, regardless of registry settings and environment variables. And this is a GOOD THING: it saves a lot of time on install and reinstall. Maybe it would even be possible to make Rtools an R package like all the other packages on CRAN? That way it would not be included in the R binary, but would be a click away for those who need it. There may be licensing issues with this, I don't know...

Second, about the general approach to managing user functions. Currently we can "source" R code or load "packaged" R code. Sourcing is equivalent to typing the source at the keyboard but takes its input from a file. It is a perfect solution for a couple of functions and a simple step-by-step analysis, but when more than 10-20 user functions are sourced, the workspace gets cluttered. The other option is to "package" the source. That process takes care of error checking and help-system integration, and once the package is built it can be installed and loaded as a library, which integrates with the R help system. I feel there is a gap between the two that could be filled for the benefit of users who write large amounts of custom code that is not intended for distribution as a package. In my case I have a large number of files for massaging data files from an in-house instrument, and they will not be of use to anyone else. The gap could be filled if it were possible to add the (pure R) functions from a given user directory to the search path and have them executed whenever they are called. Sourcing a whole directory could also be an option, but it creates two problems: clutter in the workspace and memory used by functions that may never actually be called. (A rough sketch of what I mean is in the postscript below.)

Well, I am not a developer and probably I don't know what I am talking about. This is just some feedback in case it is useful to the wonderful people who make R. For the time being I will keep one machine set up for R development (or, more appropriately, "R packaging") while the others will be run-only.

Thanks for helping,
TL
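P.S. To make the second suggestion more concrete, here is a rough sketch of the kind of thing I mean, using only what is already in R. The directory name and the search-path name are made up; the idea is to source each file into its own environment and attach that environment, so the functions can be called but do not sit in the global workspace:

    ## a made-up directory holding my personal .R files
    funDir <- "C:/myRcode"

    ## source every .R file into one environment, then attach it
    funEnv <- new.env()
    for (f in list.files(funDir, pattern = "\\.[Rr]$", full.names = TRUE))
        sys.source(f, envir = funEnv)
    attach(funEnv, name = "myFunctions")

    ## later, to take the functions off the search path again:
    ## detach("myFunctions")

This still reads everything into memory up front, so it does not solve the "functions that may never be called" part, but it does keep the global workspace clean.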