The advice given is sensible. For a timing study see

http://rwiki.sciviews.org/doku.php?id=tips:rqcasestudy

We found that for optimization calculations, putting the objective
function calculation or parts thereof in Fortran was helpful. But we
kept those routines pretty small -- less than a page -- and we just
called them to evaluate things, avoiding passing information back
and forth between Fortran and R.
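The pattern above can be sketched in plain R first. Here is a minimal, hypothetical example (the Rosenbrock objective is a stand-in, not code from the study): keep the expensive function small and self-contained, so it can later be replaced by a compiled Fortran routine called via .Fortran() without touching the optimizer call.

```r
# Sketch with a stand-in objective: the small, self-contained hot
# spot below is the kind of routine worth porting to Fortran.
rosenbrock <- function(x) {
  # classic test objective; all the computation lives here
  sum(100 * (x[-1] - x[-length(x)]^2)^2 + (1 - x[-length(x)])^2)
}
fit <- optim(c(-1.2, 1), rosenbrock, method = "BFGS")
fit$par  # should end up near c(1, 1)
```

Because the optimizer only ever calls the objective through one interface, swapping the R body for a compiled routine is a local change.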

JN




On 13-10-22 06:00 AM, r-help-requ...@r-project.org wrote:
> Message: 58
> Date: Tue, 22 Oct 2013 05:47:15 -0400
> From: Jim Holtman <jholt...@gmail.com>
> To: Alexandre Khelifa <akhel...@logitech.com>
> Cc: "r-help@r-project.org" <r-help@r-project.org>
> Subject: Re: [R] R - How to "physically" Increase Speed
> Message-ID: <73d989da-b6b3-421d-838c-903da3435...@gmail.com>
> Content-Type: text/plain;     charset=us-ascii
> 
> I would start by taking a subset of the data (definitely one that would 
> run in less than ten minutes) and use the profiler, Rprof, to see where the 
> time is being spent.  You can use the Task Manager (if on Windows) to see how 
> much memory you are using; it sounds like you did not need the extra memory.
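A minimal sketch of that Rprof workflow (the matrix workload is just a stand-in for your real subset):

```r
Rprof("subset.out")                  # start profiling to a file
for (i in 1:20) {                    # stand-in workload
  m <- crossprod(matrix(runif(1e6), 1000))
}
Rprof(NULL)                          # stop profiling
summaryRprof("subset.out")$by.self   # functions ranked by own time
```

The $by.self table shows which functions consume time directly, which is usually where to focus first.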
> 
> You might see if you can partition your data so you can run multiple 
> instances of R and then merge the results.
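One way to do that partition-and-merge with the base parallel package (iris and colMeans are stand-ins for your data and computation):

```r
library(parallel)                        # ships with base R
cl <- makeCluster(2)                     # two worker R processes
parts <- split(iris[1:4], iris$Species)  # partition the data
res <- parLapply(cl, parts, colMeans)    # process pieces in parallel
stopCluster(cl)
merged <- do.call(rbind, res)            # merge the results
merged
```

Each worker is a separate R process, so this also sidesteps R's single-threaded evaluation for the partitioned step.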
> 
> Anything that takes more than half an hour is, for me, worth looking into 
> to see where the problems are.  For example, data frames are expensive to 
> access, and conversion to matrices is one way to speed things up.  This is 
> where the profiler helps.
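A quick illustration of the data-frame-versus-matrix point (exact timings will vary by machine):

```r
df <- data.frame(a = runif(1e4), b = runif(1e4))
m  <- as.matrix(df)                   # one-time conversion
t_df  <- system.time(for (i in 1:1e4) v <- df[i, 1])["elapsed"]
t_mat <- system.time(for (i in 1:1e4) v <- m[i, 1])["elapsed"]
c(data.frame = t_df, matrix = t_mat)  # matrix indexing is typically far cheaper
```

Element access on a data frame goes through much more general (and slower) dispatch than matrix indexing, which is why tight loops benefit from the conversion.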
>

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.