On 02.08.2013 22:14, alina andrei wrote:
Hello,

I am running an R job on a Windows 7 machine (4 cores, 16 GB RAM, R 3.0.1), and it takes 1.5 hours to complete.
I am running the same job in R on a Linux environment (Platform: x86_64-redhat-linux-gnu (64-bit)) with far more resources (40 cores, 0.5 TB RAM), and there it takes 3 hours and 15 minutes to complete (no other concurrent jobs). The job uses the glmnet package to perform model selection on a simulated data set with 1 million records and 150 variables.
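For reference, here is a minimal, self-contained sketch of the kind of job described (the exact call was not posted, so cv.glmnet and the simulated-data setup below are assumptions):

## Assumed workload: glmnet model selection on simulated data.
## Scaled down here; set n <- 1e6 to match the reported data size.
library(glmnet)

set.seed(1)
n <- 1e4
p <- 150
x <- matrix(rnorm(n * p), n, p)
y <- as.numeric(x[, 1:5] %*% rnorm(5) + rnorm(n))

fit <- cv.glmnet(x, y)           # cross-validated lasso path
coef(fit, s = "lambda.min")      # coefficients of the selected model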
My questions are:
1. Why doesn't R take advantage of the available RAM?
2. Are there any changes we can apply to the R configuration file to get better performance? My expectation was that the Linux environment would perform much better than the Windows environment.


Probably the job has not been parallelized and uses only one core? In that case a single core on your Linux machine is probably slower than a single core on your Windows machine, or the two machines carry different loads.
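If so, and the model selection runs through cv.glmnet, the cross-validation folds can be spread over several workers with a registered parallel backend. A sketch, assuming cv.glmnet is the entry point (the original call was not posted):

## Sketch: parallelize the CV folds (assumes cv.glmnet is used;
## a plain glmnet() fit stays on one core either way).
library(glmnet)
library(doParallel)

cl <- makeCluster(parallel::detectCores() - 1)  # leave one core free
registerDoParallel(cl)
fit <- cv.glmnet(x, y, parallel = TRUE)
stopCluster(cl)

Note that more RAM by itself will not make a single-threaded fit faster; it only helps once the data or the folds stop fitting in memory.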

Uwe Ligges


Any help in sorting out these issues is much appreciated.

Thank you in advance!
Alina



______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

