Hi,
On Aug 4, 2009, at 11:20 AM, Hyo Karen Lee wrote:
> Hi,
> I have one critical question about using R.
> I am currently working on some research that involves a huge amount
> of data (about 15 GB).
> I am trying to use R for this research rather than SAS or Stata.
> (The company where I am working right now is trying to switch from
> SAS/Stata to R.)
> As far as I know, the memory limit in R is 4 GB;
While that might be true on Windows, I'm quite sure it's not on 64-bit
Linux/OS X, where R can use as much memory as the operating system will
give it.
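For example (a quick sketch; memory.limit() exists on Windows only, and
'x' stands in for whatever object you have loaded):

## Windows only: query the current cap (in MB) and ask for more
memory.limit()
memory.limit(size = 8000)   # request ~8 GB if the OS allows it

## Any platform: check how much memory a given object actually uses
print(object.size(x), units = "Mb")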
> However, I believe that there are ways to handle large datasets.
> Most of my work in R would be something like cleaning the data or
> running a simple regression (OLS/logit), though.
One place to look would be the bigmemory package:
http://cran.r-project.org/web/packages/bigmemory/
As well as the other packages listed in the High Performance Computing
task view on CRAN:
http://cran.r-project.org/web/views/HighPerformanceComputing.html
Specifically, the "Large memory and out-of-memory data" section.
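Just to sketch what the file-backed route might look like (note this is
untested, the file and column names are made up, and I'm also assuming
the companion biganalytics package for the regression step):

library(bigmemory)
library(biganalytics)   # provides biglm.big.matrix()

## Read the big CSV once into a file-backed big.matrix; later sessions
## can re-attach it from the descriptor file without re-reading the CSV.
x <- read.big.matrix("mydata.csv", header = TRUE, type = "double",
                     backingfile = "mydata.bin",
                     descriptorfile = "mydata.desc")

## OLS without pulling the whole matrix into RAM
fit <- biglm.big.matrix(y ~ x1 + x2, data = x)
summary(fit)

I believe the same package also offers bigglm.big.matrix() for the logit
case (family = binomial()).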
-steve
--
Steve Lianoglou
Graduate Student: Computational Systems Biology
| Memorial Sloan-Kettering Cancer Center
| Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact