Hundreds of thousands of records usually fit into memory just fine, so you should not need Hadoop, SAS, or JMP for data of that size.

Hadley
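A quick back-of-the-envelope check (my own sketch, not from the original thread; the column layout below is hypothetical) shows why data of this size is not a problem: a data frame with half a million rows and a few columns occupies only a few megabytes.

```r
# Build a hypothetical data frame with 500,000 rows: an integer id,
# two numeric columns, and a small character factor-like column.
n <- 500000
df <- data.frame(id  = seq_len(n),
                 x   = rnorm(n),
                 y   = rnorm(n),
                 grp = sample(letters, n, replace = TRUE))

# Report the in-memory size; this lands around 14 MB, far below the
# RAM available on any modern machine.
print(object.size(df), units = "MB")
```

Even with many more columns, data of this shape stays well within memory, so base R (or packages like data.table for faster I/O) is entirely adequate.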
On Tue, Sep 16, 2014 at 12:40 PM, Barry King <[email protected]> wrote:
> Is there a way to get around R’s memory-bound limitation by interfacing
> with a Hadoop database, or should I look at products like SAS or JMP to
> work with data that has hundreds of thousands of records? Any help is
> appreciated.
>
> --
> __________________________
> *Barry E. King, Ph.D.*
> Analytics Modeler
> Qualex Consulting Services, Inc.
> [email protected]
> O: (317)940-5464
> M: (317)507-0661
> __________________________

--
http://had.co.nz/

______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

