Try

RSiteSearch("biglm")

for some threads that discuss strategies for analyzing big datasets; a sketch of the chunked-fitting idea it points to is at the end of this message.

HTH,

Chuck

On Fri, 26 Sep 2008, zerfetzen wrote:
Hi,

I'm sure that a large fixed-width file, such as 300 million rows and 1,000 columns, is too large for R to handle on a PC, but are there ways to deal with it? For example, is there a way to combine some sampling method with read.fwf so that you can read in a sample of, say, 100,000 records? Something like that may make analysis possible.

Once a model is fit, is there a way to read in only x rows at a time, score each subset separately, and finally append the scored subsets back together? I haven't seen any information on this, if it is possible.

Thank you for reading, and sorry if the information was easily available and I simply didn't find it.
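On the sampling question: read.fwf accepts an already-open connection and an 'n' argument, so successive calls resume where the previous one stopped. That lets you make repeated passes of, say, 100,000 rows and keep a random fraction of each chunk. A minimal, untested sketch follows; the file name, field widths, chunk size, and sampling fraction are all placeholders to replace with your own:

widths <- c(10, 5, 8)      # placeholder field widths -- substitute your own
chunk  <- 100000           # rows per pass
frac   <- 1e5 / 3e8        # aim for ~100,000 of 300 million rows

con    <- file("bigfile.txt", open = "r")   # hypothetical file name
pieces <- list()
repeat {
    ## read.fwf on an open connection picks up where the last call stopped
    x <- tryCatch(read.fwf(con, widths = widths, n = chunk),
                  error = function(e) NULL)  # NULL once input is exhausted
    if (is.null(x) || nrow(x) == 0) break
    keep <- runif(nrow(x)) < frac            # Bernoulli sample of this chunk
    pieces[[length(pieces) + 1]] <- x[keep, , drop = FALSE]
}
close(con)
samp <- do.call(rbind, pieces)   # roughly 100,000 sampled rows

The same loop structure answers the read-x-rows-at-a-time part of your question: anything you can do to one chunk, including scoring it, can go where the sampling step is.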
Charles C. Berry                          (858) 534-2098
Dept of Family/Preventive Medicine        E mailto:[EMAIL PROTECTED]
UC San Diego                              http://famprevmed.ucsd.edu/faculty/cberry/
La Jolla, San Diego 92093-0901
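P.S. For the model-fitting side, here is a minimal sketch of the incremental strategy the biglm package implements: fit on the first chunk, then fold each later chunk in with update(). It is untested, and the file name, widths, column names, and formula are made up for illustration:

library(biglm)

widths <- c(10, 5, 8)                      # placeholder field widths
nms    <- c("y", "x1", "x2")               # hypothetical column names
chunk  <- 100000
con    <- file("bigfile.txt", open = "r")  # hypothetical file name

## fit on the first chunk, then fold the remaining chunks in
x   <- read.fwf(con, widths = widths, n = chunk, col.names = nms)
fit <- biglm(y ~ x1 + x2, data = x)        # made-up model formula
repeat {
    x <- tryCatch(read.fwf(con, widths = widths, n = chunk, col.names = nms),
                  error = function(e) NULL) # NULL once input is exhausted
    if (is.null(x) || nrow(x) == 0) break
    ## update() folds in the new rows; only the small cross-products
    ## matrix stays in memory, never the full data
    fit <- update(fit, x)
}
close(con)
summary(fit)

Scoring can follow the same chunked pattern: call predict() on each chunk and write the results out with write.table(..., append = TRUE), so neither the data nor the predictions ever have to fit in memory at once.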