Hi,

On Thu, Mar 8, 2012 at 1:19 PM, RHelpPlease <rrum...@trghcsolutions.com> wrote:
> Hi there,
> I wish to read a 9.6GB .DAT file into R (64-bit R on a 64-bit Windows
> machine), then delete a substantial number of rows and convert the result
> to a .csv file. On the first attempt the computer crashed (at some point
> last night).
>
> I'm rerunning this now and am closely monitoring processor, CPU and
> memory use.
>
> Apart from this crash possibly being a computer issue alone, is R equipped
> to handle this much data? I read on the FAQ page that 64-bit R can handle
> larger data sets than 32-bit.
>
> I'm using the read.fwf function to read in the data. I don't have access
> to a database program (SQL, for instance).
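One rough, untested sketch of how that could be done without ever holding
all 9.6 GB in memory: read the fixed-width file in chunks from an open
connection, drop the unwanted rows from each chunk, and append what is left
to the .csv as you go. The widths, column names and the year filter below
are placeholders; substitute the real layout of your .DAT file and your
real deletion criterion.

## Placeholder layout -- replace with the real widths/names of your file
widths    <- c(8, 12, 4, 20)
col_names <- c("id", "value", "year", "label")
infile    <- "bigfile.DAT"
outfile   <- "bigfile_filtered.csv"
chunksize <- 100000                 # rows per chunk; tune to your RAM

con   <- file(infile, open = "rt")
first <- TRUE
repeat {
  ## read.fwf() errors once the connection is exhausted, so trap that
  dat <- tryCatch(
    read.fwf(con, widths = widths, n = chunksize,
             col.names = col_names, stringsAsFactors = FALSE),
    error = function(e) NULL)
  if (is.null(dat) || nrow(dat) == 0) break

  ## Placeholder filter -- replace with your real row-deletion rule
  keep <- dat[dat$year >= 2000, ]

  write.table(keep, outfile, sep = ",", row.names = FALSE,
              col.names = first, append = !first)
  first <- FALSE

  if (nrow(dat) < chunksize) break  # last (partial) chunk handled
}
close(con)

read.fwf is not fast, so expect this to take a while, but memory use stays
roughly bounded by the chunk size.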
Keep in mind that sqlite3 is just an `install.packages('RSQLite')` away ...
and this SO thread might be useful w.r.t. sqlite performance and big db
files:

http://stackoverflow.com/questions/784173

HTH,
-steve

--
Steve Lianoglou
Graduate Student: Computational Systems Biology
 | Memorial Sloan-Kettering Cancer Center
 | Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact
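P.S. A minimal, untested sketch of what the RSQLite route could look like
(the database file, table and column names are just placeholders, and in
practice you would append each chunk read from the .DAT file rather than
the toy data frame used here):

library(RSQLite)                    # install.packages('RSQLite')

db <- dbConnect(SQLite(), dbname = "bigdata.sqlite")  # on disk, not in RAM

## Toy stand-in for one chunk; feed it the chunks from read.fwf instead,
## with append = TRUE after the first one
chunk <- data.frame(id = 1:3, year = c(1999, 2005, 2010), value = runif(3))
dbWriteTable(db, "raw", chunk, overwrite = TRUE)

## Let SQLite do the row filtering, then pull back only what survives
kept <- dbGetQuery(db, "SELECT * FROM raw WHERE year >= 2000")
write.csv(kept, "filtered.csv", row.names = FALSE)

dbDisconnect(db)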