Philipp Pagel <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:
> On Wed, Mar 05, 2008 at 12:32:19PM +0100, Erika Frigo wrote:
>> My file has not only more than a million values, but more than a
>> million rows and more or less 30 columns (it is a production dataset
>> for cows); in fact, with read.table I'm not able to import it.
>> It is an xls file.

There is something very wrong here. Even the most recent version of Excel
can hold only just over a million rows, and earlier versions can't handle
one-tenth that number: the row limit was 65,536.

--
David Winsemius

> read.table() expects plain text -- e.g. CSV, or tab-separated in the
> case of read.delim(). If your file is in xls format, the simplest
> option would be to export the data to CSV format from Excel.
>
> If for some reason that is not an option, please have a look at the
> "R Data Import/Export" manual.
>
> Of course, neither will solve the problem of not enough memory if
> your file is simply too large. In that case you may want to put
> your data into a database and have R connect to it and retrieve the
> data in smaller chunks as required.
>
> cu
> Philipp
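To make the CSV route above concrete, a minimal sketch; the file name
"cows.csv", the separator, and the column handling are assumptions for
illustration, not details from the original post:

## Read a CSV exported from Excel; "cows.csv" is a hypothetical file name.
dat <- read.csv("cows.csv", header = TRUE, stringsAsFactors = FALSE)

## A tab-separated export would be read the same way with read.delim():
## dat <- read.delim("cows.txt", header = TRUE)

str(dat)  # check that the ~30 columns arrived with sensible types

If the column types are known in advance, supplying them through the
colClasses argument reduces both the time and the memory that
read.csv()/read.table() needs on a file of this size.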
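And a minimal sketch of the database route, assuming the DBI and RSQLite
packages are installed; the file, database, and table names are made up,
and it is assumed here that RSQLite's dbWriteTable() can import the CSV
file directly (if not, the table can be filled in pieces instead):

library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), dbname = "cows.db")

## Import the exported CSV into an SQLite table (done once).
dbWriteTable(con, "cows", "cows.csv", header = TRUE, sep = ",")

## Retrieve the rows in manageable chunks rather than all at once.
res <- dbSendQuery(con, "SELECT * FROM cows")
while (!dbHasCompleted(res)) {
    chunk <- dbFetch(res, n = 10000)   # 10,000 rows per chunk
    ## ... process 'chunk' here ...
}
dbClearResult(res)
dbDisconnect(con)

The same pattern works with any DBI backend (e.g. RMySQL or RPostgreSQL)
if the data already live in a database server.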