And, by the way, factors take up _more_ memory than character vectors.

> object.size(sample(c("a","b"), 1000, replace=TRUE))
4088 bytes
> object.size(factor(sample(c("a","b"), 1000, replace=TRUE)))
4296 bytes
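For what it's worth, most of that gap comes from the factor's attributes rather than the data: the values shrink to integer codes, but the 'levels' and 'class' attributes add a fixed overhead that dominates when the levels are this short. A minimal sketch of how to see that breakdown (sizes are illustrative and will differ by platform and R version):

# Compare where the bytes go in a character vector vs. a factor.
x <- sample(c("a", "b"), 1000, replace = TRUE)
f <- factor(x)

object.size(x)            # character vector: one pointer per element into R's string cache
object.size(f)            # factor: integer codes plus 'levels' and 'class' attributes
object.size(as.integer(f))  # the underlying integer codes alone, without attributes
attributes(f)             # the attributes that account for the extra overhead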
On Mon, Sep 14, 2009 at 11:35 PM, jim holtman <jholt...@gmail.com> wrote:
> When you read your file into R, show the structure of the object:
>
>   str(tab)
>
> and also the size of the object:
>
>   object.size(tab)
>
> This will tell you what your data looks like and how much memory it takes in R.
> Also, in read.table, use colClasses to define the format of the data; it may
> make reading faster. You might want to force a garbage collection with 'gc()'
> to see if that frees up any memory. If your input is about 2M lines and it
> looks like there are three columns (alpha, numeric, numeric), I would guess
> that the object.size will be about 50MB. This information would help.
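To make Jim's suggestions concrete, here is a minimal sketch; the file name "mydata.txt" and the exact column layout (one character column, two numeric columns) are assumptions based on the description above, so adjust them to your data:

# Read the file with explicit column classes (assumed layout), then inspect it.
tab <- read.table("mydata.txt",
                  header = TRUE,
                  colClasses = c("character", "numeric", "numeric"))

str(tab)           # structure: column types and a preview of the values
object.size(tab)   # memory the object occupies inside R

gc()               # force a garbage collection and report memory use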