I read a 138 Mbyte flat text file into R with a combination of scan() (to get the header) and read.table(). After converting the text time stamps to POSIXct and the integer codes to factors, I combine everything into one data frame and release the old structures containing the data with rm().
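
Roughly, the code looks like this; the file name, separator and column names are only placeholders for the real ones:

    ## read the header line, then the data
    hdr <- scan("data.txt", what = character(), nlines = 1, sep = "\t", quiet = TRUE)
    raw <- read.table("data.txt", skip = 1, sep = "\t",
                      col.names = hdr, stringsAsFactors = FALSE)

    ## convert the text time stamps and the integer codes
    tstamp <- as.POSIXct(raw$time, format = "%Y-%m-%d %H:%M:%S")
    code   <- factor(raw$code, levels = 1:3, labels = c("low", "mid", "high"))

    ## combine into one data frame and release the intermediates
    dat <- data.frame(tstamp, code, raw[, setdiff(hdr, c("time", "code"))])
    rm(raw, tstamp, code, hdr)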

Strangely, the rm() does not appear to reduce the memory in use; I checked with memory.size(). Worse still, the amount of memory required keeps growing. When I save an image, the .RData file is only 23 Mbyte, yet at some point later in the program, after nothing particularly demanding (two- and three-way frequency tables and some lattice graphs), the memory in use is over 1 Gbyte.
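
The check I did looks roughly like this (object names as in the sketch above):

    memory.size()               # memory in use before releasing the intermediates
    rm(raw, tstamp, code, hdr)  # release the old structures
    memory.size()               # hardly any smaller, and it keeps growing later on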

This is not yet a problem, but it will become one. I am using R 2.10.0 on Windows Vista.

Does anybody know how to release memory, as rm(dat) does not appear to do this properly?

Regards,
Alex van der Spek

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
