Re: [R] memory, i am getting mad in reading climate data

2012-03-18 Thread Prof Brian Ripley
On 17/03/2012 20:42, jim holtman wrote: Another suggestion is to start with a subset of the data file to see how much memory is required for your processing. One of the misconceptions is that "memory is free". People think that, with virtual memory and other such tools, there is no restriction …

Re: [R] memory, i am getting mad in reading climate data

2012-03-17 Thread Amen
Many, many thanks for your responses. -- View this message in context: http://r.789695.n4.nabble.com/memory-i-am-getting-mad-in-reading-climate-data-tp4480671p4481296.html Sent from the R help mailing list archive at Nabble.com.

Re: [R] memory, i am getting mad in reading climate data

2012-03-17 Thread Roy Mendelssohn
Hi All: > Every system has limits. If you have lots of money, then invest in a > 64-bit system with 100GB of real memory and you probably won't hit its > limits for a while. Otherwise, look at taking incremental steps and > possibly determining if you can partition the data. You might > consider …

Re: [R] memory, i am getting mad in reading climate data

2012-03-17 Thread jim holtman
Another suggestion is to start with a subset of the data file to see how much memory is required for your processing. One of the misconceptions is that "memory is free". People think that, with virtual memory and other such tools, there is no restriction on what you can do. Instead of starting …
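Holtman's advice, measure a subset and extrapolate, can be sketched in base R. The row counts and column names below are illustrative assumptions, not figures from the thread:

```r
# Sketch of the subset-and-measure approach: read a small sample,
# measure its memory footprint with object.size(), and scale up.
# All sizes here are made-up illustrative numbers.

n_total  <- 10e6    # rows in the full file (assumed)
n_sample <- 1e5     # rows actually read for the test

# Stand-in for the subset we would have read from disk
subset_df <- data.frame(
  temperature = rnorm(n_sample),
  pressure    = rnorm(n_sample)
)

bytes_sample <- as.numeric(object.size(subset_df))

# Linear scale-up gives a rough lower bound for the full object;
# intermediate copies made during processing can multiply it.
bytes_full_est <- bytes_sample * (n_total / n_sample)
cat(sprintf("estimated full size: %.2f GB\n", bytes_full_est / 2^30))
```

Because many R operations copy their arguments, the real peak usage is often several times this estimate, which is exactly why the thread recommends profiling a subset rather than trusting arithmetic alone.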

Re: [R] memory, i am getting mad in reading climate data

2012-03-17 Thread Uwe Ligges
On 17.03.2012 19:27, David Winsemius wrote: On Mar 17, 2012, at 10:33 AM, Amen wrote: I faced this problem when typing: temperature <- get.var.ncdf( ex.nc, 'Temperature' ) *unable to allocate a vector of size 2.8 GB* Read the R-Win-FAQ > By the way my computer memory is 4G and the original size of the file is 1.4G …

Re: [R] memory, i am getting mad in reading climate data

2012-03-17 Thread David Winsemius
On Mar 17, 2012, at 10:33 AM, Amen wrote: I faced this problem when typing: temperature <- get.var.ncdf( ex.nc, 'Temperature' ) *unable to allocate a vector of size 2.8 GB* Read the R-Win-FAQ By the way my computer memory is 4G and the original size of the file is 1.4G, a netcdf file. I …
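The usual way out, consistent with the subsetting advice elsewhere in the thread, is to read the variable in slices via the `start` and `count` arguments of `get.var.ncdf()` rather than pulling it in whole. A minimal sketch, assuming the old `ncdf` package from Amen's code and a variable laid out as (lon, lat, time); the file name and dimension layout are hypothetical:

```r
# Sketch: process a large netCDF variable one time step at a time
# instead of loading all of it at once.  Requires the 'ncdf'
# package; "climate.nc" and the (lon, lat, time) layout are
# assumptions for illustration.
library(ncdf)

ex.nc <- open.ncdf("climate.nc")
nt <- ex.nc$dim$time$len             # number of time steps

for (t in seq_len(nt)) {
  # start/count pick out one full (lon, lat) slab at time step t;
  # a count of -1 means "the whole extent of that dimension".
  slab <- get.var.ncdf(ex.nc, "Temperature",
                       start = c(1, 1, t),
                       count = c(-1, -1, 1))
  # ... accumulate whatever summary is needed, then discard ...
  rm(slab)
}
close.ncdf(ex.nc)
```

Each iteration then only needs one time step's worth of memory, so the 2.8 GB allocation never happens; the trade-off is more I/O and bookkeeping in user code.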