On 17/03/2012 20:42, jim holtman wrote:
Hi All:

> Every system has limits. If you have lots of money, then invest in a
> 64-bit system with 100GB of real memory and you probably won't hit its
> limits for a while. Otherwise, look at taking incremental steps and
> possibly determining if you can partition the data. You might
> consider [...]

Another suggestion is to start with a subset of the data file to see
how much memory is required for your processing. One of the
misconceptions is that "memory is free". People think that, with
virtual memory and other such tools, there is no restriction on
what you can do. Instead of starting [...]
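For example, to read just one time step of the 'Temperature' variable
discussed below and see how much memory that subset needs (a minimal
sketch: the file name and the (lon, lat, time) dimension order are
assumptions, not details taken from the thread):

    library(ncdf)                      # package providing get.var.ncdf()

    ex.nc <- open.ncdf("climate.nc")   # hypothetical file name

    ## Read a single time step instead of the whole variable, using the
    ## start/count arguments of get.var.ncdf(); count = -1 means "all
    ## values along that dimension".  The (lon, lat, time) ordering is an
    ## assumption; print(ex.nc) shows the file's real dimensions.
    slice <- get.var.ncdf(ex.nc, "Temperature",
                          start = c(1, 1, 1),
                          count = c(-1, -1, 1))

    print(object.size(slice), units = "Mb")   # memory used by this subset

    close.ncdf(ex.nc)

Scaling that figure by the number of time steps gives a rough idea of
what reading the full variable would need before you attempt it.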
Many, many thanks for your responses.
--
View this message in context:
http://r.789695.n4.nabble.com/memory-i-am-getting-mad-in-reading-climate-data-tp4480671p4481296.html
Sent from the R help mailing list archive at Nabble.com.
On 17.03.2012 19:27, David Winsemius wrote:
> On Mar 17, 2012, at 10:33 AM, Amen wrote:
>> I faced this problem when typing:
>>
>>   temperature <- get.var.ncdf( ex.nc, 'Temperature' )
>>   *unable to allocate a vector of size 2.8 GB*
>
> Read the R-Win-FAQ.
>
>> By the way my computer memory is 4G and the original size of the
>> file is 1.4G (a netcdf file).
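The R for Windows FAQ that David points to has an entry on memory
limits, which is the likely issue here: on a 32-bit build of R the whole
process gets at most about 2-4 GB of address space (depending on the
version of Windows), so a contiguous 2.8 GB vector is unlikely to fit no
matter how much RAM is installed. A minimal sketch of the usual checks
(nothing here is taken from the thread itself; memory.size() and
memory.limit() are Windows-only):

    sessionInfo()      # reports whether this is 32-bit or 64-bit R
    memory.size()      # MB currently used by R (Windows only)
    memory.limit()     # current allocation limit in MB (Windows only)

    ## On 64-bit R the limit may be raised (memory beyond physical RAM
    ## will be paged to disk and can be very slow):
    # memory.limit(size = 8000)   # request an 8 GB limit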