Hi All:

> Every system has limits.  If you have lots of money, then invest in a
> 64-bit system with 100GB of real memory and you probably won't hit its
> limits for a while.  Otherwise, look at taking incremental steps and
> possibly determining if you can partition the data.  You might
> consider a relational database to store the data so that it is easier
> to select a subset of data to process.

netCDF has some very simple mechanisms for reading in just part of the data, and 
they are well implemented in the ncdf and ncdf4 packages. Reading a subset this 
way is also very fast.

Try:

?get.var.ncdf

which will explain how to do so.
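As a minimal sketch of the idea (the file name "example.nc" and the assumption 
of a 3-dimensional Temperature variable are hypothetical; check your file's 
actual dimensions with print(ex.nc) first):

```r
library(ncdf)   # the ncdf4 package uses nc_open()/ncvar_get() instead

ex.nc <- open.ncdf("example.nc")   # hypothetical file name

# Read only the first 100 slices along the third dimension instead of the
# whole variable; a count of -1 means "all values of that dimension".
temperature <- get.var.ncdf(ex.nc, "Temperature",
                            start = c(1, 1, 1),
                            count = c(-1, -1, 100))

close.ncdf(ex.nc)
```

By looping over chunks like this you can process a variable far larger than 
available RAM, since only one slab is ever in memory at a time.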

-Roy M.


> 
> 
> 
> 2012/3/17 Uwe Ligges <lig...@statistik.tu-dortmund.de>:
>> 
>> 
>> On 17.03.2012 19:27, David Winsemius wrote:
>>> 
>>> 
>>> On Mar 17, 2012, at 10:33 AM, Amen wrote:
>>> 
>>>> I faced this problem when typing:
>>>> 
>>>> temperature <- get.var.ncdf( ex.nc, 'Temperature' )
>>>> 
>>>> *unable to allocate a vector of size 2.8 GB*
>>> 
>>> 

**********************
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**********************
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
1352 Lighthouse Avenue
Pacific Grove, CA 93950-2097

e-mail: roy.mendelss...@noaa.gov (Note new e-mail address)
voice: (831)-648-9029
fax: (831)-648-8440
www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.