Hello all,

I hate to add to the daily queries regarding R's handling of large
datasets ;), but...

I read in an online PowerPoint about the ff package that the "length
of an ff object" needs to be smaller than .Machine$integer.max. Does
anyone know if this means that the number of elements in an ff object
must be < .Machine$integer.max [i.e., that ff provides no help with
respect to the number of elements in a given object]? I've got a
matrix with 19e9 elements and, even though it would fit into my RAM
(using "raw" storage.mode), R won't let me store it because 19e9 is
>> .Machine$integer.max = 2^31 - 1.
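For concreteness, here is a small illustration of the wall I'm
hitting (the dimensions below are placeholders standing in for my
real matrix):

.Machine$integer.max            # 2147483647, i.e. 2^31 - 1
19e9 > .Machine$integer.max     # TRUE: more elements than one R vector can index
## so even a "raw" allocation along these lines fails for me:
## m <- matrix(raw(1), nrow = 1e5, ncol = 19e4)   # 1e5 * 19e4 = 1.9e10 elements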

Does anyone else have suggestions on how to deal with datasets this
massive? I'm exploring ncdf as we speak (rough sketch in the P.S.
below). Best,

Matt
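
P.S. In case it clarifies what I mean by "exploring ncdf": the idea
I'm trying is to write the matrix to a netCDF file in column blocks,
so that no single R vector ever has to hold all 19e9 elements. This
is only an untested sketch; the dimension sizes, variable names, and
block size are made up rather than taken from my real data.

library(ncdf)
xdim <- dim.def.ncdf("row", "index", 1:1e5)      # 1e5 rows (placeholder)
ydim <- dim.def.ncdf("col", "index", 1:19e4)     # 19e4 columns (placeholder)
v    <- var.def.ncdf("geno", "count", list(xdim, ydim), missval = -1,
                     prec = "byte")              # 1-byte storage; "short" if "byte" isn't available
nc   <- create.ncdf("big_matrix.nc", v)
for (j in seq(1, 19e4, by = 1e3)) {              # write one 1e5 x 1e3 block at a time
    block <- matrix(0L, nrow = 1e5, ncol = 1e3)  # stand-in data for this block
    put.var.ncdf(nc, v, block, start = c(1, j), count = c(1e5, 1e3))
}
close.ncdf(nc)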



-- 
Matthew C Keller
Asst. Professor of Psychology
University of Colorado at Boulder
www.matthewckeller.com
