Mike Marchywka <marchywka <at> hotmail.com> writes:

> Duncan Murdoch:
> > Vectors are limited to about 2 billion entries (2^31 - 1). Matrices are
> > vectors, so that limit applies to the total count of entries.
> > Dataframes are lists of vectors, so that limit applies separately to the
> > numbers of rows and columns.
> >
> > Simple R code keeps everything in memory, so you're likely to run into
> > hardware limits if you start working with really big vectors. There are
> > a number of packages that alleviate that by paging data in and out, but
> > it takes a bit of work on your part to use them. As far as I know,
> 
> Do you have more details here? 

 [snip]

  The best starting point is the "high performance computing" task
view on CRAN, which gives an (as far as I know) up-to-date description
of the various packages available for handling large/out-of-memory
data sets. Some of these are RDBMS interfaces, some are systems for
file-backed objects, and some are out-of-memory algorithms, such as
lm or glm fits for big objects.
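
To make the size limits quoted above concrete, a small illustration
(the exact error text can vary by R version and platform):

  ## Vectors are limited to 2^31 - 1 entries:
  .Machine$integer.max
  ## [1] 2147483647

  ## A matrix is a vector with a dim attribute, so the limit is on
  ## the total entry count: 50000 * 50000 = 2.5e9 > 2^31 - 1, so
  ## this errors (and would need ~20 GB of RAM even if it didn't):
  ## m <- matrix(0, nrow = 50000, ncol = 50000)

  ## A data frame is a list of column vectors, so the limit applies
  ## to each column's length (rows) and to the number of columns
  ## separately.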
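
And as a sketch of the out-of-memory idea, the biglm package from
that task view fits a linear model one chunk at a time (the file name
and column names here are made up for illustration):

  library(biglm)

  con <- file("mydata.csv", open = "r")  # hypothetical file: y, x1, x2
  chunk_size <- 100000

  ## Fit on the first chunk (read.csv picks up the header here) ...
  chunk <- read.csv(con, nrows = chunk_size)
  cols  <- names(chunk)
  fit   <- biglm(y ~ x1 + x2, data = chunk)

  ## ... then fold in the remaining chunks, so only one chunk is
  ## ever held in memory at a time.
  repeat {
    chunk <- tryCatch(
      read.csv(con, header = FALSE, col.names = cols,
               nrows = chunk_size),
      error = function(e) NULL)  # read.csv errors at end of file
    if (is.null(chunk)) break
    fit <- update(fit, chunk)
  }
  close(con)
  summary(fit)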

