I've read in Phil Spector's new book that it's a good idea to  
preallocate a big matrix, like

u <- matrix(0, nrow, ncol) # (1)

Now, I read the contents of a huge matrix from a Fortran binary dump:

u <- readBin(con, what="double", n=nrow*ncol) # (2)

If I do (1) and then (2), u is a vector; obviously it's either
reallocated or its matrix nature is lost -- overridden?  overwritten?
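For reference, here is a minimal reproducible version of what I'm seeing, using a rawConnection in place of the actual Fortran dump file (the connection and the tiny 2x3 size are just stand-ins for illustration):

```r
nrow <- 2; ncol <- 3
u <- matrix(0, nrow, ncol)         # (1) preallocate: u has dim c(2, 3)

# simulate the Fortran binary dump with an in-memory raw connection
con <- rawConnection(writeBin(as.double(1:6), raw()))
u <- readBin(con, what = "double", n = nrow * ncol)  # (2) rebinds u
close(con)

dim(u)        # NULL -- the preallocated matrix is gone; u is a plain vector
is.matrix(u)  # FALSE
```

So the assignment in (2) doesn't fill the preallocated matrix; it rebinds the name u to a brand-new vector, and the old matrix becomes garbage.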

Instead, I do it now as

u <- matrix(readBin(con, what="double", n=nrow*ncol), nrow=nrow, ncol=ncol) # (3)

What's going on with memory management here, what's the right way
to make this efficient -- and how would I preallocate?
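One alternative I've been experimenting with -- a sketch, on the assumption that setting the dim attribute directly avoids the extra intermediate that wrapping the vector in matrix() can create -- is to read the vector and then just give it dimensions (again with a rawConnection standing in for the dump file):

```r
nrow <- 2; ncol <- 3
con <- rawConnection(writeBin(as.double(1:6), raw()))
u <- readBin(con, what = "double", n = nrow * ncol)  # plain vector
close(con)

dim(u) <- c(nrow, ncol)  # attach dimensions; no call to matrix() needed
is.matrix(u)             # TRUE
```

Note that readBin fills the data in the order it appears in the file, and dim<- interprets that as column-major order, which matches Fortran's storage convention.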

After that, I'm saving u as an R binary object in an rda file.  Does it
make sense to preallocate u before reading it back from the rda file?
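Here's the small test I ran to check my intuition -- assuming load() simply rebinds the name to the saved object, the same way assignment does, in which case preallocating first would be pointless (the tempfile and toy matrix are just for illustration):

```r
u <- matrix(as.double(1:6), 2, 3)
f <- tempfile(fileext = ".rda")
save(u, file = f)        # serialize u to an rda file
rm(u)

u <- matrix(0, 2, 3)     # "preallocate" before reading back...
load(f)                  # ...but load() just rebinds u to the saved object
unlink(f)

identical(u, matrix(as.double(1:6), 2, 3))  # TRUE: preallocation discarded
```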

Cheers,
Alexy

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.