>>>>> "MZ" == Mauricio Zambrano <hzambran.newsgro...@gmail.com> >>>>> on Mon, 17 Jan 2011 11:46:44 +0100 writes:
    MZ> Dear R community,

    MZ> I'm running 32-bit R on a 64-bit machine (with 16 Gb of RAM) using a
    MZ> PAE kernel, as you can see here:

    MZ> $ uname -a
    MZ> Linux mymachine 2.6.18-238.el5PAE #1 SMP Sun Dec 19 14:42:44 EST 2010
    MZ> i686 i686 i386 GNU/Linux

    MZ> When I try to create a large matrix ( Q.obs <- matrix(NA, nrow=6940,
    MZ> ncol=9000) ), I get the following error:

    >> Error: cannot allocate vector of size 238.3 Mb

    MZ> However, the amount of free memory in my machine seems to be much
    MZ> larger than this:

    MZ> system("free")
    MZ>                 total       used       free     shared    buffers     cached
    MZ> Mem:         12466236    6354116    6112120          0      67596    2107556
    MZ> -/+ buffers/cache:       4178964    8287272
    MZ> Swap:        12582904          0   12582904

    MZ> I tried to increase the memory limit available to R by using:

    MZ> $ R --min-vsize=10M --max-vsize=5000M --min-nsize=500k --max-nsize=5000M

    MZ> but it didn't work.

    MZ> Any hint about how I can get R to use all the memory available in the
    MZ> machine?

Install a 64-bit version of Linux (e.g. a 64-bit Ubuntu) together with a
64-bit build of R, and work from there.  I don't think there is a way
around that: a PAE kernel lets the machine address more than 4 Gb of
physical RAM, but every 32-bit process, R included, is still confined to a
32-bit virtual address space (roughly 3 Gb of usable user space on Linux),
and a large vector must in addition fit into one contiguous block of it.
That is why the allocation fails even though "free" reports several
gigabytes available, and why the --max-vsize / --max-nsize options cannot
raise the limit.

Martin
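
P.S.  For what it's worth, a rough back-of-the-envelope check (a sketch
added here, not part of the original exchange) of where the 238.3 Mb
figure comes from -- it is only the size of the single new vector being
requested, not the total memory the session needs:

    ## matrix(NA, ...) creates a *logical* matrix, stored at 4 bytes per element
    6940 * 9000 * 4 / 1024^2    # ~238.3 Mb, matching the error message
    ## the same matrix filled with doubles would need twice as much
    6940 * 9000 * 8 / 1024^2    # ~476.6 Mb
    ## check whether the running R is a 32- or 64-bit build
    .Machine$sizeof.pointer     # 4 on a 32-bit build, 8 on a 64-bit build

On a 32-bit build even a single allocation of that size can fail once the
process address space is fragmented, regardless of how much physical RAM
"free" reports.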