Hi! Thanks for your reply! After running the command below, I am certain I am using a 64-bit R. I am running R on a Linux cluster where R is installed globally for all users. I have asked the system administrators whether they would update their version of R, but they are not receptive to making the change. If I must, I will try to install an updated version in my local directory.
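Besides the pointer size shown below, these are a couple of other quick checks I ran to rule out a 32-bit build (just a sketch; the comments are my own notes):

    R.version$arch      # reports "x86_64" for this build
    R.version.string    # the exact release in use (still 2.6.2 here)
    sessionInfo()       # platform string plus attached package versions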
Before installing locally, though, I want to make sure there are no other underlying issues I should consider. Could there be something else I need to look into?

> .Machine$sizeof.pointer
[1] 8

Thank you!

2010/5/21 Uwe Ligges <lig...@statistik.tu-dortmund.de>:

> At first, I'd try with an R version from 2010 rather than one from 2007.
> Next, I'd try to be sure to really have a 64-bit version of R rather than
> a 32-bit one, which is what I suspect.
>
> Best,
> Uwe Ligges
>
>
> On 20.05.2010 20:10, Yesha Patel wrote:
>
>> I've looked through all of the posts about this issue (and there are
>> plenty!), but I am still unable to solve the error:
>>
>> ERROR: cannot allocate vector of size 455 Mb
>>
>> I am using R 2.6.2 - x86_64 on a Linux x86_64 Red Hat cluster system.
>> When I log in, based on the specs I provide [qsub -I -X -l arch=x86_64],
>> I am randomly assigned to an x86_64 node.
>>
>> I am using package GenABEL. My data (~650,000 SNPs, 3,000 people) loads
>> in fine, and I am able to look at it using basic commands [nids, nsnps,
>> names(phdata)].
>>
>> The problem occurs when I try to run the extended analysis:
>>
>> xs <- mlreg(GASurv(age, dm2) ~ sex, dta)
>>
>> ******************
>>
>> 1) I have looked at the memory limits in R:
>>
>> mem.limits()
>> nsize vsize
>>    NA    NA
>>
>> 2) Garbage collection:
>>
>> gc()
>>            used  (Mb) gc trigger   (Mb) max used  (Mb)
>> Ncells   961605  51.4    1710298   91.4  1021138  54.6
>> Vcells 64524082 492.3  248794678 1898.2 68885474 525.6
>>
>> gc(reset=TRUE)
>>            used  (Mb) gc trigger   (Mb) max used  (Mb)
>> Ncells   961119  51.4    1710298   91.4   961119  51.4
>> Vcells 64523417 492.3  199035742 1518.6 64523417 492.3
>>
>> 3) Linux memory allocation (note: max memory size, virtual memory and
>> stack size are all unlimited):
>>
>> bash-3.2$ ulimit -a
>> core file size          (blocks, -c) 0
>> data seg size           (kbytes, -d) unlimited
>> scheduling priority             (-e) 0
>> file size               (blocks, -f) unlimited
>> pending signals                 (-i) 31743
>> max locked memory       (kbytes, -l) 32
>> max memory size         (kbytes, -m) unlimited
>> open files                      (-n) 32768
>> pipe size            (512 bytes, -p) 8
>> POSIX message queues     (bytes, -q) 819200
>> real-time priority              (-r) 0
>> stack size              (kbytes, -s) unlimited
>> cpu time               (seconds, -t) unlimited
>> max user processes              (-u) 31743
>> virtual memory          (kbytes, -v) unlimited
>> file locks                      (-x) unlimited
>>
>> 4) free -mt
>>
>>                      total   used   free  shared  buffers  cached
>> Mem:                  3901     99   3802       0        1      24
>> -/+ buffers/cache:              73   3827
>> Swap:                 1027     37    990
>> Total:                4929    136   4792
>>
>> 5) ps -u
>>
>> USER     PID  %CPU %MEM   VSZ  RSS TTY   STAT START TIME COMMAND
>> xxxxxx 22352   0.0  0.0 65136  956 pts/0 S    03:09 0:00 -tcsh
>> xxxxxx 22354   0.0  0.0 13496 1792 pts/0 S    03:09 0:00 /usr/sbin/pbs_m
>> xxxxxx 22355   0.0  0.0  6232   60 pts/0 S    03:09 0:00 pbs_demux
>> xxxxxx 29872   0.0  0.0 63736  920 pts/0 R+   09:45 0:00 ps -u
>>
>> ******************
>>
>> Any solutions? Thank you!
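P.S. In case it helps to see the numbers side by side, this is the back-of-the-envelope arithmetic I did on the figures above (a sketch of my reasoning only, not output from the cluster; the 8-bytes-per-double assumption is mine):

    455 * 2^20 / 8            # the failing request: ~59.6 million doubles in one vector
    51.4 + 492.3              # Mb already held by R (Ncells + Vcells) when gc() was run
    3901                      # Mb of physical RAM on the node, per 'free -mt'
    650000 * 3000 * 8 / 2^20  # ~14877 Mb if the genotypes were ever expanded to a plain double matrix

So even with a 64-bit build, I am wondering whether the ~4 GB on these nodes may simply be too little for mlreg() over this many SNPs.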