Thanks, Anupam, for your inputs. I believe there are two ways to circumvent the issue: (1) making the code more efficient, or (2) increasing the available memory in some way. I did not paste the code because (1) it is long and uses quite a number of functions I wrote myself, and (2) my intention is not to make the code more efficient, even if that were possible. Running into a memory problem with 2 GB of RAM is natural here, as my analysis entails 1500 simulations from large multivariate distributions that change after every simulation, and tomorrow I may have to do a similar analysis with 10 million observations x 20 columns.
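To give a rough sense of the scale involved, here is a back-of-the-envelope sketch in R (this assumes all-numeric, double-precision columns, which is an assumption on my part; the real data may be mixed types):

    rows <- 10e6                               # 10 million observations
    cols <- 20
    rows * cols * 8 / 1024^3                   # ~1.5 GB for one copy (8 bytes per double)

    # The same estimate from a small prototype, scaled up:
    x <- matrix(rnorm(1e5 * 20), ncol = 20)    # 1e5-row prototype
    as.numeric(object.size(x)) * 100 / 1024^3  # approximate size of 1e7 rows, in GB

    # R routinely makes two or three temporary copies during subsetting
    # and model fitting, so the working set will be several times larger.

So even a single in-memory copy of such a data set is already around the 1.5 GB mark at which the allocation errors appear, which is why I think more addressable memory, rather than code tuning, is the direction I need.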
In view of the above I shall need more memory at some point, and my IT colleagues are ready to support me on that (probably with a sandbox), but I am not sure whether I can install a server version of R capable of working with 8 GB or so of RAM. So it is more technical help that I need, and I have no idea how plausible the solution I mentioned is (i.e. a server version of R that can address more memory). I have appended the checks I intend to run on my current setup at the end of this message, below the quoted thread.

Regards,
Abhisek

On Sat, Jun 11, 2011 at 10:10 AM, Anupam <anupa...@gmail.com> wrote:
>
> It will be helpful on this forum to use metric measures: 12 lakh is 1.2
> million, thus your data is 1.2 million observations x 15 variables. I do
> not know the intricacies of R. You may have to wait for someone with that
> knowledge to respond.
>
> Including some relevant portions of error messages and code in your query
> can also help someone to respond to your message.
>
> Anupam.
>
> -----Original Message-----
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On
> Behalf Of Abhisek Saha
> Sent: Saturday, June 11, 2011 6:25 AM
> To: r-help@r-project.org
> Subject: [R] Memory(RAM) issues
>
> Dear All,
> I have been working with R (desktop version) on Vista, with the latest
> version, R 2.13.0. I have been working with a few data sets of size
> 12 lakh x 15 (1.2 million observations x 15 variables), and my code is
> quite computing intensive (applying an MCMC Gibbs sampler to a posterior
> of 144 variables). It often runs into a memory error, something like
> "cannot allocate vector", with memory use shown to have reached about
> 1.5 GB, or words to that effect. I have 2 GB of RAM. I checked options
> like memory.size, and the documentation says that a 64-bit R running on
> 64-bit Windows can address up to 8 TB of memory.
>
> I do not have the background to understand the differences between 32-bit
> and 64-bit machines and other technical requirements such as servers, but
> it would be of great help if anyone could give me a feel for it. Could any
> of you tell me whether some server version of R would resolve my issue
> (I am not sure what kind of server my company would allow R to be
> installed on at this point, maybe a Linux one), and if so, could any of
> you guide me on how to go about installing it on a server?
>
> Thank you,
> Abhisek
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
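P.S. For whoever picks this up, here is a minimal sketch of what I can run to report my current build and memory limits (it assumes the Windows build of R, since memory.size() and memory.limit() are Windows-only; on other platforms only the first two lines apply):

    .Machine$sizeof.pointer   # 4 = 32-bit R, 8 = 64-bit R
    R.version$arch            # e.g. "i386" (32-bit) vs "x86_64" (64-bit)
    memory.limit()            # current address-space cap, in MB (Windows only)
    memory.size(max = TRUE)   # most memory obtained from Windows so far, in MB
    gc()                      # current allocations and garbage-collection triggers

As far as I understand, if the first line shows 4, adding RAM alone will not help until I move to a 64-bit build of R on a 64-bit operating system, whether on my desktop or on a server.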