1. Don't allocate it.
2. If it was, would it make a difference?
Seriously, some algorithms need more memory than others, and some packages are
more wasteful than others. R is not monolithic... sometimes you just have to
roll up your sleeves or buy more memory.
I checked the issue on different forums like Stack Overflow. The issue is
related to the Linux Out Of Memory (OOM) killer, which kills processes that
consume too much memory and swap. I started monitoring memory and swap
consumption while the VAR model was running. R consumed all 32 GB of RAM
and ...
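A minimal sketch for watching memory from inside R as well, alongside htop; the object 'x' below is an assumption standing in for the real input data matrix:

  x <- matrix(0, nrow = 604800, ncol = 199)  # stand-in for the real data matrix
  format(object.size(x), units = "Gb")       # memory held by this one object
  gc()                                       # runs a collection and reports current
                                             # and maximum memory used by the session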
I am using *R version 3.0.2* (2013-09-25) on an Ubuntu desktop (*Ubuntu
14.04.4 LTS*). I am running a *VAR model* on a matrix with 199 columns and
604800 rows. The server has 12 cores and 32 GB of memory. While the model is
running, I checked CPU and memory consumption using the 'htop' Linux command.
I observe ...
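For scale, the raw input matrix alone already needs close to a gigabyte before any model matrices are built; quick arithmetic in base R, assuming double-precision storage at 8 bytes per cell:

  604800 * 199 * 8 / 2^30   # ~0.9 GiB for the 604800 x 199 data matrix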
Standard reply (see posting guide):
Update to the current version of R (3.3.0 or so) and retry. Your
version is old -- this often leads to incompatibilities with newer
software versions.
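For checking a setup from within R itself (base R plus the standard packageVersion() helper, nothing specific to this thread assumed):

  R.version.string        # running R version, e.g. "R version 3.0.2 (2013-09-25)" above
  packageVersion("vars")  # version of the vars package that is installed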
Cheers,
Bert
Bert Gunter
"The trouble with having an open mind is that people keep coming along
and sticking
Wild guess: You have a huge, high-dimensional VAR model, i.e. the
matrices get huge and you use more memory than is physically available.
The operating system protects itself by killing processes in such a
case...
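A rough sketch of why the matrices grow so fast, assuming K = 199 series and T = 604800 observations as described elsewhere in the thread (the function name and the lag range are illustrative, not from the post):

  var_design_gb <- function(K, T, p, bytes = 8) {
    rows <- T - p
    cols <- K * p + 1           # p lags of every series plus a constant
    rows * cols * bytes / 2^30  # GiB for one copy of the lagged regressor matrix
  }
  sapply(1:5, function(p) round(var_design_gb(199, 604800, p), 1))
  # roughly 0.9 1.8 2.7 3.6 4.5 GiB per copy; the least-squares fit behind
  # the estimation typically holds several working copies at once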
Best,
Uwe Ligges
On 31.05.2016 20:29, Vivek wrote:
Hi,
I am using the vars package (vector autoregressive models). The process gets
killed after running for some time. Following is the output of R.
vivek@isds-research:~/cloudAuction/padding/panel$ cat var.Rout
R version 3.0.2 (2013-09-25) -- "Frisbee Sailing"
Copyright (C) 2013 The R Foundation for Statistical Computing
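A minimal sketch of the kind of call being described, assuming the vars package is the one in use; the synthetic data, column subset, and low lag order below are illustrative memory-saving first steps, not from the post:

  library(vars)

  ## synthetic stand-in for a slice of the real data (the real matrix is 604800 x 199)
  set.seed(1)
  y_small <- matrix(rnorm(5000 * 20), ncol = 20)
  colnames(y_small) <- paste0("s", 1:20)

  fit <- VAR(y_small, p = 1, type = "const")  # low lag order keeps the design matrix small
  summary(fit)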