Your example is NOT self-contained, which means that any potential respondent must guess what you mean by "a function with a variable of almost 200,000". The following clarifies this:

> start0 <- rep(1, 200000)
> msLE2 <- function(x) sum(x^2)
> nlminb(start = start0, msLE2, control = list(x.tol = .001))
Error in vector("double", length) : vector size specified is too large


"traceback()" reveals that this error message was generated in by 'vector("double", length)', where length = 130 + (n * (n + 27))/2), and n = length(start) = 200,000 in this case. This is 20e9 double precision numbers or 160 GB. This suggests you need to rethink what you are trying to do.

In my opinion, in any problem with more than a fairly small number of unknowns (e.g., 3 or 12, depending on the complexity of the problem), the vast majority of the unknowns will be better estimated by treating them as different samples from some abstract population: first estimate the hyperparameters of that population, then estimate the individuals conditioned on those hyperparameters.

The most general tools for that kind of thing in R are in the 'nlme' and 'lme4' packages; a minimal sketch of the idea follows below. To understand those packages, I highly recommend Pinheiro and Bates (2000) Mixed-Effects Models in S and S-PLUS (Springer). If your observations cannot reasonably be described by mixed-effects models with normal errors, a second reference is Gelman and Hill (2006) Data Analysis Using Regression and Multilevel/Hierarchical Models (Cambridge University Press). If neither of those seems adequate to your problem, I suggest you try the "RSiteSearch.function" in the RSiteSearch package to look for other capabilities in R related to your particular application.
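For illustration, here is a minimal sketch of that hierarchical approach using 'lme4'. The data frame, variable names, and sizes are hypothetical; the point is that a model with a thousand group-level unknowns is fit by estimating only two hyperparameters:

> library(lme4)
>
> ## Hypothetical data: 1000 group-level "unknowns", 20 observations each
> set.seed(1)
> dat <- data.frame(
+   group = factor(rep(1:1000, each = 20)),
+   y     = rnorm(20000)
+ )
>
> ## Estimate two hyperparameters (between- and within-group variance)
> ## rather than 1000 free parameters:
> fit <- lmer(y ~ 1 + (1 | group), data = dat)
>
> ## Individual group estimates, conditioned on the hyperparameters:
> head(ranef(fit)$group)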

Hope this helps.
Spencer Graves

David Winsemius wrote:

On May 30, 2009, at 2:19 PM, popo UBC wrote:

Hello everyone!

When I use "nlminb" to minimize a function with a variable of almost 200,000
dimension, I got the following error.

nlminb(start=start0, msLE2, control = list(x.tol = .001))
Error in vector("double", length) : vector size specified is too large
I had the following settings:

options(expressions=60000)
options(object.size=10^15)

That would do nothing on my machine, but then you may have a different (unspecified) OS. You may also have unrealistic expectations: 10^15 seems a bit optimistic to me, even if you were supplying that number in a manner that R would recognize.

?mem.limits   #  should give you information specific to your OS.
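For example, on a Windows build of R (an assumption, since you did not say which OS you use), something along these lines reports and adjusts the cap:

> ## Windows-only sketch; these functions do nothing useful on other OSes:
> mem.limits()               # current limits on cons and vector cells
> memory.limit()             # current memory cap, in MB
> memory.limit(size = 4000)  # request a larger cap, subject to the OS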

If you use Windows, try also:

http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

http://cran.r-project.org/bin/windows/base/rw-FAQ.html#How-do-I-install-R-for-Windows_003f


I have no idea what might be wrong. Any suggestions are highly
appreciated!!

And we have no idea what sort of setup you have. You could, of course, read the specifics for your OS in the Installation Guide:

cran.r-project.org/doc/manuals/R-admin.pdf


