Hi,
You really need to study the documentation of "optim" carefully before you make
broad generalizations. There are several algorithms available in optim. The
default is a simplex-type algorithm called Nelder-Mead. I think this is an
unfortunate choice as the default algorithm. Nelder-Mead
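For readers following along, a minimal sketch of switching methods (this is essentially the Rosenbrock example from `?optim`; the start value is illustrative):

```r
# Rosenbrock banana function, as in the ?optim examples; minimum at (1, 1).
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
grr <- function(x) {            # analytic gradient, used by "BFGS"
  c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
     200 * (x[2] - x[1]^2))
}

res_nm   <- optim(c(-1.2, 1), fr)                       # default: Nelder-Mead
res_bfgs <- optim(c(-1.2, 1), fr, grr, method = "BFGS") # gradient-based
```

Both reach the minimum here, but the gradient-based method gets much closer with far fewer function evaluations.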
And there I caught myself with the next blooper: it wasn't Ben Bolker, it was
Bert Gunter who pointed that out. :)
Daniel Malter wrote:
>
> Ben Bolker sent me a private email rightfully correcting me that I was
> factually wrong when I wrote that ML /is/ a numerical method (I had
> written sloppily
Ben Bolker sent me a private email rightfully correcting me that I was
factually wrong when I wrote that ML /is/ a numerical method (I had written
sloppily and under time pressure). He is of course right to point out that
not all maximum likelihood estimators require numerical methods to solve.
Furth
Oh, I think I got it. Commercial packages limit the number of decimals
shown.
--
View this message in context:
http://r.789695.n4.nabble.com/Poor-performance-of-Optim-tp3862229p3864271.html
Sent from the R help mailing list archive at Nabble.com.
What I tried is just a simple binary probit model: create random data and
use "optim" to maximize the log-likelihood function to estimate the
coefficients (e.g. u = 0.1 + 0.2*x + e, where e is standard normal, and y =
(u > 0), y indicating a binary choice variable).
If I estimate coefficient of "x
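A minimal sketch of that setup (the sample size, seed, and starting values are my own assumptions; `glm()` with a probit link serves as an independent cross-check):

```r
set.seed(1)
n <- 5000
x <- rnorm(n)
y <- as.numeric(0.1 + 0.2 * x + rnorm(n) > 0)   # u = 0.1 + 0.2*x + e, y = (u > 0)

# Negative log-likelihood of the binary probit model.
negll <- function(b) {
  p <- pnorm(b[1] + b[2] * x)
  -sum(y * log(p) + (1 - y) * log(1 - p))
}

fit <- optim(c(0, 0), negll, method = "BFGS")
chk <- glm(y ~ x, family = binomial(link = "probit"))  # same model, for comparison
```

If `optim` is doing its job, `fit$par` should agree with `coef(chk)` to several decimals, and both should be near the true (0.1, 0.2) up to sampling error.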
Thank you for your response!
But the problem is when I estimate a model without knowing the true
coefficients, how can I know which "reltol" is good enough? "1e-8" or
"1e-10"? Why can commercial packages automatically determine the right
"reltol" but R cannot?
With respect, your statement that R's optim does not give you a reliable
estimator is bogus. As pointed out before, this would depend on when optim
believes it's good enough and stops optimizing. In particular, if you
stretch out x, then it is plausible that the likelihood function will become flat
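As a hedged illustration of the scaling point (the data-generating values, sample size, and the `parscale` remedy below are my own choices, not from the thread): if x is multiplied by 1000, the slope coefficient shrinks by the same factor and the likelihood surface becomes badly scaled. `optim`'s `parscale` control is one way to compensate.

```r
set.seed(2)
x <- rnorm(500) * 1000                      # "stretched out" regressor
y <- as.numeric(0.1 + 0.0002 * x + rnorm(500) > 0)

negll <- function(b) {
  p <- pnorm(b[1] + b[2] * x)
  p <- pmin(pmax(p, 1e-12), 1 - 1e-12)      # guard against log(0)
  -sum(y * log(p) + (1 - y) * log(1 - p))
}

# Tell optim that the two parameters live on very different scales.
fit <- optim(c(0, 0), negll, method = "BFGS",
             control = list(parscale = c(1, 1e-3)))
```

The same run without `parscale` is the kind of case where a solver can declare victory on a nearly flat surface well before the slope is pinned down.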
Have you considered the "optimx" package? I haven't tried it,
but it was produced by a team of leading researchers in nonlinear
optimization, including those who wrote most of "optim"
(http://user2010.org/tutorials/Nash.html) years ago.
There is a team actively working on this
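A sketch of what that looks like (this assumes the CRAN package `optimx` is installed; the calling convention is taken from its documentation and may differ across versions):

```r
# Compare several optimizers on the same problem in one call.
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2   # Rosenbrock again

if (requireNamespace("optimx", quietly = TRUE)) {
  res <- optimx::optimx(c(-1.2, 1), fr,
                        method = c("Nelder-Mead", "BFGS"))
  print(res)   # one row per method: parameters, value, convergence code
}
```

Having all methods tabulated side by side makes it much harder to mistake "one method stopped early" for "R cannot optimize this".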
On 01/10/11 08:12, yehengxin wrote:
I used to consider using R and "Optim" to replace my commercial packages:
Gauss and Matlab. But it turns out that "Optim" does not converge
completely.
What does "completely" mean?
The same data converge very well in Gauss and Matlab. I see that there are
too many packages based on "optim"
Is there a question or point to your message or did you simply feel
the urge to inform the entire R-help list of the things that you
consider?
Josh
On Fri, Sep 30, 2011 at 11:12 PM, yehengxin wrote:
> I used to consider using R and "Optim" to replace my commercial packages:
> Gauss and Matlab.
-----Original Message-----
From: r-help-boun...@r-project.org on behalf of yehengxin
Sent: Sat 10/1/2011 8:12 AM
To: r-help@r-project.org
Subject: [R] Poor performance of "Optim"
I used to consider using R and "Optim" to replace my commercial packages:
Gauss and Matlab. B
I used to consider using R and "Optim" to replace my commercial packages:
Gauss and Matlab. But it turns out that "Optim" does not converge
completely. The same data converge very well in Gauss and Matlab. I
see that there are too many packages based on "optim" and really doubt if
they can