On Nov 20, 2011, at 7:26 PM, 屠鞠传礼 wrote:
Thank you very much :)
I searched the net and found that the response in a logistic model can
sometimes have more than 2 values; this kind of regression is called
"Ordinal Logistic Regression", and it can even be fitted the same way,
I mean with glm in R.
Here are some references:
1. http://en.wikipedia.org/wiki/Ordered_logit
2. http://www.stat.ubc.ca/~rollin/teach/643w04/lec/node62.html
The above two explain what "Ordinal Logistic Regression" is.
3. http://www.ats.ucla.edu/stat/r/dae/ologit.htm
This one shows that we can use glm to model it.
When I looked through the UCLA code, it appeared they were using the
Design package (now superseded by the `rms` package) and that the
function was `lrm` rather than `glm`. In addition to Harrell's
excellent text, which has a full chapter on this topic, you might also
want to look at Laura Thompson's Companion to Agresti's text:
https://home.comcast.net/~lthompson221/Splusdiscrete2.pdf
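A minimal sketch of fitting such a proportional-odds model might look
like the following; the data frame `dat` and the predictor names
`snp1`/`snp2` are placeholders for illustration, not your actual data:

## sketch only: assumes a data frame 'dat' with a 4-level ordered
## response 'diagnostic' and two factor predictors 'snp1' and 'snp2'
library(MASS)                        # polr() fits proportional-odds models
dat$diagnostic <- factor(dat$diagnostic, levels = 0:3, ordered = TRUE)
fit <- polr(diagnostic ~ snp1 + snp2, data = dat, Hess = TRUE)
summary(fit)

## the same kind of model with lrm() from the rms package
library(rms)
fit2 <- lrm(diagnostic ~ snp1 + snp2, data = dat)
fit2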
--
David.
On 2011-11-21 00:56:33, "Uwe Ligges" <lig...@statistik.tu-dortmund.de> wrote:
On 20.11.2011 17:27, 屠鞠传礼 wrote:
I was worried about that too. Do you have an idea what tools I could use?
It depends on your aims - what you want to do with the fitted model.
A multinomial model (see the sketch below), some kind of discriminant
analysis (lda, qda), tree-based methods, svm and so on come to mind.
You probably want to discuss this on some statistics mailing
list/forum or among local experts rather than on the R list, since
this is actually not that R related.
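For instance, an (unordered) multinomial model could be fitted with
multinom() from the nnet package; the data frame and variable names
below are placeholders, not your data:

## sketch only: multinomial logistic model for a 4-level response
library(nnet)                         # multinom() for unordered responses
dat$diagnostic <- factor(dat$diagnostic)
mfit <- multinom(diagnostic ~ snp1 + snp2, data = dat)
summary(mfit)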
Uwe Ligges
On 2011-11-21 00:13:26, "Uwe Ligges" <lig...@statistik.tu-dortmund.de> wrote:
On 20.11.2011 16:58, 屠鞠传礼 wrote:
Thank you, Ligges :)
One more question:
my response variable "diagnostic" has 4 levels (0, 1, 2 and 3), so
I use it like this:
"as.factor(diagnostic) ~ as.factor(7161521) + as.factor(2281517)"
Is that all right?
Uhh, 4 levels? Then I doubt logistic regression is the right tool for
you. Please revisit the theory first: it is intended for 2 levels...
Uwe Ligges
On 2011-11-20 23:45:23, "Uwe Ligges" <lig...@statistik.tu-dortmund.de> wrote:
On 20.11.2011 12:46, tujchl wrote:
Hi,
I use glm in R to do logistic regression and treat both the response
and the predictors as factors.
In my first try:
*******************************************************************************
Call:
glm(formula = as.factor(diagnostic) ~ as.factor(7161521) +
as.factor(2281517), family = binomial())
Deviance Residuals:
Min 1Q Median 3Q Max
-1.5370 -1.0431 -0.9416 1.3065 1.4331
Coefficients:
Estimate Std. Error z value Pr(>|z|)
(Intercept) -0.58363 0.27948 -2.088 0.0368 *
as.factor(7161521)2 1.39811 0.66618 2.099 0.0358 *
as.factor(7161521)3 0.28192 0.83255 0.339 0.7349
as.factor(2281517)2 -1.11284 0.63692 -1.747 0.0806 .
as.factor(2281517)3 -0.02286 0.80708 -0.028 0.9774
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
(Dispersion parameter for binomial family taken to be 1)
Null deviance: 678.55 on 498 degrees of freedom
Residual deviance: 671.20 on 494 degrees of freedom
AIC: 681.2
Number of Fisher Scoring iterations: 4
*******************************************************************************
And I refit it, this time *wanting no intercept*:
*******************************************************************************
Call:
glm(formula = as.factor(diagnostic) ~ as.factor(2281517) +
as.factor(7161521) - 1, family = binomial())
Deviance Residuals:
Min 1Q Median 3Q Max
-1.5370 -1.0431 -0.9416 1.3065 1.4331
Coefficients:
Estimate Std. Error z value Pr(>|z|)
as.factor(2281517)1 -0.5836 0.2795 -2.088 0.0368 *
as.factor(2281517)2 -1.6965 0.6751 -2.513 0.0120 *
as.factor(2281517)3 -0.6065 0.8325 -0.728 0.4663
as.factor(7161521)2 1.3981 0.6662 2.099 0.0358 *
as.factor(7161521)3 0.2819 0.8325 0.339 0.7349
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
(Dispersion parameter for binomial family taken to be 1)
Null deviance: 691.76 on 499 degrees of freedom
Residual deviance: 671.20 on 494 degrees of freedom
AIC: 681.2
Number of Fisher Scoring iterations: 4
*******************************************************************************
As shown above, my second model returns no intercept, but look at this:
Model 1:
(Intercept) -0.58363 0.27948 -2.088 0.0368 *
Model 2:
as.factor(2281517)1 -0.5836 0.2795 -2.088 0.0368 *
They are exactly the same. Could you please tell me what happened?
Actually it does not make sense to estimate the model without an
intercept unless you assume that it is exactly zero for the first
levels of your factors. Dropping the intercept does not remove that
parameter; it is simply absorbed into the coefficient for the first
level of the first factor, which is why the two estimates are
identical (see the sketch below). Think about the contrasts you are
interested in. It looks like the default is not what you want?
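A small simulated illustration (generic data, not yours) of why the
two fits are the same model in a different parametrization:

## removing the intercept with factor predictors only changes the
## coding: the first level of the first factor absorbs the intercept
set.seed(1)
y  <- rbinom(100, 1, 0.5)
f1 <- factor(sample(1:3, 100, replace = TRUE))
f2 <- factor(sample(1:3, 100, replace = TRUE))
m1 <- glm(y ~ f1 + f2,     family = binomial())
m2 <- glm(y ~ f1 + f2 - 1, family = binomial())
coef(m1)                      # (Intercept), f12, f13, f22, f23
coef(m2)                      # f11, f12, f13, f22, f23: f11 equals m1's intercept
deviance(m1) - deviance(m2)   # essentially 0: same fitted model, different coding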
Uwe Ligges
Thank you in advance
--
David Winsemius, MD
West Hartford, CT
______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.