The weights in 'aa' are the inverse standard deviations. But you want to use the inverse variances as the weights:
aa <- (attributes(summary(f1)$modelStruct$varStruct)$weights)^2

And then the results are essentially identical.

Best,

--
Wolfgang Viechtbauer                         http://www.wvbauer.com/
Department of Methodology and Statistics     Tel: +31 (0)43 388-2277
School for Public Health and Primary Care    Office Location:
Maastricht University, P.O. Box 616          Room B2.01 (second floor)
6200 MD Maastricht, The Netherlands          Debyeplein 1 (Randwyck)

----Original Message----
From: [email protected] [mailto:[email protected]] On Behalf Of Stats Wolf
Sent: Thursday, June 24, 2010 15:00
To: Joris Meys
Cc: [email protected]
Subject: Re: [R] Question on WLS (gls vs lm)

> Indeed, they should give the same results, and hence I was worried to
> see that the results were not the same. Suffice it to look at the
> standard errors and p-values: they do differ, and the differences are
> not really that small.
>
> Thanks,
> Stats Wolf
>
>
> On Thu, Jun 24, 2010 at 2:39 PM, Joris Meys <[email protected]> wrote:
>> Indeed, WLS is a special case of GLS, where the error covariance
>> matrix is a diagonal matrix. OLS is a special case of GLS, where the
>> errors are considered homoscedastic and all weights are equal to 1.
>> And I now realize that varIdent() indeed makes a diagonal covariance
>> matrix, so the results should in fact be the same. Sorry for missing
>> that one.
>>
>> A closer inspection shows that the results don't differ too much. The
>> fitting method differs between the two functions: lm.wfit uses the QR
>> decomposition, whereas gls() uses restricted maximum likelihood. In
>> Asymptopia, they should give the same result.
>>
>> Cheers
>> Joris
>>
>> On Thu, Jun 24, 2010 at 12:54 PM, Stats Wolf <[email protected]> wrote:
>>> Thanks for the reply.
>>>
>>> Yes, they do differ, but doesn't gls() with the weights argument
>>> (correlation being unchanged) give the special case of GLS, as this
>>> sentence from the page you provided says: "The method leading to
>>> this result is called Generalized Least Squares estimation (GLS),
>>> of which WLS is just a special case"?
>>>
>>> Best,
>>> Stats Wolf
>>>
>>> On Thu, Jun 24, 2010 at 12:49 PM, Joris Meys <[email protected]> wrote:
>>>> Isn't that exactly what you would expect when using a _generalized_
>>>> least squares compared to a normal least squares? GLS is not the
>>>> same as WLS.
>>>>
>>>> http://www.aiaccess.net/English/Glossaries/GlosMod/e_gm_least_squares_generalized.htm
>>>>
>>>> Cheers
>>>> Joris
>>>>
>>>> On Thu, Jun 24, 2010 at 9:16 AM, Stats Wolf <[email protected]> wrote:
>>>>> Hi all,
>>>>>
>>>>> I understand that gls() uses generalized least squares, but I
>>>>> thought that maybe the optimal weights from gls might be used as
>>>>> weights in lm (as shown below), but apparently this is not the
>>>>> case. See:
>>>>>
>>>>> library(nlme)
>>>>> f1 <- gls(Petal.Width ~ Species / Petal.Length, data = iris,
>>>>>           weights = varIdent(form = ~ 1 | Species))
>>>>> aa <- attributes(summary(f1)$modelStruct$varStruct)$weights
>>>>> f2 <- lm(Petal.Width ~ Species / Petal.Length, data = iris,
>>>>>          weights = aa)
>>>>>
>>>>> summary(f1)$tTable; summary(f2)
>>>>>
>>>>> So, the two models with the very same weights do differ (in terms
>>>>> of standard errors). Could you please explain why? Are these
>>>>> different types of weights?
>>>>>
>>>>> Many thanks in advance,
>>>>> Stats Wolf

______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
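For completeness, here is a minimal, self-contained sketch of the fix described at the top of the thread, using only nlme and the built-in iris data. The weights stored in the varIdent() structure are inverse standard deviations, so they are squared before being passed to lm(); after that, the two coefficient tables should be essentially identical. The variable names w.sd and w.var are only illustrative.

library(nlme)

## GLS fit with a separate residual variance per Species
f1 <- gls(Petal.Width ~ Species / Petal.Length, data = iris,
          weights = varIdent(form = ~ 1 | Species))

## varIdent() stores inverse *standard deviations*; lm() expects
## weights proportional to inverse *variances*, hence the squaring
w.sd  <- attributes(summary(f1)$modelStruct$varStruct)$weights
w.var <- w.sd^2

f2 <- lm(Petal.Width ~ Species / Petal.Length, data = iris,
         weights = w.var)

## the coefficient tables should now agree (essentially identical
## estimates and standard errors)
summary(f1)$tTable
summary(f2)$coefficients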

