I think your imprecise use of statistical methods is getting you into trouble.
A literal interpretation of your question would lead to var(my.data$fluo), but
whether that number would be meaningful would depend on what you did with it (I
doubt much good would come from using it directly).
The difference is that survreg uses a maximum likelihood estimate (MLE) of the
variance while lm uses the unbiased (MVUE) estimate. For simple linear
regression, the former divides by n and the latter by n-p. The difference in
your variances is exactly n/(n-p) = 10/8.
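A minimal sketch of that ratio in base R (the data here are made up; survreg itself is not needed to see the arithmetic):

```r
# Illustration of n/(n - p): unbiased residual variance vs the MLE.
set.seed(1)
x <- 1:10
y <- 2 * x + rnorm(10)
n <- 10
p <- 2                                   # intercept + slope
fit <- lm(y ~ x)
s2_unbiased <- summary(fit)$sigma^2      # lm: divides by n - p
s2_mle <- s2_unbiased * (n - p) / n      # MLE (survreg-style): divides by n
s2_unbiased / s2_mle                     # exactly n/(n - p) = 10/8
```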
> I'll open a separate thread in that case.
>
> Thanks.
>
> ---
>
> Giorgio
>
> Genoa, Italy
>
> From: Tsjerk Wassenaar [mailto:tsje...@gmail.com]
> Sent: Sunday, May 10, 2015 22:31
> To: Giorgio Garziano
> Cc: r-help@r-project.org
> Subject: Re: [R] Variance-covariance matrix
Hi Giorgio,
This is for a multivariate time series: x1 is variable 1 of the observation
vector x, x2 is variable 2, and so on. If you need cov(x(i), x(i+1)), etc., then
you're looking for the autocovariance/autocorrelation matrix, which is a quite
different thing.
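A short base-R sketch of the distinction (the variable names x1/x2 are illustrative):

```r
# Cross-variable covariance vs the autocovariance of a single series.
set.seed(2)
X <- cbind(x1 = rnorm(100), x2 = rnorm(100))   # multivariate observations
var(X)                         # 2x2 matrix: covariances between the variables
# Autocovariance of one series: cov(x(i), x(i+k)) for lags k = 0, 1, 2
acf(X[, "x1"], lag.max = 2, type = "covariance", plot = FALSE)
```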
> Reference: “Time series and its applications – with R examples”,
> Springer,
>
> §7.8 “Principal Components”, pp. 468–469
>
> Cheers,
>
> Giorgio
>
> *From:* Tsjerk Wassenaar [mailto:tsje...@gmail.com]
> *Sent:* Sunday, May 10, 2015
Cc: r-help@r-project.org
Subject: Re: [R] Variance-covariance matrix
Hi Giorgio,
For a univariate time series? Seriously?
data <- rnorm(10,2,1)
as.matrix(var(data))
Cheers,
Tsjerk
On Sun, May 10, 2015 at 9:54 PM, Giorgio Garziano
<giorgio.garzi...@ericsson.com> wrote:
Hi,
Actually as...
<- (1/(n-1)) * data.center %*% t(data.center)
--
Giorgio Garziano
-Original Message-
From: David Winsemius [mailto:dwinsem...@comcast.net]
Sent: Sunday, May 10, 2015 21:27
To: Giorgio Garziano
Cc: r-help@r-project.org
Subject: Re: [R] Variance-covariance matrix
On May 10, 2015, at 4:27 AM, Giorgio Garziano wrote:
> Hi,
>
> I am looking for a R package providing with variance-covariance matrix
> computation of univariate time series.
>
> Please, any suggestions ?
If you mean the auto-correlation function, then the stats package (loaded by
default at startup) provides it via acf().
I suspect that this is the long-documented issue -- there is indeed an entire
industry -- and publications -- devoted to finding such errors in Excel. Until
the 2013 version, it used to be a favorite HW problem of mine. Basically, Excel
used the numerically unstable "short formula" to calculate the variance and
the sd.
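The instability is easy to reproduce in R; this is a sketch of the one-pass "short formula", not Excel's actual code:

```r
# "Short formula": var = (sum(x^2) - n * mean(x)^2) / (n - 1), one pass.
short_var <- function(x) {
  n <- length(x)
  (sum(x^2) - n * mean(x)^2) / (n - 1)
}
x <- c(1, 2, 3, 4)
short_var(x)                  # matches var(x) for well-scaled data
x_big <- x + 1e8              # same spread, huge mean
var(x_big)                    # two-pass: still 5/3
short_var(x_big)              # catastrophic cancellation can distort this
```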
[See at end]
On 09-Feb-2015 21:45:11 David L Carlson wrote:
Time for a new version of Excel? I cannot duplicate your results in Excel 2013.
R:
> apply(dat, 2, var)
[1] 21290.80 24748.75
Excel 2013:
=VAR.S(A2:A21) =VAR.S(B2:B21)
21290.8 24748.74737
-
David L Carlson
Department of Anthropology
Texas A&M University
On 04/11/14 16:13, PIKAL Petr wrote:
Hi
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-
project.org] On Behalf Of CJ Davies
Sent: Tuesday, November 04, 2014 2:50 PM
To: Jim Lemon; r-help@r-project.org
Subject: Re: [R] Variance of multiple non-contiguous time periods?
On Mon, 3 Nov 2014 12:45:03 PM CJ Davies wrote:
> ...
> On 30/10/14 21:33, Jim Lemon wrote:
> If I understand, you mean to calculate deviations for each individual
> 'chunk' of each transition & then aggregate the results? This is what
> I'd been thinking about, but is there a sensible manner within R?
On 30/10/14 21:33, Jim Lemon wrote:
> On Fri, 31 Oct 2014 07:19:01 AM Jim Lemon wrote:
On Wed, 29 Oct 2014 05:12:19 PM CJ Davies wrote:
> I am trying to show that the red line ('yaw') in the upper of the two
> plots here;
>
> http://i.imgur.com/N4Xxb4f.png
>
> varies more within the pink sections ('transition 1') than in the light
> blue sections ('real').
>
> I tried to use var.t
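One hedged sketch of the kind of comparison being asked about (the data and the column names here are simulated, not the poster's):

```r
# Compare the variance of 'yaw' between two section types.
set.seed(1)
d <- data.frame(
  yaw = c(rnorm(50, sd = 2), rnorm(50, sd = 1)),
  section = rep(c("transition1", "real"), each = 50)
)
tapply(d$yaw, d$section, var)      # raw per-section variances
var.test(yaw ~ section, data = d)  # F test of equal variances (assumes normality)
```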
You've stumbled across the answer to your question --
while lm() supports y~X formulas without a data=argument
and y~ X1+X2+X3 formulas with one, you can't depend on
all contributed functions to do the same.
As John pointed out, the advantage of car::vif over other
implementations is that it correctly handles terms with more than one
degree of freedom (factors, polynomials) via generalized VIFs.
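As a hedged base-R sketch of what an ordinary (non-generalized) VIF is, using the built-in swiss data rather than any contributed package:

```r
# VIF_j = 1 / (1 - R^2_j), where R^2_j regresses predictor j on the others.
vif_manual <- function(X) {
  sapply(seq_len(ncol(X)), function(j) {
    r2 <- summary(lm(X[, j] ~ X[, -j]))$r.squared
    1 / (1 - r2)
  })
}
X <- as.matrix(swiss[, c("Agriculture", "Examination", "Education")])
round(vif_manual(X), 2)   # one VIF per predictor, each >= 1
```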
Dear Martin,
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
> On Behalf Of Martin H. Schmidt
> Sent: Thursday, September 20, 2012 8:52 AM
> To: r-help@r-project.org
> Subject: [R] Variance Inflation Factor VIC() with a matrix
>
> Hi everyon
-Original Message-
From: R. Michael Weylandt [mailto:michael.weyla...@gmail.com]
Sent: Wednesday, July 11, 2012 4:04 PM
To: Hui Du
Cc: Jorge I Velez; R-help
Subject: Re: [R] Variance Inflation factor
You're rather out of date with your version of R -- if you want to use
the CRAN binaries p
n_US.UTF-8 LC_IDENTIFICATION=C
>
> attached base packages:
> [1] stats graphics grDevices utils datasets methods base
>
> HXD
>
> From: Jorge I Velez [mailto:jorgeivanve...@gmail.com]
> Sent: Wednesday, July 11, 2012 3:31 PM
> To: Hui Du
> Cc: R-help
> Subject: Re:
Sent: Wednesday, July 11, 2012 3:31 PM
To: Hui Du
Cc: R-help
Subject: Re: [R] Variance Inflation factor
Could you please include your sessionInfo() ?
Thank you,
Jorge.-
On Wed, Jul 11, 2012 at 6:27 PM, Hui Du
<hui...@dataventures.com> wrote:
Thanks. But in UNIX side, I got the same error
In getDependencies(pkgs, dependencies, available, lib) :
package 'car' is not available
HXD
From: Jorge I Velez [mailto:jorgeivanve...@gmail.com]
Sent: Wednesday, July 11, 2012 3:19 PM
To: Hui Du
Cc: R-help
Subject: Re: [R] Variance
See the examples at
# install.packages('car')
require(car)
?vif
HTH,
Jorge.-
On Wed, Jul 11, 2012 at 6:10 PM, Hui Du <> wrote:
> Hi All,
>
>
> I need to calculate VIF (variance inflation factor) for my linear
> regression model. I found there was a function named vif in 'HH' package.
> I have
On Fri, Jun 22, 2012 at 5:13 AM, Mohan Radhakrishnan wrote:
> Hi,
>
>
>
> Is there a way to calculate variance directly by specifying
> confidence interval using R ? I am specifically asking because I wanted
> to investigate how this could be useful for project schedule variance
> calculation
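No dedicated function is needed; if the interval is a normal-theory CI for a mean, it can be inverted by hand. A sketch (the numbers are invented):

```r
# Back out sd/variance from a 95% CI: ci = mean +/- 1.96 * sd/sqrt(n)
ci <- c(9.2, 10.8)
n  <- 25
sd_est  <- sqrt(n) * diff(ci) / (2 * qnorm(0.975))
var_est <- sd_est^2
# PERT-style schedule variance from optimistic/pessimistic durations:
a <- 4; b <- 10
pert_var <- ((b - a) / 6)^2   # classic ((b - a)/6)^2 rule
```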
On 22 Feb 2012, at 14:01, Terry Therneau wrote:
--- begin included message ---
I have a left truncated, right censored cox model:
coxph(Surv(start, stop, censor) ~ x + y, mydata)
I would like to know how much of the observed variance (as a number
between 0 and 1) is explained by each variable. How could I do that?
Adding terms sequentially an
Dear Prof. Wood,
I read your methods of extracting the variance explained by each
predictor in different places. My question is: using the method you
suggested, the sum of the deviance explained by all terms is not equal to
the deviance explained by the full model. Could you tell me what causes this?
Hi,
Searching on http://www.rseek.org for "variance ratio test" turns up the
vrtest package, as does searching for Lo and Mackinlay,
suggesting that's a good place to start.
Sarah
On Wed, Oct 5, 2011 at 2:48 PM, rauf ibrahim wrote:
> Hello,
> I am looking for a code in R for the variance ratio test.
Oh silly me--and I've been staring at that for a good hour. Thank you and
I'll keep your advice in mind.
On Thu, Apr 28, 2011 at 6:24 PM, Andrew Robinson
<a.robin...@ms.unimelb.edu.au> wrote:
> A couple of points here
>
> First, note that q doesn't increment in the code below. So, you're
>
A couple of points here
First, note that q doesn't increment in the code below. So, you're
getting the same variance each time.
Second, note that (t$Rec1==input3 & t$Rec2==input4) evaluates to FALSE/TRUE
(i.e. 0/1), and it's not clear from your code if that is what you intend.
Finally, it's much easi
Hi:
I didn't see anything on first blush from the mod1 or summary(mod1) objects,
but it's not too hard to compute:
> names(mod1)
 [1] "coefficients"      "icoef"             "var"
 [4] "var2"              "loglik"            "iter"
 [7] "linear.predictors" "frail"             "fvar"
[10] "df"
Fantastic! it's solved! Thank you very much Bill!
Barbara
--- On Wed, 7/28/10, bill.venab...@csiro.au wrote:
> From: bill.venab...@csiro.au
> Subject: RE: [R] Variance-covariance matrix from GLM
> To: bojuanz...@yahoo.com, r-help@r-project.org
> Date: Wednesday, July 28, 2010
?vcov ### now in the stats package
You would use
V <- vcov(my.glm)
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On
Behalf Of Bojuan Zhao
Sent: Thursday, 29 July 2010 9:52 AM
To: r-help@r-project.org
Subject: [R] Variance-covariance matrix from GLM
Thank you for response.
For question 2,
Since I need to know the expectation of Y for new observations, let's say
X*.
So I need to know the expectation and also the variance of log (Y|X*).
I know 'fitted(lin)' will give me the E[log(Y|X*)]. But I do not know how to
get var[log(Y|X*)], or say sd[log(Y|X*)].
Hi:
On Wed, Jul 21, 2010 at 2:29 PM, Yi wrote:
> Hi, folks,
>
> Here are the codes:
>
> ##
> y=1:10
> x=c(1:9,1)
> lin=lm(log(y)~x) ### log(y) is following Normal distribution
> x=5:14
> prediction=predict(lin,newdata=x) ##prediction=predict(lin)
> ###
>
predict() needs its newdata argument to be a data.frame whose column names
match the predictors.
Sorry, for the second question. I stated in a wrong way. My aim is the mean
and sd of each new observation.
#
mean=fitted(prediction)
##
But I do not know how to get sd for each new observation.
Any tips?
Thanks
Yi
On Wed, Jul 21, 2010 at 2:29 PM, Yi wrote:
> Hi, folks,
>
> Here a
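A sketch of one way to get per-observation uncertainty from lm (reusing the poster's toy model; the "sd of a new observation" line is my addition, not from the thread):

```r
y <- 1:10
x <- c(1:9, 1)
lin <- lm(log(y) ~ x)
nd <- data.frame(x = 5:14)                 # newdata must be a data.frame
p <- predict(lin, newdata = nd, se.fit = TRUE)
p$fit                                      # E[log(Y) | X*]
p$se.fit                                   # se of the fitted mean at each new x
# sd of a *new observation* adds the residual variance:
sd_new <- sqrt(p$se.fit^2 + p$residual.scale^2)
```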
On Mon, Mar 8, 2010 at 3:44 PM, casperyc wrote:
Hi Rolf Turner ,
God, it directed to the wrong page.
I first found the formula on the wiki, then tried to verify the answer in R;
now, given that 143/12 ((n^2-1)/12) is the correct answer for a discrete
uniform random variable,
I am still not sure what R is calculating there?
why it gives me 13?
On 9/03/2010, at 12:13 PM, casperyc wrote:
>
> Hi all,
>
> I am REALLY confused with the variance right now.
You need to learn the difference
(a) between the sample variance (an *estimate* of the population variance)
and the population variance,
and
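The confusion in this thread is exactly that distinction; a quick sketch:

```r
# Why var(1:12) is 13, not 143/12.
x <- 1:12
n <- length(x)
var(x)                 # sample variance: divides by n - 1, gives 13
mean((x - mean(x))^2)  # population variance: divides by n, gives 143/12
(n^2 - 1) / 12         # discrete-uniform formula: also 143/12
var(x) * (n - 1) / n   # converts one into the other
```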
amira akl wrote:
Hello
I am a new user of R. I have benefited from the vrtest package; however, the
functions it provides calculate, for example, the Lo and MacKinlay (1988) test
statistics under the assumptions of homoscedasticity and heteroscedasticity
without c
Simon,That produced exactly what I was looking for. Thanks so much for the
humble help.
KC
On Mon, Jul 13, 2009 at 9:10 AM, Simon Wood wrote:
> You can get some idea by doing something like the following, which compares
> the r^2 for models b and b2, i.e. with and without s(x2). It keeps the
It appears you are conflating beta coefficients (individual covariate
effect measures) with overall model fit measures. Beta coefficients
are not directly comparable to R-squared measures in ordinary least
squares analyses, so why would they be so in gam models?
I cannot tell whether you ac
You can get some idea by doing something like the following, which compares
the r^2 for models b and b2, i.e. with and without s(x2). It keeps the
smoothing parameters fixed for the comparison. (s(x,fx=TRUE) removes
penalization altogether btw, which is not what was wanted).
dat <- gamSim(1,n
Many thanks for the advice David. I would really like to figure out, though,
how to get the contribution of each factor to the Rsq - something like a
Beta coefficient for GAM. Ideas?
KC
On Sun, Jul 12, 2009 at 5:41 PM, David Winsemius wrote:
>
> On Jul 12, 2009, at 5:06 PM, Kayce Anderson wrote
On Jul 12, 2009, at 5:06 PM, Kayce Anderson wrote:
Hi,
I am using mgcv:gam and have developed a model with 5 smoothed predictors
and one factor.
gam1 <- gam(log.sp ~ s(Spr.precip, bs="ts") + s(Win.precip, bs="ts") +
    s(Spr.Tmin, bs="ts") + s(P.sum.Tmin, bs="ts") + s(Win.Tmax, bs="ts") +
    factor(site), data=dat3)
The total deviance explained = 70.4%.
Same issue is present in ccf():
cov() != ccf(lag.max=0, type="covariance").
Liviu
The answers differ by a factor of 19/20, ie, (n-1)/n, so it is presumably
the choice of denominator for the variance that differs.
-thomas
On Tue, 2 Jun 2009, Liviu Andronic wrote:
Dear all,
Does this make any sense:
var() = cov() != acf(lag.max=0, type="covariance")?
I have daily
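The (n-1)/n relationship is easy to check directly:

```r
# var()/cov() divide by n - 1; acf(type = "covariance") divides by n.
set.seed(42)
x <- rnorm(20)
n <- length(x)
v  <- var(x)
v0 <- acf(x, lag.max = 0, type = "covariance", plot = FALSE)$acf[1]
all.equal(v0, v * (n - 1) / n)   # TRUE
```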
(this post suggests a patch to the sources, so i allow myself to divert
it to r-devel)
Bert Gunter wrote:
> x a numeric vector, matrix or data frame.
> y NULL (default) or a vector, matrix or data frame with compatible
> dimensions to x. The default is equivalent to y = x (but more efficient).
Subject: Re: [R] variance/mean
Wacek Kusnierczyk wrote:
>
> when you apply var to a single matrix, it will compute covariances
> between its columns rather than the overall variance:
>
> set.seed(0)
> x = matrix(rnorm(4), 2, 2)
>
> var(x)
> #[,1] [,2]
> # [1,] 1.2629543 1.329799
>
rkevinbur...@charter.net wrote:
> At the risk of appearing ignorant why is the folowing true?
>
> o <- cbind(rep(1,3),rep(2,3),rep(3,3))
> var(o)
>      [,1] [,2] [,3]
> [1,]    0    0    0
> [2,]    0    0    0
> [3,]    0    0    0
>
> and
>
> mean(o)
> [1] 2
>
> How do I get mean to return an array?
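For the record, a sketch of what is happening and the usual fix:

```r
o <- cbind(rep(1, 3), rep(2, 3), rep(3, 3))
var(o)             # 3x3 covariance matrix of the columns: all zeros here,
                   # because each column is constant
mean(o)            # collapses the whole matrix to a single number: 2
colMeans(o)        # per-column means: 1 2 3
apply(o, 2, mean)  # equivalent
```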
> "HLS" == Han Lin Shang <[EMAIL PROTECTED]>
> on Sun, 26 Oct 2008 18:02:20 +1100 writes:
HLS> Hi Dear R-users: I am building a R package and would
HLS> like to create a generic variance function. Here is how
HLS> I did
HLS> var=function(x,...) { UseMethod("var") }
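A sketch of how such a generic can coexist with stats::var (the myts class and its method are hypothetical, purely for illustration):

```r
var <- function(x, ...) UseMethod("var")
var.default <- function(x, ...) stats::var(x, ...)        # keep stock behaviour
var.myts <- function(x, ...) stats::var(unclass(x), ...)  # hypothetical method
var(1:10)                                # dispatches to var.default
var(structure(rnorm(5), class = "myts")) # dispatches to var.myts
```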
Laura Bonnett wrote:
Here is the exact code I have written which does the standard vs nt1 and
standard vs nt2 and also gives me the hazard ratio for nt1 vs nt2.
with <- read.table("allwiths.txt",header=TRUE)
fix(arm)
function (data)
{
dummy <- rep(0,2437)
for(i in 1:2437){
if(data$Arm[i]=="CBZ")
Laura Bonnett wrote:
Hi all,
Sorry to ask again but I'm still not sure how to get the full
variance-covariance matrix. Peter suggested a three-level treatment
factor. However, I thought that the censoring variable could only take
values 0 or 1 so how do you programme such a factor.
Alternatively, is there another way?
The standard treatment is the same in both comparison.
How do you do a three-level treatment factor?
I thought you had to have a censoring indicator which took values 0 or 1 not
1, 2 or 3?
Thanks,
Laura
On Tue, Aug 26, 2008 at 11:05 AM, Peter Dalgaard
<[EMAIL PROTECTED]>wrote:
> Laura Bonnett
Laura Bonnett wrote:
> Dear R help forum,
>
> I am using the function 'coxph' to obtain hazard ratios for the comparison
> of a standard treatment to new treatments. This is easily obtained by
> fitting the relevant model and then calling exp(coef(fit1)) say.
>
> I now want to obtain the hazard ratio for nt1 vs nt2.
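A hedged sketch of the usual answer: fit a single three-level treatment factor, then use vcov() for the nt1-vs-nt2 contrast. The data and arm labels below are simulated, not the poster's:

```r
library(survival)   # recommended package, ships with R
set.seed(1)
d <- survival::lung
d$arm <- factor(sample(c("std", "nt1", "nt2"), nrow(d), replace = TRUE),
                levels = c("std", "nt1", "nt2"))
fit <- coxph(Surv(time, status) ~ arm, data = d)
exp(coef(fit))                      # HRs of nt1 and nt2 vs the standard arm
# HR and its se for nt1 vs nt2 via the variance-covariance matrix:
dd <- coef(fit)["armnt1"] - coef(fit)["armnt2"]
V  <- vcov(fit)
se <- sqrt(V["armnt1", "armnt1"] + V["armnt2", "armnt2"] -
           2 * V["armnt1", "armnt2"])
exp(dd)                             # hazard ratio nt1 vs nt2
```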
Dear John:
Weir, BS 1996 Genetic Data Analysis II . Sinaur Associates, Sunderland, MA,;
should get you started for methods in population genetics
(another reference could be the Arlequin manual:
http://cmpg.unibe.ch/software/arlequin3/)
However, you are p
mea culpa: I've not written an extractor for this, so you have to do
f <- nlrq(whatever)
g <- summary(f)
g$cov
Note that this is computed by resampling so it varies somewhat
depending on the seed.
Roger Koenker
url: www.econ.uiuc.edu/~roger
unfortunately, it is not showing probeID
Then you can try:
rownames(data.matrix) <- as.character(data.matrix$ProbeID)
data.matrix <- data.matrix[-1]
as.matrix(apply(data.matrix, 1, var))
or
out <- apply(data.matrix, 1, var)
data.frame(ProbeID = names(out), Variance = unname(out))
Works for me
On 02/03/2008, Keizer_71 <[EMAIL PROT
Hi Henrique,
It is definitely better, but it doesn't show me the ProbeID which identify
the probes name
Here was the result when i export to excel with your code.
"Variance"
1 2.425509867 21.6216446425273
any suggestions?
thanks,
Kei
Try this:
write.table(cbind(data.matrix[1], Variance = apply(data.matrix[,-1],
1, var)),file='file.xls')
On 02/03/2008, Keizer_71 <[EMAIL PROTECTED]> wrote:
sorry...in step 4-i need the R code to output in this format when i export to
excel.
ProbeID Variance
1 224588_at 21.58257457
thanks
Keizer_71 wrote:
>
> Hello,
>
> Thanks everyone for helping me with the previous queries.
>
> step 1: Here is the original data:
Dear Prof. Wood,
Just another quick question. I am doing model selection following Wood
and Augustin (2002). One of the criteria for retaining a term is to see
if removing it causes an increase in the GCV score. When doing this, do
I also need to fix the smooth parameters?
Thanks,
Julian B
Thanks again for your answer, prof. Wood.
And my apologies for the list for my repeated message from yesterday.
Still trying to figure out what happened with my email software.
Julian
Simon Wood wrote:
> I think that your approach is reasonable, except that you should use the same
> smoothing
I think that your approach is reasonable, except that you should use the same
smoothing parameters throughout. i.e the reduced models should use the same
smoothing parameters as the full model. Otherwise you get in trouble if x1
and x2 are correlated, since the smoothing parameters will then ten