You can do this using predict.gam(..., type = "lpmatrix"). Here's some code providing an example....
## example of smooths conditioned on factors from ?gam.models
library(mgcv)
dat <- gamSim(4)

## fit model...
b <- gam(y ~ fac + s(x2, by = fac) + s(x0), data = dat)
plot(b, pages = 1)

## now predict on x2 grid at 2 levels of `fac'...
x2 <- seq(0, 1, length.out = 100)
x3 <- c(x2, x2)
newd <- data.frame(x2 = x3, x0 = rep(.5, 200),
                   fac = c(rep("1", 100), rep("2", 100)))
Xp <- predict(b, newd, type = "lpmatrix")

## difference of the two halves of the prediction matrix, with the
## parametric terms and the other smooths zeroed out...
X <- Xp[1:100, ] - Xp[101:200, ]
X[, 1:3] <- 0
X[, 22:39] <- 0

## Now X %*% coef(b) gives the difference between the smooths for
## fac=="1" and fac=="2"...
md <- X %*% coef(b)

## the following evaluates diag(X %*% vcov(b) %*% t(X)), the variance of
## the difference just computed; its square root is the s.e. ...
se <- sqrt(rowSums((X %*% vcov(b)) * X))

## So getting upper and lower confidence limits is easy
## (could use qt(.975, b$df.residual) in place of 2 below)
ul <- md + 2 * se
ll <- md - 2 * se

## plot smooths and difference
par(mfrow = c(2, 2))
plot(b, select = 1)
plot(b, select = 2)
plot(x2, md, ylim = range(c(ul, ll)), type = "l")
lines(x2, ul, col = 2)
lines(x2, ll, col = 2)

On Thursday 05 August 2010 17:49, Mike Lawrence wrote:
> Hi folks,
>
> I originally tried R-SIG-Mixed-Models for this one
> (https://stat.ethz.ch/pipermail/r-sig-mixed-models/2010q3/004170.html),
> but I think that the final steps to a solution aren't mixed-model
> specific, so I thought I'd ask my final questions here.
>
> I used gamm4 to fit a generalized additive mixed model to data from an
> AxBxC design, where A is a random effect (human participants in an
> experiment), B is a 2-level factor predictor variable, and C is a
> continuous variable that is likely non-linear. I tell gamm4 to fit a
> smooth across C within each level of B independently, and I can use
> predict.gam(..., se.fit = TRUE) to obtain predictions from the fitted
> model as well as the standard error for each prediction. I'd like to
> visualize the BxC interaction to see if smoothing C within each level
> of B was really necessary, and if so, where it is (along the C
> dimension) that B affects the smooth.
> It's easy enough to obtain the predicted B1-B2 difference function,
> but I'm stuck on how to convey the uncertainty of this function (e.g.
> computing the confidence interval of the difference at each value of C).
>
> One thought is that predict.gam(..., se.fit = TRUE) returns SE values,
> so if I could find out the N on which these SE values are based, I
> could compute the difference CI as
>
>     sqrt(((SD_B1)^2 + (SD_B2)^2) / N) * qt(.975, df = N - 1)
>
> However, I can't seem to figure out what value of N was used to
> compute the SEs that predict.gam(..., se.fit = TRUE) produces. Can
> anyone point me to where I might find N?
>
> Further, is N - 1 the proper df for the call to qt()?
>
> Finally, with a smooth function and 95% confidence intervals computed
> at each of a large number of points, don't I run into a problem of an
> inflated Type I error rate? Or does the fact that each point is not
> independent of those next to it make this an inappropriate concern?
>
> Cheers,
>
> Mike

--
Simon Wood, Mathematical Sciences, University of Bath, Bath, BA2 7AY UK
+44 1225 386603  www.maths.bath.ac.uk/~sw283

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
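[Editor's note] One fragile step in the code above is the hard-coded zeroing of columns 1:3 and 22:39, which is specific to this model's basis dimensions and factor levels. A sketch of a more general alternative, assuming mgcv's usual convention that lpmatrix columns are named after the coefficients (e.g. "s(x2):fac1.1"):

```r
## Sketch: rebuild the difference matrix X by matching lpmatrix column
## names instead of hard-coding index ranges. Assumes the model b and
## prediction grid construction shown in the post above.
library(mgcv)
dat <- gamSim(4)
b <- gam(y ~ fac + s(x2, by = fac) + s(x0), data = dat)
x2 <- seq(0, 1, length.out = 100)
newd <- data.frame(x2 = c(x2, x2), x0 = rep(.5, 200),
                   fac = c(rep("1", 100), rep("2", 100)))
Xp <- predict(b, newd, type = "lpmatrix")

## keep only columns belonging to the two smooths being compared;
## everything else (intercept, factor dummies, s(x0)) is zeroed
keep <- grepl("s\\(x2\\):fac1|s\\(x2\\):fac2", colnames(Xp))
X <- Xp[1:100, ] - Xp[101:200, ]
X[, !keep] <- 0
```

This yields the same X as the explicit `X[,1:3] <- 0; X[,22:39] <- 0` above, but survives changes to the basis dimension k or to the number of factor levels.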