On Jul 12, 2014, at 4:25 AM, Kevin Kunzmann wrote:

> Hi,
>
> I am currently trying to build a regression model for calibration of HPLC
> outputs. I decided to use a multiplicative error model:
>
> Y_i = (a*X_i + b)*eps_i
>
> where the eps_i ~ iid N(0, s^2). Now I am having a hard time estimating my
> parameters ;) So the idea was to apply log() to both sides:
>
> Z_i = log(Y_i) = log(a*X_i + b) + log(eps_i)
>
> Now the additive errors are lognormally distributed and I could formulate
> this as a GLM
>
> Z_i = g^(-1)(a*X_i + b) + iota_i
>
> where the iota_i are lognormal and the link function is g(x) = exp(x), since
> g^(-1) = log. So wouldn't the corresponding call in R have to be something
> like:
>
> glm(z ~ x, data=data.frame(x=x, z=log(y)), family=lognormal(link='exp'))
>
> This, however, is not working (there is no lognormal family and no exp link
> function ^^). How do I estimate those parameters? This seems to be a pretty
> standard problem to me...
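If the goal is to fit the model exactly as written, one option would be least squares on the log scale with nls(): assuming the multiplicative error is positive with log(eps_i) normal, minimizing the squared residuals of log(y) around log(a*x + b) is the corresponding maximum-likelihood fit. The data and starting values below are simulated purely for illustration (true a = 2, b = 1). A simpler GLM route is suggested below.

# simulate calibration-like data with a multiplicative error whose log is normal
set.seed(1)
x <- seq(1, 10, length.out = 50)
y <- (2 * x + 1) * exp(rnorm(50, sd = 0.1))

# nonlinear least squares on the log scale: log(y) ~ log(a*x + b)
fit <- nls(log(y) ~ log(a * x + b), start = list(a = 1, b = 0.5))
summary(fit)   # estimates of a and b, with standard errors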
I would have imagined that this would do what was described in the beginning:

glm(y ~ x, data=data.frame(x, y), family="quasipoisson")
# family="poisson" would choke on non-integer y

The link is `log` by default for this family, and the model is multiplicative with Poisson-type errors. It's admittedly not exactly log-normal errors, but it should be sufficiently similar.

--
David Winsemius
Alameda, CA, USA
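For reference, a self-contained illustration of the quasipoisson call above; the data are simulated here purely for illustration. Note that with the log link the fitted mean is exp(b0 + b1*x), i.e. multiplicative on the response scale, rather than the linear a*X_i + b of the original model.

set.seed(42)
x <- runif(60, min = 1, max = 10)
# positive, non-integer response with a multiplicative disturbance
y <- exp(0.3 + 0.2 * x) * exp(rnorm(60, sd = 0.15))

# quasipoisson accepts the non-integer y; link = "log" is the default
fit <- glm(y ~ x, data = data.frame(x, y), family = quasipoisson(link = "log"))
summary(fit)   # coefficients are on the log scale of E[Y]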