Hi R team,

I have the following problem.

I'd like to run a time series regression of the following form

Regression1:

A_t = α + β1 * B_t + β2 * B_{t-1} + β3 * [(B_{t-2} + B_{t-3} + B_{t-4}) / 3] + ε_t

The B's are the input values and the A's are the output values; the time
subscript indicates the lag (B_{t-1} is B lagged one period, and so on).
The quantity I am actually interested in is the combined coefficient
β_real = β1 + β2 + β3.
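A model of this form can be fit in base R without shifting vectors by hand, for example by building all lags at once with embed(). The sketch below uses simulated data; the variable names and the "true" coefficient values (0.5, 1.0, 0.7, 0.3) are made up purely for illustration:

```r
# Simulated example: regress A_t on B_t, B_{t-1}, and the mean of
# B_{t-2}..B_{t-4}, building the lags with embed() instead of by hand.
set.seed(1)
n <- 200
B <- rnorm(n)

L <- embed(B, 5)            # row t contains B_t, B_{t-1}, ..., B_{t-4}
Bt   <- L[, 1]
Bl1  <- L[, 2]
Bavg <- rowMeans(L[, 3:5])  # (B_{t-2} + B_{t-3} + B_{t-4}) / 3

# Made-up true model: alpha = 0.5, beta1 = 1.0, beta2 = 0.7, beta3 = 0.3
A <- 0.5 + 1.0 * Bt + 0.7 * Bl1 + 0.3 * Bavg + rnorm(length(Bt), sd = 0.1)

fit <- lm(A ~ Bt + Bl1 + Bavg)
summary(fit)
```

Note that embed() drops the first four observations (those with incomplete lag history), so all regressors are automatically aligned to the same rows. The dynlm package offers a formula-level alternative (lag terms written directly in the formula), if you prefer that style.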

First: how can I run the regression without manually lagging the B's?
Second: I need the standard error of β_real. How can I calculate it from
the output of lm()? (I read something about the delta method?)
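Because β_real = β1 + β2 + β3 is a linear combination of the coefficients, the delta method reduces to the exact formula se(β_real) = sqrt(w' V w), where V = vcov(fit) and w is the weight vector of the combination. A self-contained sketch (the simulated fit is restated so this snippet runs on its own; all names and numbers are illustrative assumptions):

```r
# Standard error of beta_real = beta1 + beta2 + beta3 via vcov(fit).
set.seed(1)
B <- rnorm(200)
L <- embed(B, 5)                       # B_t and its first four lags
Bt <- L[, 1]; Bl1 <- L[, 2]; Bavg <- rowMeans(L[, 3:5])
A <- 0.5 + 1.0 * Bt + 0.7 * Bl1 + 0.3 * Bavg + rnorm(length(Bt), sd = 0.1)
fit <- lm(A ~ Bt + Bl1 + Bavg)

w <- c(0, 1, 1, 1)                     # weight 0 on the intercept, 1 on each slope
beta_real <- sum(w * coef(fit))        # point estimate of beta1 + beta2 + beta3
se_real   <- sqrt(drop(t(w) %*% vcov(fit) %*% w))  # exact SE of the sum
c(estimate = beta_real, se = se_real)
```

msm::deltamethod (or car::deltaMethod) should return the same number here, since the delta method only adds anything beyond this formula when the function of the coefficients is nonlinear.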

Thank you a lot!
Kind regards





______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
