Dear list, sorry for asking a more statistical question. I have been reading about penalized regression estimates and bootstrapping regression, trying to include both or either in my analysis. But it is not clear to me (mainly due to my non-statistics background) whether they aim to do similar things in regression, i.e. to obtain robust estimates, or whether they serve completely different goals. After reading about and using the "logistf" and "penalized" packages for penalization, and the "boot" and "bootstrap" packages plus John Fox's appendix on bootstrapping, my impression is that penalization aims at the coefficients themselves, whereas bootstrapping aims at the standard errors and confidence intervals.
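
To make what I mean concrete, here is a minimal sketch of how I currently read the two approaches (simulated data, assuming the logistf and boot packages are installed; the variable names are just made up for illustration):

library(logistf)   # Firth-penalized logistic regression
library(boot)      # nonparametric bootstrap

set.seed(1)
d <- data.frame(x = rnorm(50))
d$y <- rbinom(50, 1, plogis(0.5 * d$x))

# Penalization changes how the coefficients themselves are estimated
# (Firth's bias-reduced likelihood), e.g. to cope with separation.
fit_pen <- logistf(y ~ x, data = d)
coef(fit_pen)

# Bootstrapping leaves the estimator alone and resamples the data to get
# standard errors / confidence intervals for that estimator.
coef_fun <- function(data, idx) coef(glm(y ~ x, data = data[idx, ], family = binomial))
b <- boot(d, coef_fun, R = 999)
boot.ci(b, type = "perc", index = 2)   # percentile CI for the slope
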
Can anyone tell me whether I am far off in my understanding of the two concepts? Thanks in advance,
David