Thank you for your comment.
I forgot to mention that varclus and pvclust showed similar results for
my data.
BTW, I did not realize rms is a replacement for the Design package.
I appreciate your suggestion.
--
KH
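For concreteness, a minimal sketch of the pvclust side of that comparison,
assuming the 15 predictors sit in a data frame pred with the variables as
columns (the data frame name and the settings below are only placeholders):

library(pvclust)

# pvclust clusters the columns of the data it is given, so variables stay as columns
pv <- pvclust(pred, method.dist = "correlation", method.hclust = "average",
              nboot = 1000)
plot(pv)                  # dendrogram with approximately unbiased (AU) p-values
pvrect(pv, alpha = 0.95)  # box the clusters with strong bootstrap support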
(11/04/21 8:00), Frank Harrell wrote:
I think it's OK. You can also use the Hmisc package's varclus function.
Frank
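A minimal sketch of the varclus suggestion (again assuming the predictors are
in a data frame pred; the Spearman similarity is just one common choice):

library(Hmisc)

vc <- varclus(as.matrix(pred), similarity = "spearman")  # clusters variables, not observations
plot(vc)  # dendrogram of variable clusters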
細田弘吉 wrote:
Dear Prof. Harrell,
Thank you very much for your quick advice.
I will try the rms package.
Regarding model reduction, is my model 2 method (clustering and recoding
that are blinded to the outcome) permissible?
Sincerely,
--
KH
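As a rough sketch of what "clustering and recoding that are blinded to the
outcome" could look like in code (the cluster membership and the variable
names x1, x2, x3 are hypothetical; the essential point is that the outcome
column is never touched):

# suppose the clustering grouped x1, x2 and x3 together (hypothetical names)
clus1  <- scale(pred[, c("x1", "x2", "x3")])
score1 <- prcomp(clus1)$x[, 1]   # first principal component as a single cluster score
# repeat per cluster; the outcome is only used afterwards, when the reduced model is fit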
(11/04/20 22:01), Frank Harrell wrote:
Deleting variables is a bad idea unless you make that a formal part of the
BMA so that the attempt to delete variables is penalized for. Instead of
BMA I recommend simple penalized maximum likelihood estimation (see the lrm
function in the rms package) or pre-modeling data reduction that is blinded
to the outcome.
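A minimal sketch of that penalized route with lrm and pentrace (the formula,
data frame name, and penalty grid are illustrative only):

library(rms)

f     <- lrm(outcome ~ x1 + x2 + x3, data = mydata, x = TRUE, y = TRUE)  # keep x, y for pentrace
p     <- pentrace(f, penalty = c(0, .5, 1, 2, 4, 8, 16))  # pick the penalty by corrected AIC
f.pen <- update(f, penalty = p$penalty)                   # refit with the selected penalty
f.pen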
細田弘吉 wrote:
Hi everybody,
I apologize in advance for the long mail.
I have data on 104 patients, consisting of 15 explanatory variables and one
binary outcome (poor/good). The outcome consists of 25 poor results and 79
good results. I tried to analyze the data with logistic regression.
However, the 15 variables ...
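For context, the events-per-variable arithmetic for the data described above:

events     <- 25     # poor outcomes, the limiting class
predictors <- 15     # candidate explanatory variables
events / predictors  # about 1.7, far below the usual guideline of 10-20 events per variable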