Hello! I suppose this is more a matrix theory question than a question on R, but I will give it a try...
I am using La.svd to compute the singular value decomposition (SVD) of a variance matrix, i.e., a symmetric nonnegative definite square matrix. Let S be my variance matrix and S = U D V' its SVD. In my numerical experiments I have always obtained U = V. Is this necessarily the case, or might I eventually run into an SVD for which U != V? A small reproducible example of the kind of experiment I ran is included below my signature.

Thank you in advance for your insights and pointers.

Giovanni

--
Giovanni Petris <[EMAIL PROTECTED]>
Associate Professor
Department of Mathematical Sciences
University of Arkansas - Fayetteville, AR 72701
Ph: (479) 575-6324, 575-8630 (fax)
http://definetti.uark.edu/~gpetris/
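A minimal sketch of the experiment (the seed, the dimensions, and the use of
var() on random data are arbitrary choices for illustration; only base R is
assumed):

## Form a variance matrix from random data, take its SVD with La.svd(),
## and compare U with V.
set.seed(1)
X <- matrix(rnorm(200), nrow = 40)  # 40 observations of 5 variables
S <- var(X)                         # symmetric, nonnegative definite
sv <- La.svd(S)                     # returns components d, u, vt (vt is V')
all.equal(sv$u, t(sv$vt))           # TRUE (up to rounding) in every run I tried

The comparison uses all.equal() rather than identical() so that small
floating-point rounding differences are not reported as a mismatch.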