Semiparametric partial common principal component analysis for covariance matrices
Full text links: NSTL, Wiley
Abstract: We consider the problem of jointly modeling multiple covariance matrices by partial common principal component analysis (PCPCA), which assumes a proportion of eigenvectors to be shared across covariance matrices and the rest to be individual‐specific. This paper proposes consistent estimators of the shared eigenvectors in the PCPCA as the number of matrices or the number of samples used to estimate each matrix goes to infinity. We prove such asymptotic results without making any assumptions on the ranks of eigenvalues that are associated with the shared eigenvectors. When the number of samples goes to infinity, our results do not require the data to be Gaussian distributed. Furthermore, this paper introduces a sequential testing procedure to identify the number of shared eigenvectors in the PCPCA. In simulation studies, our method shows higher accuracy in estimating the shared eigenvectors than competing methods. Applied to a motor‐task functional magnetic resonance imaging data set, our estimator identifies meaningful brain networks that are consistent with current scientific understandings of motor networks during a motor‐task paradigm.
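For concreteness, the partial common principal component assumption described above can be written as follows (a minimal sketch of the usual PCPC formulation; the symbols Σ_k, Γ, Γ_k, Λ_k and the number of shared eigenvectors q are illustrative notation, not taken from the paper):

\[
\Sigma_k \;=\;
\begin{pmatrix}\Gamma & \Gamma_k\end{pmatrix}
\begin{pmatrix}\Lambda_k^{(1)} & 0 \\ 0 & \Lambda_k^{(2)}\end{pmatrix}
\begin{pmatrix}\Gamma & \Gamma_k\end{pmatrix}^{\top},
\qquad k = 1,\dots,K,
\]

where \(\Gamma \in \mathbb{R}^{p \times q}\) collects the q eigenvectors shared by all K covariance matrices, \(\Gamma_k\) collects the individual‐specific eigenvectors of the k‑th matrix, and \(\Lambda_k^{(1)}, \Lambda_k^{(2)}\) are diagonal matrices of the corresponding eigenvalues. Under this notation, the sequential testing procedure mentioned in the abstract is aimed at selecting q.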
Keywords: consistency; partial common principal components; semiparametric; sequential testing
Xi Luo, Bingkai Wang, Yi Zhao, Brian Caffo
The University of Texas Health Science Center at Houston, School of Public Health