Diagonally Dominant Principal Component Analysis

Zheng Tracy Ke, Lingzhou Xue, Fan Yang

DOI: 10.6084/m9.figshare.11593977.v2
https://tandf.figshare.com/articles/dataset/Diagonally-Dominant_Principal_Component_Analysis/11593977

<p>We consider the problem of decomposing a large covariance matrix into the sum of a low-rank matrix and a diagonally dominant matrix, which we call diagonally dominant principal component analysis (DD-PCA). DD-PCA is an effective tool for designing statistical methods for strongly correlated data. We showcase its use in two statistical problems: covariance matrix estimation and global detection in multiple testing. Using the output of DD-PCA, we propose a new estimator of a large covariance matrix with factor structure. Thanks to a useful property of diagonally dominant matrices, this estimator simultaneously yields good estimates of the covariance matrix and of the precision matrix (by plain inversion). Plugging this estimator into linear discriminant analysis and portfolio optimization yields appealing performance on real data. We also propose two new tests of the global null hypothesis in multiple testing when the <i>z</i>-scores have a factor covariance structure. Both tests first use DD-PCA to adjust the individual <i>p</i>-values and then plug the adjusted <i>p</i>-values into the higher criticism (HC) test. These new tests significantly improve over the HC test and compare favorably with other existing tests. To compute DD-PCA, we propose an iterative projection algorithm and an ADMM algorithm. <a href="https://doi.org/10.1080/10618600.2020.1713798" target="_blank">Supplementary materials</a> for this article are available online.</p>

Posted: 2020-02-19 18:01:16

Keywords: Alternating projection; Approximate factor model; Covariance matrix estimation; Decorrelation; Higher criticism; POET
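The abstract describes splitting a covariance matrix Σ into a low-rank part plus a diagonally dominant part via iterative projection. The sketch below illustrates the general alternating-projection idea under simplifying assumptions: the low-rank step is a standard truncated eigendecomposition, while the diagonally dominant step uses a crude symmetric off-diagonal shrinkage as a stand-in for the exact Euclidean projection onto the set of diagonally dominant matrices. The function name `dd_pca_sketch`, the iteration count, and the shrinkage rule are all illustrative choices, not the authors' algorithm.

```python
import numpy as np

def dd_pca_sketch(sigma, r, n_iter=50):
    """Illustrative DD-PCA decomposition sigma ~ L + A by alternating
    projection: L is rank-r PSD, A is symmetric and diagonally dominant.
    The diagonally dominant step is a simple surrogate projection
    (row-wise off-diagonal shrinkage), not the exact one."""
    A = np.diag(np.diag(sigma))  # initialize A as the diagonal of sigma
    for _ in range(n_iter):
        # Low-rank step: project sigma - A onto rank-r PSD matrices
        # by keeping the r largest (nonnegative) eigenvalues.
        w, V = np.linalg.eigh(sigma - A)
        idx = np.argsort(w)[::-1][:r]
        L = (V[:, idx] * np.maximum(w[idx], 0.0)) @ V[:, idx].T
        # Diagonally dominant step (surrogate): symmetrize sigma - L,
        # then shrink off-diagonals so |A_ii| >= sum_{j != i} |A_ij|.
        M = (sigma - L + (sigma - L).T) / 2.0
        d = np.abs(np.diag(M))
        off = np.abs(M).sum(axis=1) - d
        scale = np.minimum(1.0, d / np.maximum(off, 1e-12))
        # Using min(scale_i, scale_j) for entry (i, j) preserves symmetry
        # while enforcing dominance in every row.
        A = M * np.minimum.outer(scale, scale)
        np.fill_diagonal(A, np.diag(M))
    return L, A
```

On a covariance with a genuine factor structure (e.g. Σ = BBᵀ + I with a tall, thin B), the returned A is diagonally dominant by construction and L has rank at most r, matching the decomposition the abstract targets; the paper's iterative projection and ADMM algorithms solve the exact problem rather than this surrogate.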