Join us for the Quantitative Psychology Colloquium with Yoshikazu Terada (Associate Professor, Division of Mathematical Science, Osaka University)
This event is online only. Please join us using this meeting link.
Title: Statistical properties of matrix decomposition factor analysis
Abstract: Numerous estimators have been proposed for factor analysis, and their statistical properties have been extensively studied. In the early 2000s, a novel matrix factorization-based approach, known as Matrix Decomposition Factor Analysis (MDFA), was introduced and has been actively developed in computational statistics. The MDFA estimator offers several advantages, including computational stability and its extensibility to sparse estimation. However, the MDFA estimator does not appear to be formulated as a classical M-estimator or a minimum discrepancy function (MDF) estimator, and its statistical properties have remained largely unexplored. Although the MDFA estimator minimizes a loss function resembling that of principal component analysis (PCA), it empirically behaves more like the consistent estimators used in factor analysis than like PCA itself. This raises a fundamental question: can matrix decomposition factor analysis truly be regarded as "factor analysis"? To address this question, we establish the consistency and asymptotic normality of the MDFA estimator. Recognizing that the MDFA estimator can be formulated as a semiparametric maximum likelihood estimator, we find, surprisingly, that the profile likelihood is given by the squared Bures-Wasserstein distance between the sample covariance matrix and the modeled covariance matrix. As a consequence, the MDFA estimator is ultimately an MDF estimator for factor analysis. Beyond MDFA, the same representation holds for a broad class of component analysis methods, including PCA, thereby offering a unified perspective on component analysis. Numerical experiments demonstrate that MDFA performs competitively with other established estimators, suggesting that it is a theoretically grounded and computationally appealing alternative for factor analysis.
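A note for context (the notation S and Σ below is ours, not the speaker's): the squared Bures-Wasserstein distance mentioned in the abstract is a standard discrepancy between positive semidefinite matrices. For the sample covariance matrix S and the modeled covariance matrix Σ it can be written as

d_{\mathrm{BW}}^2(S, \Sigma) = \operatorname{tr}(S) + \operatorname{tr}(\Sigma) - 2\,\operatorname{tr}\!\big( (S^{1/2}\, \Sigma\, S^{1/2})^{1/2} \big),

so the result described in the abstract can be read as saying that the MDFA estimator coincides with the MDF estimator obtained by minimizing this discrepancy over the factor-analytic covariance structure.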
About Yoshikazu Terada: Yoshikazu Terada is an Associate Professor at the University of Osaka. He received his Ph.D. in Science from the University of Osaka in 2014. His research covers a broad range of topics in statistics and machine learning, with a particular emphasis on the statistical theory of deep learning and unsupervised learning methods such as clustering and factor analysis. He has published in leading journals and conferences, including the Annals of Statistics, the Annals of Applied Statistics, Bernoulli, Psychometrika, and ICML. He also serves as an Associate Editor for the Annals of the Institute of Statistical Mathematics and the Japanese Journal of Statistics and Data Science, and is a Board Member of the Asian Regional Section (ARS) of the International Association for Statistical Computing (IASC).