A Generalized Mean Approach for Distributed-PCA
Principal component analysis (PCA) is a widely used technique for dimension reduction. As datasets continue to grow in size, distributed PCA (DPCA) has become an active research area. A key challenge in DPCA lies in efficiently aggregating results across multiple machines or computing nodes, owing to the computational overhead involved. Fan et al. (2019) introduced a pioneering DPCA method that estimates the leading rank-r eigenspace by averaging local rank-r projection matrices. However, their method does not utilize eigenvalue information. In this article, we propose a novel DPCA method that incorporates eigenvalue information by aggregating local results via the matrix β-mean, which we call β-DPCA. The matrix β-mean offers a flexible and robust aggregation method through the adjustable choice of β: for β = 1, it corresponds to the arithmetic mean; for β = −1, the harmonic mean; and as β → 0, the geometric mean. Moreover, the matrix β-mean is shown to be associated with the matrix β-divergence, a subclass of the Bregman matrix divergence, which supports the robustness of β-DPCA. We also study the stability of the eigenvector ordering under eigenvalue perturbation for β-DPCA. The performance of our proposal is evaluated through numerical studies.
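The aggregation idea can be illustrated with a minimal sketch of the generic matrix power mean ((1/m) Σᵢ Aᵢ^β)^(1/β) for symmetric positive definite matrices, with the β → 0 limit taken as the log-Euclidean geometric mean. This is an assumption-laden stand-in, not the paper's β-DPCA aggregation rule: the function names are illustrative, and full-rank SPD inputs are used so that negative powers and logarithms are well defined.

```python
import numpy as np

def sym_power(A, p):
    # Fractional power of a symmetric positive definite matrix
    # via its eigendecomposition A = V diag(w) V^T.
    w, V = np.linalg.eigh(A)
    return (V * w**p) @ V.T

def sym_log(A):
    # Matrix logarithm of an SPD matrix (eigenvalues must be > 0).
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def sym_exp(A):
    # Matrix exponential of a symmetric matrix.
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

def matrix_beta_mean(mats, beta):
    # Matrix beta-mean: ((1/m) * sum_i A_i^beta)^(1/beta).
    # beta = 1 gives the arithmetic mean, beta = -1 the harmonic
    # mean; the beta -> 0 limit is taken here as the log-Euclidean
    # geometric mean exp((1/m) * sum_i log A_i).
    if beta == 0:
        return sym_exp(np.mean([sym_log(A) for A in mats], axis=0))
    S = np.mean([sym_power(A, beta) for A in mats], axis=0)
    return sym_power(S, 1.0 / beta)

# Demo on random SPD matrices standing in for local results.
rng = np.random.default_rng(0)
mats = [(lambda X: X @ X.T + 4 * np.eye(4))(rng.standard_normal((4, 4)))
        for _ in range(3)]
arith = matrix_beta_mean(mats, 1.0)   # arithmetic mean
harm = matrix_beta_mean(mats, -1.0)   # harmonic mean
geo = matrix_beta_mean(mats, 0.0)     # geometric mean (limit)
```

Note that rank-r projection matrices are singular, so β ≤ 0 would require restricting to the leading eigenspace or regularizing; the sketch sidesteps this by using full-rank inputs.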