Optimal discriminant analysis in high-dimensional latent factor models

From MaRDI portal
Publication: Q6136589

DOI: 10.1214/23-AOS2289
arXiv: 2210.12862
MaRDI QID: Q6136589
FDO: Q6136589

Marten H. Wegkamp, Xin Bing

Publication date: 31 August 2023

Published in: The Annals of Statistics

Abstract: In high-dimensional classification problems, a commonly used approach is to first project the high-dimensional features into a lower-dimensional space and then base the classification on the resulting projections. In this paper, we formulate a latent-variable model with a hidden low-dimensional structure to justify this two-step procedure and to guide the choice of projection. We propose a computationally efficient classifier that takes certain principal components (PCs) of the observed features as projections, with the number of retained PCs selected in a data-driven way. A general theory is established for analyzing such two-step classifiers based on arbitrary projections. We derive explicit rates of convergence for the excess risk of the proposed PC-based classifier and show that these rates are optimal, up to logarithmic factors, in the minimax sense. Our theory allows the latent dimension to grow with the sample size and remains valid even when the feature dimension (greatly) exceeds the sample size. Extensive simulations corroborate the theoretical findings, and the proposed method performs favorably relative to existing discriminant methods on three real data examples.
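The two-step procedure described in the abstract (project onto leading PCs, then classify on the projections, with the number of retained PCs chosen from the data) can be illustrated with a short sketch. The code below is a generic PCA-plus-LDA pipeline whose number of retained PCs is chosen by cross-validation; this is an illustrative stand-in under those assumptions, not the authors' estimator or their specific data-driven selection rule.

```python
# Minimal sketch of a generic two-step PC-based classifier:
# (1) project features onto the leading principal components,
# (2) fit a linear discriminant on the projections.
# The cross-validated choice of the number of PCs is an illustrative
# stand-in, NOT the paper's data-driven selection rule.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline


def fit_pc_classifier(X, y, max_pcs=20, cv=5):
    """Fit PCA + LDA, selecting the number of retained PCs by CV."""
    best_score, best_model = -np.inf, None
    for k in range(1, min(max_pcs, X.shape[1]) + 1):
        model = make_pipeline(PCA(n_components=k),
                              LinearDiscriminantAnalysis())
        score = cross_val_score(model, X, y, cv=cv).mean()
        if score > best_score:
            best_score, best_model = score, model
    return best_model.fit(X, y)


# Toy usage: synthetic data with a hidden low-dimensional factor
# structure, where the feature dimension exceeds the sample size.
rng = np.random.default_rng(0)
n, p, r = 200, 500, 3                       # n samples, p features, r factors
Z = rng.normal(size=(n, r))                 # latent factors
A = rng.normal(size=(r, p))                 # factor loading matrix
y = (Z[:, 0] > 0).astype(int)               # labels driven by the factors
X = Z @ A + 0.5 * rng.normal(size=(n, p))   # observed high-dim features
clf = fit_pc_classifier(X, y)
print("training accuracy:", clf.score(X, y))
```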


Full work available at URL: https://arxiv.org/abs/2210.12862



