Robust High-Dimensional Regression with Coefficient Thresholding and Its Application to Imaging Data Analysis
From MaRDI portal
Publication: 6154026
Abstract: It is important to develop statistical techniques for analyzing high-dimensional data in the presence of both complex dependence and possible outliers, as arise in real-world applications such as imaging data analysis. We propose a new robust high-dimensional regression with coefficient thresholding, in which an efficient nonconvex estimation procedure is constructed from a thresholding function and the robust Huber loss. The proposed regularization method accounts for complex dependence structures in the predictors and is robust against outliers in the outcomes. Theoretically, we rigorously analyze the landscape of the population and empirical risk functions for the proposed method. The benign landscape enables us to establish both statistical consistency and computational convergence in the high-dimensional setting. The finite-sample properties of the proposed method are examined through extensive simulation studies. As a real-world illustration, we present a scalar-on-image regression analysis of the association between a psychiatric disorder, measured by the general factor of psychopathology, and features extracted from task functional magnetic resonance imaging data in the Adolescent Brain Cognitive Development study.
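The two ingredients named in the abstract, a robust Huber loss and a thresholding of small coefficients, can be illustrated with a minimal sketch. This is not the paper's exact estimator (the paper uses a smooth thresholding function inside a nonconvex objective); here we substitute a simpler iterative-hard-thresholding-style step, and all function names and parameter choices (`tau`, `delta`, the step size) are our own for illustration.

```python
import numpy as np

def huber_grad(r, delta=1.345):
    # Derivative of the Huber loss w.r.t. the residuals r:
    # identity for |r| <= delta, clipped (constant slope) beyond,
    # which bounds the influence of outlying responses.
    return np.clip(r, -delta, delta)

def robust_thresholded_regression(X, y, tau=0.2, delta=1.345,
                                  step=None, n_iter=500):
    """Illustrative sketch: gradient descent on the Huber loss,
    hard-thresholding coefficients below tau at each iteration."""
    n, p = X.shape
    if step is None:
        # 1 / (largest singular value)^2 gives a safe step size
        # for the quadratic regime of the Huber loss.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        # Descent direction: -grad of sum_i rho(y_i - x_i' beta).
        beta = beta + step * (X.T @ huber_grad(r, delta))
        # Coefficient thresholding: zero out small entries.
        beta[np.abs(beta) < tau] = 0.0
    return beta
```

On sparse data with a few grossly corrupted responses, the clipped gradient keeps the outliers from dragging the fit, while the thresholding step returns an exactly sparse coefficient vector.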
Cites work
- scientific article; zbMATH DE number 5957408 (title unavailable)
- scientific article; zbMATH DE number 845714 (title unavailable)
- A general theory of concave regularization for high-dimensional sparse estimation problems
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Asymptotic Equivalence of Regularization Methods in Thresholded Parameter Space
- Gradient methods for minimizing composite functions
- Hard thresholding regression
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
- Model Selection and Estimation in Regression with Grouped Variables
- Nearly unbiased variable selection under minimax concave penalty
- On robust regression with high-dimensional predictors
- One-step sparse estimates in nonconcave penalized likelihood models
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Phase retrieval via Wirtinger flow: theory and algorithms
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Risk bounds for model selection via penalization
- Robust Estimation of a Location Parameter
- Scalar-on-image regression via the soft-thresholded Gaussian process
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Spatial Bayesian variable selection and grouping for high-dimensional scalar-on-image regression
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Strong oracle optimality of folded concave penalized estimation
- Support recovery without incoherence: a case for nonconvex regularization
- The Adaptive Lasso and Its Oracle Properties
- The Vanishing Gradient Problem During Learning Recurrent Neural Nets and Problem Solutions
- The landscape of empirical risk for nonconvex losses
- The statistical analysis of fMRI data
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties