Hard thresholding regression
Publication: 4629285
DOI: 10.1111/SJOS.12353 · zbMATH Open: 1417.62197 · OpenAlex: W2892740843 · Wikidata: Q100725670 · Scholia: Q100725670 · MaRDI QID: Q4629285 · FDO: Q4629285
Authors: Qiang Sun, Bai Jiang, Joseph G. Ibrahim, Hongtu Zhu
Publication date: 21 March 2019
Published in: Scandinavian Journal of Statistics
Full work available at URL: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7558802
Recommendations
- High Dimensional Thresholded Regression and Shrinkage Effect
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- Hard thresholding regularised logistic regression: theory and algorithms
- Distributional results for thresholding estimators in high-dimensional Gaussian regression models
- Adaptive Lasso for sparse high-dimensional regression models
Mathematics Subject Classification: Nonparametric estimation (62G05) · Linear regression; mixed models (62J05) · Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Covariance-regularized regression and classification for high dimensional problems
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Ideal spatial adaptation by wavelet shrinkage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- One-step sparse estimates in nonconcave penalized likelihood models
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Analysis of multi-stage convex relaxation for sparse regularization
- Restricted eigenvalue properties for correlated Gaussian designs
- Title not available
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Adaptive Lasso for sparse high-dimensional regression models
- On the distribution of penalized maximum likelihood estimators: the LASSO, SCAD, and thresholding
- Estimation and selection via absolute penalized convex minimization and its multistage adaptive applications
- Smoothly clipped absolute deviation on high dimensions
- Exploration, normalization, and summaries of high density oligonucleotide array probe level data
- Calibrating nonconvex penalized regression in ultra-high dimension
- Strong oracle optimality of folded concave penalized estimation
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- High Dimensional Thresholded Regression and Shrinkage Effect
- When do stepwise algorithms meet subset selection criteria?
- The constrained Dantzig selector with enhanced consistency
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
Cited In (11)
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- Robust High-Dimensional Regression with Coefficient Thresholding and Its Application to Imaging Data Analysis
- Canonical thresholding for nonsparse high-dimensional linear regression
- Hard thresholding regularised logistic regression: theory and algorithms
- On oracle inequalities related to data-driven hard thresholding
- Thresholding least-squares inference in high-dimensional regression models
- Distributional results for thresholding estimators in high-dimensional Gaussian regression models
- Thresholding-based iterative selection procedures for model selection and shrinkage
- Title not available
- Variable Selection With Second-Generation P-Values
- Truncated \(L_1\) regularized linear regression: theory and algorithm