Penalised robust estimators for sparse and high-dimensional linear models
From MaRDI portal
Publication:2664993
Recommendations
- Nonconcave penalized M-estimation with a diverging number of parameters
- M-estimation in high-dimensional linear model
- Robust and sparse estimators for linear regression models
- Robust sparse regression with high-breakdown value
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
Cites work
- scientific article; zbMATH DE number 3829050
- scientific article; zbMATH DE number 3905646
- scientific article; zbMATH DE number 3954047
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3320125
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Adaptive Lasso for sparse high-dimensional regression models
- Asymptotic behavior of general M-estimates for regression and scale with random carriers
- Asymptotics for Lasso-type estimators
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Consistency and robustness properties of the S-nonnegative garrote estimator
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Fully efficient robust estimation, outlier detection, and variable selection via penalized regression
- Hedonic housing prices and the demand for clean air
- High-dimensional generalized linear models and the lasso
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Introduction to robust and quasi-robust statistical methods
- Lasso-type recovery of sparse representations for high-dimensional data
- Least angle regression. (With discussion)
- Nearly unbiased variable selection under minimax concave penalty
- New robust variable selection methods for linear regression models
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Outlier detection using nonconvex penalized regression
- Penalized composite quasi-likelihood for ultrahigh dimensional variable selection
- Penalized likelihood regression for generalized linear models with non-quadratic penalties
- Penalized partially linear models using sparse representations with an application to fMRI time series
- Pushing the Limits of Contemporary Statistics: Contributions in Honor of Jayanta K. Ghosh
- Regularization and Variable Selection Via the Elastic Net
- Regularization of Wavelet Approximations
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- Robust Estimation of a Location Parameter
- Robust Linear Model Selection Based on Least Angle Regression
- Robust Statistics
- Robust and consistent variable selection in high-dimensional generalized linear models
- Robust and sparse estimators for linear regression models
- Robust elastic net estimators for variable selection and identification of proteomic biomarkers
- Robust nonnegative garrote variable selection in linear regression
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Scaled sparse linear regression
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Sparsity oracle inequalities for the Lasso
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Techniques for nonlinear least squares and robust regression
- The Adaptive Lasso and Its Oracle Properties
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- The power of monitoring: how to make the most of a contaminated multivariate sample
- Unified LASSO Estimation by Least Squares Approximation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Wavelet estimation of partially linear models
- Wavelet methods in statistics: some recent developments and their applications
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- \(\ell_{1}\)-penalization for mixture regression models
Cited in (31)
- Penalized wavelet nonparametric univariate logistic regression for irregular spaced data
- Generalized thresholding estimators for high-dimensional location parameters
- Robust and sparse estimators for linear regression models
- Robust sparse regression with high-breakdown value
- Simultaneous feature selection and outlier detection with optimality guarantees
- Large-scale regression with non-convex loss and penalty
- A general family of trimmed estimators for robust high-dimensional data analysis
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- On robust regression with high-dimensional predictors
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- A tuning-free robust and efficient approach to high-dimensional regression
- Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator
- Influence functions for penalized M-estimators
- Nonconcave penalized M-estimation with a diverging number of parameters
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
- Robust statistical boosting with quantile-based adaptive loss functions
- M-estimation in high-dimensional linear model
- Penalized robust estimators in sparse logistic regression
- Approximate separability of symmetrically penalized least squares in high dimensions: characterization and consequences
- Wavelet-based robust estimation and variable selection in nonparametric additive models
- A method of least absolute deviation estimator with adaptive weighted penalty
- Bootstrap estimation of the proportion of outliers in robust regression
- Robustness in sparse high-dimensional linear models: relative efficiency and robust approximate message passing
- Robust subset selection
- Penalized wavelet estimation and robust denoising for irregular spaced data
- Robust elastic net estimators for variable selection and identification of proteomic biomarkers
- The influence function of penalized regression estimators
- M-estimation and model identification based on double SCAD penalization
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Sparse linear regression models of high dimensional covariates with non-Gaussian outliers and Berkson error-in-variable under heteroscedasticity