A general family of trimmed estimators for robust high-dimensional data analysis
DOI: 10.1214/18-EJS1470 · zbMath: 1496.62121 · arXiv: 1605.08299 · OpenAlex: W2750727485 · Wikidata: Q129057076 · MaRDI QID: Q1616324
Eunho Yang, Aurélie C. Lozano, Aleksandr Y. Aravkin
Publication date: 1 November 2018
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1605.08299
Mathematics Subject Classification:
- 62H12 Estimation in multivariate analysis
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62P10 Applications of statistics to biology and medical sciences; meta analysis
- 62F35 Robustness and adaptive procedures (parametric inference)
- 62H22 Probabilistic graphical models
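To illustrate the topic of the record: trimmed estimators downweight gross outliers by fitting only the samples with the smallest residuals. The sketch below is a generic sparse least trimmed squares procedure (alternating between selecting the h best-fitting samples and an ISTA Lasso refit on that subset), in the spirit of the cited "Sparse least trimmed squares regression" work; it is an assumed minimal example, not the general family of estimators or the algorithm proposed in the paper itself.

```python
import numpy as np

def trimmed_lasso(X, y, h, lam, n_outer=20, n_inner=200):
    """Sparse least trimmed squares sketch: alternate between
    (1) keeping the h samples with smallest absolute residuals and
    (2) solving an l1-penalized least squares on that subset via ISTA."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_outer):
        # Trimming step: keep the h best-fitting observations.
        residuals = y - X @ beta
        keep = np.argsort(np.abs(residuals))[:h]
        Xk, yk = X[keep], y[keep]
        # Lasso refit on the kept subset (ISTA with step 1/L).
        L = np.linalg.norm(Xk, 2) ** 2  # Lipschitz constant of the gradient
        for _ in range(n_inner):
            grad = Xk.T @ (Xk @ beta - yk)
            z = beta - grad / L
            beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return beta

# Hypothetical demo: sparse signal, 10% of responses grossly corrupted.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:2] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.standard_normal(n)
y[:10] += 20.0  # gross outliers
beta_hat = trimmed_lasso(X, y, h=85, lam=1.0)
```

With trimming at h = 85, the ten corrupted samples fall outside the kept subset after the first residual sort, so the refit recovers the two true nonzero coefficients while shrinking the rest toward zero; an untrimmed Lasso on the same data would be badly biased by the outliers.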
Related Items (9)
Uses Software
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- Sparse recovery under matrix uncertainty
- Robust graphical modeling of gene networks using classical and alternative \(t\)-distributions
- Introductory lectures on convex optimization. A basic course.
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- On the conditions used to prove oracle results for the Lasso
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Robust regression through the Huber's criterion and adaptive lasso penalty
- High-dimensional graphs and variable selection with the Lasso
- High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis
- Robust Lasso With Missing and Grossly Corrupted Observations
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Estimating nuisance parameters in inverse problems
- Least Median of Squares Regression
- Model selection and estimation in the Gaussian graphical model
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Separable nonlinear least squares: the variable projection method and its applications
- Robust Gaussian graphical modeling via \(\ell _{1}\) penalization
- Sharp thresholds for high-dimensional and noisy sparsity recovery using \(\ell _{1}\)-constrained quadratic programming (Lasso)
- Beitrag zur Theorie des Ferromagnetismus [Contribution to the theory of ferromagnetism]
- Stable signal recovery from incomplete and inaccurate measurements
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers