Balanced estimation for high-dimensional measurement error models
Publication:1663093
DOI: 10.1016/j.csda.2018.04.009 | zbMath: 1469.62183 | OpenAlex: W2799951999 | MaRDI QID: Q1663093
Yang Li, Chongxiu Yu, Zemin Zheng, Gao Rong Li
Publication date: 21 August 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2018.04.009
Keywords: model selection; high dimensionality; measurement errors; balanced estimation; combined \(L_1\) and concave regularization; nearest positive semi-definite projection
MSC classifications:
- Computational methods for problems pertaining to statistics (62-08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Sampling theory, sample surveys (62D05)
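One of the keywords above is a nearest positive semi-definite projection, a step commonly used in error-in-variables regression (e.g., CoCoLasso, cited below) to repair a surrogate Gram matrix that measurement error can render indefinite before applying regularized least squares. The following is a minimal, hedged sketch of that generic idea, not the paper's exact procedure; the function name `nearest_psd` and the toy data are illustrative assumptions. For a symmetric matrix, clipping negative eigenvalues at zero yields the Frobenius-norm nearest positive semi-definite matrix.

```python
import numpy as np

def nearest_psd(S):
    """Frobenius-norm projection of a symmetric matrix onto the PSD cone:
    eigendecompose and clip negative eigenvalues at zero.
    (Illustrative sketch; not the paper's specific algorithm.)"""
    S_sym = (S + S.T) / 2.0                  # symmetrize to guard against round-off
    eigval, eigvec = np.linalg.eigh(S_sym)   # eigendecomposition of the symmetric part
    eigval_clipped = np.clip(eigval, 0.0, None)
    return eigvec @ np.diag(eigval_clipped) @ eigvec.T

# Toy example (hypothetical data): an error-corrupted surrogate Gram matrix
# need not be PSD; the projection restores PSD-ness before regularized fitting.
rng = np.random.default_rng(0)
Sigma_hat = rng.normal(size=(5, 5))
Sigma_hat = (Sigma_hat + Sigma_hat.T) / 2.0
Sigma_psd = nearest_psd(Sigma_hat)
print(np.min(np.linalg.eigvalsh(Sigma_psd)) >= -1e-10)  # True: no negative eigenvalues
```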
Related Items
- Correlation coefficient-based measure for checking symmetry or asymmetry of a continuous variable with additive distortion
- Screening Methods for Linear Errors-in-Variables Models in High Dimensions
- \(L_0\)-regularization for high-dimensional regression with corrupted data
- Scalable interpretable learning for multi-response error-in-variables regression
- Logarithmic calibration for nonparametric multiplicative distortion measurement errors models
Uses Software
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Variable selection in measurement error models
- Sparse recovery under matrix uncertainty
- CoCoLasso for high-dimensional error-in-variables regression
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- One-step sparse estimates in nonconcave penalized likelihood models
- Least angle regression. (With discussion)
- Network exploration via the adaptive LASSO and SCAD penalties
- Simultaneous analysis of Lasso and Dantzig selector
- On the adaptive elastic net with a diverging number of parameters
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Strong oracle optimality of folded concave penalized estimation
- Scaled sparse linear regression
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable Selection for Partially Linear Models With Measurement Errors
- Regularization and Variable Selection Via the Elastic Net
- Asymptotic properties for combined \(L_1\) and concave regularization
- High Dimensional Thresholded Regression and Shrinkage Effect
- Sequential profile Lasso for ultra-high-dimensional partially linear models
- A general theory of concave regularization for high-dimensional sparse estimation problems