Balanced estimation for high-dimensional measurement error models
DOI: 10.1016/j.csda.2018.04.009
zbMATH Open: 1469.62183
OpenAlex: W2799951999
Wikidata: Q129897828 (Scholia: Q129897828)
MaRDI QID: Q1663093
FDO: Q1663093
Gaorong Li, Yang Li, Chongxiu Yu, Zemin Zheng
Publication date: 21 August 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2018.04.009
Recommendations
- Variable selection in measurement error models
- Weighted \(\ell_1\)-penalized corrected quantile regression for high dimensional measurement error models
- CoCoLasso for high-dimensional error-in-variables regression
- Simulation-selection-extrapolation: estimation in high-dimensional errors-in-variables models
- Covariate Selection for Linear Errors-in-Variables Regression Models
Keywords: model selection; high dimensionality; measurement errors; balanced estimation; combined \(L_1\) and concave regularization; nearest positive semi-definite projection
Mathematics Subject Classification:
- Computational methods for problems pertaining to statistics (62-08)
- Linear regression; mixed models (62J05)
- Sampling theory, sample surveys (62D05)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
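The keywords point to corrected sparse regression under additive measurement error. Below is a minimal Python sketch of the nearest positive semi-definite projection idea referenced above: the error-corrected Gram matrix \(W^\top W/n - \Sigma_e\) may be indefinite, so it is projected onto the PSD cone before an \(L_1\)-penalized fit. This is only an illustration under the assumption of a known error covariance, not the paper's balanced estimator (which combines \(L_1\) and concave regularization); all function names and parameter values are hypothetical.

```python
import numpy as np

def nearest_psd(S):
    """Project a symmetric matrix onto the PSD cone by zeroing negative eigenvalues."""
    S = (S + S.T) / 2.0
    vals, vecs = np.linalg.eigh(S)
    return (vecs * np.clip(vals, 0.0, None)) @ vecs.T

def corrected_lasso(W, y, sigma_e, lam, n_iter=500):
    """Coordinate descent for an L1-penalized fit with an error-corrected,
    PSD-projected Gram matrix (illustrative sketch only)."""
    n, p = W.shape
    gram = W.T @ W / n - sigma_e      # corrected Gram; may be indefinite
    gram = nearest_psd(gram)          # nearest PSD projection
    rho = W.T @ y / n
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j, then soft-thresholding.
            r_j = rho[j] - gram[j] @ beta + gram[j, j] * beta[j]
            denom = max(gram[j, j], 1e-12)
            beta[j] = np.sign(r_j) * max(abs(r_j) - lam, 0.0) / denom
    return beta

# Toy example: additive measurement error with assumed known variance 0.3^2.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
W = X + 0.3 * rng.standard_normal((n, p))   # observed, error-contaminated design
beta_hat = corrected_lasso(W, y, sigma_e=0.09 * np.eye(p), lam=0.1)
print(np.round(beta_hat[:5], 2))
```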
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- One-step sparse estimates in nonconcave penalized likelihood models
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Variable Selection for Partially Linear Models With Measurement Errors
- Regularization and Variable Selection Via the Elastic Net
- Variable selection in measurement error models
- Sparse recovery under matrix uncertainty
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Network exploration via the adaptive LASSO and SCAD penalties
- Scaled sparse linear regression
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Asymptotic properties for combined L1 and concave regularization
- Title not available
- On the adaptive elastic net with a diverging number of parameters
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Title not available
- Strong oracle optimality of folded concave penalized estimation
- Title not available
- High Dimensional Thresholded Regression and Shrinkage Effect
- CoCoLasso for high-dimensional error-in-variables regression
- The constrained Dantzig selector with enhanced consistency
- Sequential profile Lasso for ultra-high-dimensional partially linear models
Cited In (7)
- Logarithmic calibration for nonparametric multiplicative distortion measurement errors models
- Screening Methods for Linear Errors-in-Variables Models in High Dimensions
- \(L_0\)-regularization for high-dimensional regression with corrupted data
- Estimation of mean response via the effective balancing score
- Scalable interpretable learning for multi-response error-in-variables regression
- Correlation coefficient-based measure for checking symmetry or asymmetry of a continuous variable with additive distortion
- Statistical identification with error balancing