A lava attack on the recovery of sums of dense and sparse signals
From MaRDI portal
Publication: Q125382
DOI: 10.1214/16-AOS1434
zbMATH Open: 1422.62248
arXiv: 1502.03155
MaRDI QID: Q125382
Authors: Victor Chernozhukov, Christian Hansen, Yuan Liao
Publication date: 1 February 2017
Published in: The Annals of Statistics
Abstract: Common high-dimensional methods for prediction rely on having either a sparse signal model, a model in which most parameters are zero and there are a small number of non-zero parameters that are large in magnitude, or a dense signal model, a model with no large parameters and very many small non-zero parameters. We consider a generalization of these two basic models, termed here a "sparse+dense" model, in which the signal is given by the sum of a sparse signal and a dense signal. Such a structure poses problems for traditional sparse estimators, such as the lasso, and for traditional dense estimation methods, such as ridge estimation. We propose a new penalization-based method, called lava, which is computationally efficient. With suitable choices of penalty parameters, the proposed method strictly dominates both lasso and ridge. We derive analytic expressions for the finite-sample risk function of the lava estimator in the Gaussian sequence model. We also provide a deviation bound for the prediction risk in the Gaussian regression model with fixed design. In both cases, we provide Stein's unbiased estimator for lava's prediction risk. A simulation example compares the performance of lava to lasso, ridge, and elastic net in a regression example using feasible, data-dependent penalty parameters and illustrates lava's improved performance relative to these benchmarks.
Full work available at URL: https://arxiv.org/abs/1502.03155
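To illustrate the "sparse+dense" decomposition the abstract describes, here is a minimal sketch of the lava shrinkage rule in the Gaussian sequence model. It solves, coordinate-wise, min over (b, d) of (z - b - d)^2 + lam2*b^2 + lam1*|d|: profiling out the dense part b leaves a soft-thresholding problem for the sparse part d. The function name and the closed-form algebra are our own worked example, not code from the paper.

```python
import numpy as np

def lava_sequence(z, lam1, lam2):
    """Lava shrinkage in the Gaussian sequence model (illustrative sketch).

    For each coordinate z_i, solves
        min_{b, d} (z_i - b - d)^2 + lam2 * b^2 + lam1 * |d|.
    For fixed d, the ridge part is b = (z - d) / (1 + lam2); substituting
    back reduces the problem to k*(z - d)^2 + lam1*|d| with
    k = lam2 / (1 + lam2), whose solution is soft-thresholding of z.
    """
    z = np.asarray(z, dtype=float)
    k = lam2 / (1.0 + lam2)                 # quadratic weight after profiling out b
    thresh = lam1 / (2.0 * k)               # soft-threshold level for the sparse part
    d = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)  # sparse (lasso-like) part
    b = (z - d) / (1.0 + lam2)                             # dense (ridge-like) part
    return b + d
```

In the limits, the rule recovers the two classical estimators: as lam1 grows the sparse part vanishes and the result is the ridge shrinkage z / (1 + lam2); as lam2 grows the dense part vanishes and the result is the lasso soft-threshold of z at level lam1 / 2.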
Recommendations
Cited In (24)
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
- Recovery of partly sparse and dense signals
- Recovery of sums of sparse and dense signals by incorporating graphical structure among predictors
- An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
- Deconfounding and Causal Regularisation for Stability and External Validity
- Adaptive estimation in multivariate response regression with hidden variables
- High-dimensional latent panel quantile regression with an application to asset pricing
- Simultaneous spatial smoothing and outlier detection using penalized regression, with application to childhood obesity surveillance from electronic health records
- Factor Augmented Inverse Regression and its Application to Microbiome Data Analysis
- Inference in High-Dimensional Multivariate Response Regression with Hidden Variables
- Stretching the net: multidimensional regularization
- Lasso meets horseshoe: a survey
- Detecting Abrupt Changes in the Presence of Local Fluctuations and Autocorrelated Noise
- A Decorrelating and Debiasing Approach to Simultaneous Inference for High-Dimensional Confounded Models
- Lavash
- LavaCvxr
- Linear discriminant analysis with sparse and dense signals
- Title not available
- Significance testing in non-sparse high-dimensional linear models
- Doubly debiased Lasso: high-dimensional inference under hidden confounding
- Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
- Identifying Effects of Multiple Treatments in the Presence of Unmeasured Confounding
- A bootstrap Lasso+partial ridge method to construct confidence intervals for parameters in high-dimensional sparse linear models
- Estimation of graphical models through structured norm minimization
This page was built for publication: A lava attack on the recovery of sums of dense and sparse signals