Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
Publication: 391843
DOI: 10.1214/13-EJS868
zbMath: 1280.62086
arXiv: 1205.0953
MaRDI QID: Q391843
Publication date: 13 January 2014
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1205.0953
Keywords: random matrices, persistence, deconvolution, convex geometry, separating hyperplanes, non-negativity constraints
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Estimation in multivariate analysis (62H12)
- Linear regression; mixed models (62J05)
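The record concerns sparse recovery with plain non-negative least squares, i.e. minimizing \(\|y - X\beta\|_2^2\) over \(\beta \ge 0\) with no penalty term. A minimal illustrative sketch using scipy.optimize.nnls follows; the dimensions, design, and noise level are assumptions for the demo and are not taken from the paper.

```python
# Minimal sketch: sparse recovery with plain non-negative least squares (NNLS),
# i.e. min_{beta >= 0} ||y - X beta||_2^2 with no regularization term.
# Dimensions and data below are illustrative only, not taken from the paper.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n, p, s = 100, 300, 5                            # samples, features (p > n), sparsity

X = rng.standard_normal((n, p))                  # random Gaussian design
beta_true = np.zeros(p)
beta_true[:s] = rng.uniform(1.0, 3.0, size=s)    # non-negative sparse signal
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_hat, residual_norm = nnls(X, y)             # solve the NNLS program

# Under suitable design conditions the NNLS solution can itself be sparse;
# here we simply report how many coefficients are numerically non-zero.
support_hat = np.flatnonzero(beta_hat > 1e-8)
print("estimated support size:", support_hat.size)
print("true support contained in estimate:", set(range(s)) <= set(support_hat.tolist()))
```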
Related Items
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- Two-level structural sparsity regularization for identifying lattices and defects in noisy images
- A component lasso
- Nonparametric estimation of the random coefficients model: an elastic net approach
- Iteratively reweighted adaptive Lasso for conditional heteroscedastic time series with applications to AR-ARCH type processes
- Nonnegative self-representation with a fixed rank constraint for subspace clustering
- Nonnegative estimation and variable selection via adaptive elastic-net for high-dimensional data
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Sparse Topic Modeling: Computational Efficiency, Near-Optimal Algorithms, and Statistical Inference
- Estimation of a high-dimensional counting process without penalty for high-frequency events
- Efficient sparse portfolios based on composite quantile regression for high-dimensional index tracking
- A successive difference-of-convex approximation method for a class of nonconvex nonsmooth optimization problems
- On multi-modal fusion learning in constraint propagation
- Sparse support recovery using correlation information in the presence of additive noise
- High-dimensional sparse portfolio selection with nonnegative constraint
- Sign-constrained least squares estimation for high-dimensional regression
- Algebraic cubature on polygonal elements with a circular edge
- Non-negatively constrained least squares and parameter choice by the residual periodogram for the inversion of electrochemical impedance spectroscopy data
- Globally Convergent Type-I Anderson Acceleration for Nonsmooth Fixed-Point Iterations
- Nonparametric shape-restricted regression
- Reconstruction Methods in THz Single-Pixel Imaging
- The geometry of hypothesis testing over convex cones: generalized likelihood ratio tests and minimax radii
- Bayesian inference for generalized linear model with linear inequality constraints
- Sparse solution of nonnegative least squares problems with applications in the construction of probabilistic Boolean networks
- High-dimensional sign-constrained feature selection and grouping
- Compressed algebraic cubature over polygons with applications to optical design
- Solving nonnegative sparsity-constrained optimization via DC quadratic-piecewise-linear approximations
- Safe feature elimination for non-negativity constrained convex optimization
- Hyperbolic Wavelet Frames and Multiresolution in the Weighted Bergman Spaces
- Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models
- Sparse Inverse Problems over Measures: Equivalence of the Conditional Gradient and Exchange Methods
- Provably optimal sparse solutions to overdetermined linear systems with non-negativity constraints in a least-squares sense by implicit enumeration
- Angular scattering function estimation using deep neural networks
Cites Work
- Graph Selection with GGMselect
- The Adaptive Lasso and Its Oracle Properties
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Statistics for high-dimensional data. Methods, theory and applications.
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Near-ideal model selection by \(\ell _{1}\) minimization
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- SPADES and mixture models
- Lasso-type recovery of sparse representations for high-dimensional data
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Parametric deconvolution of positive spike trains.
- Least angle regression. (With discussion)
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- On the conditions used to prove oracle results for the Lasso
- Sign-constrained least squares estimation for high-dimensional regression
- Counting the faces of randomly-projected hypercubes and orthants, with applications
- Banach-Mazur distances and projections on random subgaussian polytopes
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Smallest singular value of random matrices and geometry of random polytopes
- High-dimensional graphs and variable selection with the Lasso
- Reconstruction From Anisotropic Random Measurements
- How Correlations Influence Lasso Prediction
- Non-asymptotic theory of random matrices: extreme singular values
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Stochastic and Integral Geometry
- Greed is Good: Algorithmic Results for Sparse Approximation
- On the Uniqueness of Nonnegative Sparse Solutions to Underdetermined Systems of Equations
- A Problem in Geometric Probability.
- A Unique “Nonnegative” Solution to an Underdetermined System: From Vectors to Matrices
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Tackling Box-Constrained Optimization via a New Projected Quasi-Newton Approach
- Sparse nonnegative solution of underdetermined linear equations by linear programming
- Neighborliness of randomly projected simplices in high dimensions
- Covariance-Preconditioned Iterative Methods for Nonnegatively Constrained Astronomical Imaging
- Compressed sensing