The Lasso problem and uniqueness

Publication:1951165


DOI: 10.1214/13-EJS815 · zbMath: 1337.62173 · arXiv: 1206.0313 · MaRDI QID: Q1951165

Ryan J. Tibshirani

Publication date: 29 May 2013

Published in: Electronic Journal of Statistics

Full work available at URL: https://arxiv.org/abs/1206.0313


62J07: Ridge regression; shrinkage estimators (Lasso)

62J05: Linear regression; mixed models
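For context, the optimization problem referred to in the title is the standard Lasso; a minimal statement in commonly used notation (assumed here, not taken from this page) is

\[
\hat\beta \in \operatorname*{argmin}_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2}\lVert y - X\beta\rVert_2^2 + \lambda \lVert\beta\rVert_1,
\]

where \(y \in \mathbb{R}^n\) is the response, \(X \in \mathbb{R}^{n \times p}\) is the predictor matrix, and \(\lambda \ge 0\) is the tuning parameter. The publication concerns conditions under which this minimizer is unique; it need not be unique when \(X\) does not have full column rank.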


Related Items

Something Borrowed, Something New: Precise Prediction of Outcomes from Diverse Genomic Profiles
Data-Driven Discovery of Closure Models
Sparsest representations and approximations of an underdetermined linear system
Goodness-of-Fit Tests for High Dimensional Linear Models
The homotopy method revisited: Computing solution paths of $\ell_1$-regularized problems
On Computationally Tractable Selection of Experiments in Measurement-Constrained Regression Models
Gap Safe screening rules for sparsity enforcing penalties
Monte Carlo Simulation for Lasso-Type Problems by Estimator Augmentation
Unnamed Item
A study on tuning parameter selection for the high-dimensional lasso
Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
On the Length of Post-Model-Selection Confidence Intervals Conditional on Polyhedral Constraints
Unnamed Item
On the uniqueness of solutions for the basis pursuit in the continuum
An Alternating Method for Cardinality-Constrained Optimization: A Computational Study for the Best Subset Selection and Sparse Portfolio Problems
A priori sparsification of Galerkin models
Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
In defense of LASSO
Post-selection inference of generalized linear models based on the lasso and the elastic net
Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
The \(l_1\)-based sparsification of energy interactions in unsteady lid-driven cavity flow
TV-based reconstruction of periodic functions
On the Probabilistic Cauchy Theory for Nonlinear Dispersive PDEs
Sparse low-rank separated representation models for learning from data
CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration
Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
Consistent parameter estimation for Lasso and approximate message passing
Discussion: ``A significance test for the lasso''
Discussion: ``A significance test for the lasso''
Discussion: ``A significance test for the lasso''
Discussion: ``A significance test for the lasso''
Discussion: ``A significance test for the lasso''
Unnamed Item
Sparse Identification and Estimation of Large-Scale Vector AutoRegressive Moving Averages
A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
Estimation of the Spatial Weighting Matrix for Spatiotemporal Data under the Presence of Structural Breaks
Controlling False Discovery Rate Using Gaussian Mirrors
Debiasing convex regularized estimators and interval estimation in linear models
A convex-nonconvex strategy for grouped variable selection
LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
The Geometry of Sparse Analysis Regularization
Variable selection and regularization via arbitrary rectangle-range generalized elastic net
On sparsity-inducing methods in system identification and state estimation
Estimation of high-dimensional graphical models using regularized score matching
Analysis of a nonsmooth optimization approach to robust estimation
Inference in adaptive regression via the Kac-Rice formula
Exact post-selection inference, with application to the Lasso
The use of vector bootstrapping to improve variable selection precision in Lasso models
The geometry of least squares in the 21st century
Two-sided space-time \(L^1\) polynomial approximation of hypographs within polynomial optimal control
On cross-validated Lasso in high dimensions
Hybrid safe-strong rules for efficient optimization in Lasso-type problems
Model selection consistency of Lasso for empirical data
Degrees of freedom for piecewise Lipschitz estimators
Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
A forward and backward stagewise algorithm for nonconvex loss functions with adaptive Lasso
LARS-type algorithm for group Lasso
Maximal solutions of sparse analysis regularization
Efficient Bayesian regularization for graphical model selection
The generalized Lasso problem and uniqueness
Machine learning subsurface flow equations from data
Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
Second-order Stein: SURE for SURE and other applications in high-dimensional inference
Free disposal hull condition to verify when efficiency coincides with weak efficiency
Sparsest piecewise-linear regression of one-dimensional data
In defense of the indefensible: a very naïve approach to high-dimensional inference
Feature selection for data integration with mixed multiview data
LASSO for streaming data with adaptative filtering
The inverse problem for conducting defective lattices
Random weighting in LASSO regression
Adaptive multi-penalty regularization based on a generalized Lasso path
On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
Primal path algorithm for compositional data analysis
An inexact proximal generalized alternating direction method of multipliers
Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
Oscillation of Metropolis-Hastings and simulated annealing algorithms around LASSO estimator
A significance test for the lasso
Discussion: ``A significance test for the lasso''
Rejoinder: ``A significance test for the lasso''
Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
Robust elastic net estimators for variable selection and identification of proteomic biomarkers
Numerical analysis for conservation laws using \(l_1\) minimization
Safe feature elimination for non-negativity constrained convex optimization
A partially inexact proximal alternating direction method of multipliers and its iteration-complexity analysis
On Lasso refitting strategies
Risk bound of transfer learning using parametric feature mapping and its application to sparse coding
One condition for solution uniqueness and robustness of both \(\ell_1\)-synthesis and \(\ell_1\)-analysis minimizations
Leave-one-out cross-validation is risk consistent for Lasso
Iteration-complexity analysis of a generalized alternating direction method of multipliers
Quadratic growth conditions and uniqueness of optimal solution to Lasso
Convergence rates of the heavy-ball method under the Łojasiewicz property
When Ramanujan meets time-frequency analysis in complicated time series analysis
An Introduction to Compressed Sensing


Uses Software


Cites Work