Independently Interpretable Lasso for Generalized Linear Models
Publication: 5131139
DOI: 10.1162/neco_a_01279
zbMath: 1473.68171
OpenAlex: W3019502473
Wikidata: Q94445434 (Scholia: Q94445434)
MaRDI QID: Q5131139
Taiji Suzuki, Masaaki Takada, Hironori Fujisawa
Publication date: 2 November 2020
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_01279
Mathematics Subject Classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Learning and adaptive systems in artificial intelligence (68T05)
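
The record itself carries no description of the method, so the following is an illustrative sketch only. In the authors' earlier AISTATS 2018 paper introducing IILasso, the usual lasso penalty is augmented with a correlation-aware term, giving an objective of roughly the form \( \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda\bigl(\|\beta\|_1 + \tfrac{\alpha}{2}\,|\beta|^\top R\,|\beta|\bigr) \), where \( R_{jk} = |x_j^\top x_k|/n \) for \( j \neq k \) and \( R_{jj} = 0 \), so that strongly correlated features are discouraged from being selected together. Below is a minimal Python coordinate-descent sketch for the squared-error (Gaussian) case only; the function name iilasso and all parameter names are hypothetical and not taken from the publication.

    import numpy as np

    def soft_threshold(z, t):
        # S(z, t) = sign(z) * max(|z| - t, 0)
        return np.sign(z) * max(abs(z) - t, 0.0)

    def iilasso(X, y, lam=0.1, alpha=1.0, n_iter=200, tol=1e-8):
        # Hypothetical sketch of IILasso-style coordinate descent for the
        # squared-error case; not the authors' reference implementation.
        n, p = X.shape
        R = np.abs(X.T @ X) / n        # absolute empirical correlations
        np.fill_diagonal(R, 0.0)       # no self-penalty (R_jj = 0)
        col_sq = (X ** 2).sum(axis=0) / n
        beta = np.zeros(p)
        for _ in range(n_iter):
            beta_prev = beta.copy()
            for j in range(p):
                # partial residual with feature j removed
                r = y - X @ beta + X[:, j] * beta[j]
                rho = X[:, j] @ r / n
                # l1 weight inflated by correlation with active features
                lam_j = lam * (1.0 + alpha * (R[j] @ np.abs(beta)))
                beta[j] = soft_threshold(rho, lam_j) / col_sq[j]
            if np.max(np.abs(beta - beta_prev)) < tol:
                break
        return beta

    # Toy usage: two truly active, nearly orthogonal features.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 8))
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.standard_normal(200)
    print(iilasso(X, y, lam=0.05))

Because \( R_{jj} = 0 \), the penalty weight for coordinate j does not depend on \( \beta_j \) itself, so each coordinate subproblem reduces to an ordinary weighted soft-thresholding step.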
Related Items (1)
Uses Software
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Statistics for high-dimensional data. Methods, theory and applications.
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- SLOPE-adaptive variable selection via convex optimization
- Simultaneous analysis of Lasso and Dantzig selector
- Pathwise coordinate optimization
- High-dimensional graphs and variable selection with the Lasso
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Recovery of Exact Sparse Representations in the Presence of Bounded Noise
- Just relax: convex programming methods for identifying sparse signals in noise
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over \(\ell_q\)-Balls
- Regularization and Variable Selection Via the Elastic Net
- Convergence of a block coordinate descent method for nondifferentiable minimization