An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
From MaRDI portal
Publication: 825333
DOI: 10.1007/s42081-020-00101-z
zbMath: 1477.62017
OpenAlex: W3120742959
Wikidata: Q114687074
Scholia: Q114687074
MaRDI QID: Q825333
Publication date: 17 December 2021
Published in: Japanese Journal of Statistics and Data Science
Full work available at URL: https://doi.org/10.1007/s42081-020-00101-z
model selection; nonconvex optimization; randomized algorithms; proximal operators; blockwise coordinate descent algorithms
Computational methods for problems pertaining to statistics (62-08)
Linear regression; mixed models (62J05)
Numerical mathematical programming methods (65K05)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- Best subset selection via a modern optimization lens
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- A majorization-minimization approach to variable selection using spike and slab priors
- Incremental proximal methods for large scale convex optimization
- Iterative hard thresholding for compressed sensing
- Introductory lectures on convex optimization. A basic course.
- Structured variable selection via prior-induced hierarchical penalty functions
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Proximal Splitting Methods in Signal Processing
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Robust Stochastic Approximation Approach to Stochastic Programming
- Sparse Reconstruction by Separable Approximation
- Adaptive Randomized Coordinate Descent for Sparse Systems: Lasso and Greedy Algorithms
- Distributed optimization with arbitrary local solvers
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Conditional Gradient Algorithms for Rank-One Matrix Approximations with a Sparsity Constraint
- On the Convergence of Block Coordinate Descent Type Methods
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- A Stochastic Approximation Method
- Convex analysis and monotone operator theory in Hilbert spaces
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm