A Laplacian approach to \(\ell_1\)-norm minimization
DOI: 10.1007/s10589-021-00270-x
zbMATH Open: 1473.90157
arXiv: 1901.08836
OpenAlex: W3139482808
MaRDI QID: Q2044482
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Abstract: We propose a novel differentiable reformulation of the linearly-constrained \(\ell_1\)-minimization problem, also known as the basis pursuit problem. The reformulation is inspired by the Laplacian paradigm of network theory and leads to a new family of gradient-based methods for the solution of \(\ell_1\)-minimization problems. We analyze the iteration complexity of a natural solution approach to the reformulation, based on a multiplicative weights update scheme, as well as the iteration complexity of an accelerated gradient scheme. The results can be seen as bounds on the complexity of iteratively reweighted least squares (IRLS) type methods for basis pursuit.
Full work available at URL: https://arxiv.org/abs/1901.08836
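The abstract frames the results as complexity bounds on IRLS-type methods for basis pursuit, i.e. minimizing \(\|x\|_1\) subject to \(Ax = b\). As background, a minimal sketch of the classical IRLS iteration for this problem is given below (in the spirit of Daubechies et al., cited among the works below); the function name, the fixed smoothing parameter `eps`, and the iteration count are illustrative choices, not the paper's own algorithm, which instead analyzes a Laplacian-inspired multiplicative weights reformulation.

```python
import numpy as np

def irls_basis_pursuit(A, b, num_iters=300, eps=1e-6):
    """Hypothetical IRLS sketch for min ||x||_1 subject to Ax = b.

    Each step solves a weighted least-squares problem whose minimizer
    has the closed form x = D A^T (A D A^T)^{-1} b with
    D = diag(|x_i| + eps), so every iterate is feasible (Ax = b).
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # least-squares initialization
    for _ in range(num_iters):
        D = np.abs(x) + eps        # per-coordinate reweighting
        M = (A * D) @ A.T          # A D A^T (columns of A scaled by D)
        y = np.linalg.solve(M, b)
        x = D * (A.T @ y)          # feasible weighted-LS solution
    return x
```

Under standard compressed-sensing conditions (a random Gaussian matrix and a sufficiently sparse signal), iterates of this scheme typically converge to the sparse \(\ell_1\) minimizer; the paper's contribution is to bound the iteration complexity of related reweighting schemes.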
Keywords: convex optimization; multiplicative weights; iteratively reweighted least squares; basis pursuit; \(\ell_1\) regression; Laplacian paradigm
Cites Work
- The graphical lasso: new insights and alternatives
- PhaseMax: Convex Phase Retrieval via Basis Pursuit
- Smooth minimization of non-smooth functions
- First-Order Methods in Optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Convex Analysis
- For most large underdetermined systems of linear equations the minimal \(\ell_1\)-norm solution is also the sparsest solution
- Optimization with sparsity-inducing penalties
- A mathematical introduction to compressive sensing
- Evolutionary Games and Population Dynamics
- Bregman Iterative Algorithms for \(\ell_1\)-Minimization with Applications to Compressed Sensing
- Iteratively reweighted least squares minimization for sparse recovery
- A mathematical model for adaptive transport network in path finding by true slime mold
- Linear models and generalizations. Least squares and alternatives. With contributions by Michael Schomaker.
- Atomic decomposition by basis pursuit
- The multiplicative weights update method: a meta-algorithm and applications
- On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
- Electrical flows, Laplacian systems, and faster approximation of maximum flow in undirected graphs
- Hessian Riemannian Gradient Flows in Convex Programming
- Convergence of the linearized Bregman iteration for \(\ell_1\)-norm minimization
- Minimizing Effective Resistance of a Graph
- Natural Algorithms for Flow Problems
- An improved algorithm for basis pursuit problem and its applications
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- On the Convergence Time of a Natural Dynamics for Linear Programming
- Advanced Time Series Data Analysis
- Two results on slime mold computations
- Runtime guarantees for regression problems
- Towards a Stationary Monge--Kantorovich Dynamics: The Physarum Polycephalum Experience
Cited In (5)
Uses Software
This page was built for publication: A Laplacian approach to \(\ell_1\)-norm minimization