Smooth over-parameterized solvers for non-smooth structured optimization
Publication: 6110460
DOI: 10.1007/s10107-022-01923-3
zbMath: 1522.90140
arXiv: 2205.01385
OpenAlex: W4319459525
MaRDI QID: Q6110460
Publication date: 1 August 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2205.01385
Mathematics Subject Classification:
- Nonconvex programming, global optimization (90C26)
- Numerical optimization and variational techniques (65K10)
- Learning and adaptive systems in artificial intelligence (68T05)
- Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Cites Work
- Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Nonlinear total variation based noise removal algorithms
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Schur complements and its applications to symmetric nonnegative and \(Z\)-matrices
- Convex multi-task feature learning
- Locally adaptive regression splines
- Image recovery via total variation minimization and related problems
- An algorithm for total variation minimization and applications
- A variational approach to remove outliers and impulse noise
- Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Regularizers for structured sparsity
- Single-exponential bounds for the smallest singular value of Vandermonde matrices in the sub-Rayleigh regime
- Adaptive restart for accelerated gradient schemes
- A proximal point analysis of the preconditioned alternating direction method of multipliers
- Optimization with Sparsity-Inducing Penalties
- \(\chi^2\)-Confidence Sets in High-Dimensional Regression
- Robust principal component analysis?
- Trace Norm Regularization: Reformulations, Algorithms, and Multi-Task Learning
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Iteratively reweighted least squares minimization for sparse recovery
- Two-Point Step Size Gradient Methods
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Algorithms for Separable Nonlinear Least Squares Problems
- Variational Analysis
- Separable nonlinear least squares: the variable projection method and its applications
- Gap Safe screening rules for sparsity enforcing penalties
- Adapting Regularized Low-Rank Models for Parallel Architectures
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Iterative Methods for Total Variation Denoising
- Safe Feature Elimination in Sparse Supervised Learning
- Inverse problems in spaces of measures
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
- On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
- Sparse regularization on thin grids I: the Lasso
- Towards a Mathematical Theory of Super‐resolution
- Model Selection and Estimation in Regression with Grouped Variables
- Variable metric forward–backward splitting with applications to monotone inclusions in duality
- The Differentiation of Pseudo-Inverses and Nonlinear Least Squares Problems Whose Variables Separate
- Generalized Projection Operators in Banach Spaces: Properties and Applications
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Local linear convergence analysis of Primal–Dual splitting methods
- Sparse Image and Signal Processing
- Convergence rates of gradient methods for convex optimization in the space of measures