Sparse optimization on measures with over-parameterized gradient descent
DOI: 10.1007/s10107-021-01636-z · zbMATH Open: 1494.90082 · arXiv: 1907.10300 · OpenAlex: W3158438262 · MaRDI QID: Q2149558 · FDO: Q2149558
Authors: Lénaïc Chizat
Publication date: 29 June 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1907.10300
Recommendations
- Global convergence analysis of sparse regular nonconvex optimization problems
- Analysis of a two-layer neural network via displacement convexity
- Gradient descent with non-convex constraints: local concavity determines convergence
- Linear convergence of accelerated conditional gradient algorithms in spaces of measures
- On the minimization of a Tikhonov functional with a non-convex sparsity constraint
Mathematics Subject Classification:
- Numerical optimization and variational techniques (65K10)
- Nonconvex programming, global optimization (90C26)
- Numerical methods based on nonlinear programming (49M37)
Cites Work
- Optimal entropy-transport problems and a new Hellinger-Kantorovich distance between positive measures
- Optimal transport for applied mathematicians. Calculus of variations, PDEs, and modeling
- Gradient flows in metric spaces and in the space of probability measures
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- A course in metric geometry
- Optimization with sparsity-inducing penalties
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- On the Rate of Convergence of Empirical Measures in ∞-transportation Distance
- Sparse modeling for image and vision processing
- Towards a Mathematical Theory of Super‐resolution
- A new optimal transport distance on the space of finite Radon measures
- An interpolating distance between optimal transport and Fisher-Rao metrics
- Exact reconstruction using Beurling minimal extrapolation
- Compressed Sensing Off the Grid
- A JKO Splitting Scheme for Kantorovich--Fisher--Rao Gradient Flows
- Positive trigonometric polynomials and signal processing applications
- Exact support recovery for sparse spikes deconvolution
- Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance
- Probabilistic representation and uniqueness results for measure-valued solutions of transport equations
- Poincaré and logarithmic Sobolev inequalities by decomposition of the energy landscape
- Exact Solutions to Super Resolution on Semi-Algebraic Domains in Higher Dimensions
- Unbalanced optimal transport: dynamic and Kantorovich formulations
- Inverse problems in spaces of measures
- On Representer Theorems and Convex Regularization
- The Alternating Descent Conditional Gradient Method for Sparse Inverse Problems
- Natural gradient via optimal transport
- The basins of attraction of the global minimizers of the non-convex sparse spike estimation problem
- The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy
- A mean field view of the landscape of two-layer neural networks
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Exact solutions of infinite dimensional total-variation regularized problems
- A family of functional inequalities: Łojasiewicz inequalities and displacement convex functions
- Mean Field Analysis of Neural Networks: A Law of Large Numbers
- Kurdyka–Łojasiewicz–Simon inequality for gradient flows in metric spaces
- Accelerated information gradient flow
- On the linear convergence rates of exchange and continuous methods for total variation minimization
Cited In (9)
- Estimation of off-the grid sparse spikes with over-parametrized projected gradient descent: theory and application
- Proximal methods for point source localisation
- A rigorous framework for the mean field limit of multilayer neural networks
- On the uniqueness of solutions for the basis pursuit in the continuum
- Learning sparse features can lead to overfitting in neural networks
- Localization of point scatterers via sparse optimization on measures
- Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary
- Regularizing Orientation Estimation in Cryogenic Electron Microscopy Three-Dimensional Map Refinement through Measure-Based Lifting over Riemannian Manifolds
- Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation