Stochastic modified flows for Riemannian stochastic gradient descent
DOI: 10.1137/24M163863X · MaRDI QID: Q6658239 · FDO: Q6658239
Authors: Benjamin Gess, Sebastian Kassing, Nimit Rana
Publication date: 8 January 2025
Published in: SIAM Journal on Control and Optimization
Keywords: supervised learning; diffusion approximation; weak error; Riemannian gradient flow; Riemannian stochastic gradient descent
Mathematics Subject Classification:
- 60J20 Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.)
- 65K05 Numerical mathematical programming methods
- 62L20 Stochastic approximation
- 58J65 Diffusion processes and stochastic analysis on manifolds
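The algorithm this article studies, Riemannian stochastic gradient descent, can be illustrated with a minimal sketch. The example below is not taken from the paper: it assumes the simplest setting of a Rayleigh-quotient objective on the unit sphere, uses metric projection as the retraction (cf. the cited works on projection-like retractions and online principal component analysis), and all names and parameter values are illustrative.

```python
import numpy as np

# Minimal sketch of Riemannian SGD on the unit sphere S^{n-1},
# minimizing f(x) = x^T A x with noisy gradient estimates.
# Retraction: metric projection x -> x / ||x||.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetric objective matrix

x = rng.standard_normal(n)
x /= np.linalg.norm(x)  # initialize on the sphere

eta = 0.05  # step size (constant, for illustration)
for step in range(2000):
    noise = 0.1 * rng.standard_normal((n, n))
    g = 2 * (A + noise) @ x       # noisy Euclidean gradient of x^T A x
    rg = g - (x @ g) * x          # project onto the tangent space at x
    x = x - eta * rg              # take the stochastic gradient step
    x /= np.linalg.norm(x)        # retract back onto the sphere

# x now fluctuates near the eigenvector of A with smallest eigenvalue
print(np.linalg.norm(x))  # iterate remains on the sphere: ~1.0
```

With a decaying step size the iterates converge to a minimizer; with the constant step used here they fluctuate around it, which is the regime where diffusion (stochastic modified equation) approximations of the discrete dynamics become relevant.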
Cites Work
- The Geometry of Algorithms with Orthogonality Constraints
- Title not available
- Stochastic analysis on manifolds
- Lie groups
- Title not available
- Stochastic Equations in Infinite Dimensions
- Properties at infinity of diffusion semigroups and stochastic flows via weak uniform covers
- Fisher information distance: a geometrical reading
- The central limit problem for geodesic random walks
- Normally hyperbolic invariant manifolds. The noncompact case
- Strong \(p\)-completeness of stochastic differential equations and the existence of smooth flows on noncompact manifolds
- Stochastic Gradient Descent on Riemannian Manifolds
- Projection-like retractions on matrix manifolds
- Title not available
- Cylindrical Wiener processes
- Stochastic partial differential equations: an introduction
- Stochastic approximation on Riemannian manifolds
- Uniform-in-time weak error analysis for stochastic gradient descent algorithms via diffusion approximation
- Gradient algorithms for principal component analysis
- Stochastic modified equations and dynamics of stochastic gradient algorithms. I: Mathematical foundations
- On the diffusion approximation of nonconvex stochastic gradient descent
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Existence, uniqueness and regularity of the projection onto differentiable manifolds
- Semigroups of stochastic gradient descent and online principal component analysis: properties and diffusion approximations
- Convergence rates for the stochastic gradient descent method for non-convex objective functions
- On minimal representations of shallow ReLU networks
- Cooling down stochastic differential equations: Almost sure convergence
- An Introduction to Optimization on Smooth Manifolds
- Stochastic gradient descent with noise of machine learning type. II: Continuous time analysis
- Robust implicit regularization via weight normalization
- On uniform-in-time diffusion approximation for stochastic gradient descent
- On closed-form expressions for the Fisher-Rao distance