Finito
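Finito is the incremental gradient method of Defazio, Domke and Caetano (2014) for minimizing finite sums f(x) = (1/n) * sum_i f_i(x): it keeps one stored point phi_i (and its gradient) per component and refreshes a single component each iteration. As a rough orientation only, below is a minimal Python sketch of a Finito-style update; the function signature, parameter names, and the free step size alpha are illustrative assumptions, not the interface of any package cited below.

```python
import numpy as np

def finito(grad_fns, x0, alpha, n_iters, seed=0):
    """Minimal sketch of a Finito-style update for f(x) = (1/n) * sum_i f_i(x).

    grad_fns : list of callables, grad_fns[i](x) -> gradient of f_i at x
    alpha    : step size (in the original method it is tied to the
               strong-convexity constant; treated here as a free parameter)
    """
    rng = np.random.default_rng(seed)
    n = len(grad_fns)
    phi = [np.asarray(x0, dtype=float).copy() for _ in range(n)]  # stored points
    grads = [g(p) for g, p in zip(grad_fns, phi)]                 # stored gradients
    phi_bar = np.mean(phi, axis=0)     # running mean of the phi_i
    grad_bar = np.mean(grads, axis=0)  # running mean of the stored gradients
    for _ in range(n_iters):
        w = phi_bar - alpha * grad_bar  # step: mean point minus scaled mean gradient
        j = int(rng.integers(n))        # refresh one component, chosen at random
        phi_bar += (w - phi[j]) / n     # update running means incrementally
        new_grad = grad_fns[j](w)
        grad_bar += (new_grad - grads[j]) / n
        phi[j], grads[j] = w, new_grad
    return phi_bar
```

On a toy problem with f_i(x) = 0.5 * ||x - a_i||^2, for example, the iterates of this sketch approach the minimizer (1/n) * sum_i a_i.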
Cited in (33)
- Surpassing gradient descent provably: a cyclic incremental method with linear convergence rate
- scientific article; zbMATH DE number 7306860 (no title available)
- Adaptivity of stochastic gradient methods for nonconvex optimization
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Stochastic nested variance reduction for nonconvex optimization
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Catalyst acceleration for first-order convex optimization: from theory to practice
- A distributed flexible delay-tolerant proximal gradient algorithm
- A smooth inexact penalty reformulation of convex problems with linear constraints
- IQN: an incremental quasi-Newton method with local superlinear convergence rate
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
- ARock
- Cyanure
- ProxSARAH
- Saga
- Proximal average approximated incremental gradient descent for composite penalty regularized empirical risk minimization
- Global convergence rate of proximal incremental aggregated gradient methods
- Distributed stochastic variance reduced gradient methods by sampling extra data with replacement
- A hybrid stochastic optimization framework for composite nonconvex optimization
- Cocoercivity, smoothness and bias in variance-reduced stochastic gradient methods
- Stochastic DCA for minimizing a large sum of DC functions with application to multi-class logistic regression
- Forward-Backward-Half Forward Algorithm for Solving Monotone Inclusions
- adaQN
- Alpaqa
- scientific article; zbMATH DE number 7626722 (no title available)
- Variance reduction for root-finding problems
- Linear convergence of cyclic SAGA
- scientific article; zbMATH DE number 7625177 (no title available)
- Stochastic sub-sampled Newton method with variance reduction
- Bregman Finito/MISO for nonconvex regularized finite sum minimization without Lipschitz gradient continuity
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- SpiderBoost