The following pages link to Finito (Q53976):
Displaying 26 items.
- Analysis of biased stochastic gradient descent using sequential semidefinite programs (Q2020610)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235)
- Stochastic DCA for minimizing a large sum of DC functions with application to multi-class logistic regression (Q2057761)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109)
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems (Q2133414)
- Linear convergence of cyclic SAGA (Q2193004)
- Proximal average approximated incremental gradient descent for composite penalty regularized empirical risk minimization (Q2398094)
- Cocoercivity, smoothness and bias in variance-reduced stochastic gradient methods (Q2674579)
- Variance reduction for root-finding problems (Q2689823)
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (Q4558545)
- (Q4637046)
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods (Q4641660)
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate (Q4641666)
- Forward-Backward-Half Forward Algorithm for Solving Monotone Inclusions (Q4687243)
- (Q4969167)
- (Q5053196)
- (Q5054622)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization (Q5076671)
- (Q5148937)
- A Smooth Inexact Penalty Reformulation of Convex Problems with Linear Constraints (Q5152474)
- Stochastic sub-sampled Newton method with variance reduction (Q5204645)
- A Distributed Flexible Delay-Tolerant Proximal Gradient Algorithm (Q5220423)
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration (Q5231671)
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning (Q5254990)
- IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate (Q5745078)
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity (Q5869813)