Finito
From MaRDI portal
Software:53976
No author found.
Source code repository: https://github.com/kul-forbes/CIAOAlgorithms.jl
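Because this catalog entry contains no description of the method itself, the sketch below illustrates in plain Julia the Finito/MISO-style incremental aggregated gradient idea the software is associated with: a table of per-component iterates and gradients whose running averages define each new point. It is a minimal illustration under the assumption of mu-strongly convex components, with hypothetical names throughout; it does not use or reflect the CIAOAlgorithms.jl API.

    # Illustrative sketch only, not the CIAOAlgorithms.jl API: a Finito/MISO-style
    # incremental aggregated gradient loop for minimizing (1/n) * sum_i f_i(x),
    # demonstrated on a ridge-regularized least-squares problem whose components
    # f_i are mu-strongly convex (the step also assumes n is large relative to
    # L/mu, the "big data" condition from the Finito paper).
    using LinearAlgebra, Random

    # f_i(x) = 0.5 * (a_i' * x - b_i)^2 + (mu / 2) * ||x||^2
    grad_fi(x, a, b, mu) = (dot(a, x) - b) .* a .+ mu .* x

    function finito_sketch(A, b; mu = 1.0, epochs = 50, seed = 0)
        Random.seed!(seed)
        n, d = size(A)
        # Per-component tables of past iterates z_i and their gradients g_i.
        z = [zeros(d) for _ in 1:n]
        g = [grad_fi(z[i], A[i, :], b[i], mu) for i in 1:n]
        zbar = sum(z) / n          # running average of the table points
        gbar = sum(g) / n          # running average of the table gradients
        x = zeros(d)
        for _ in 1:(epochs * n)
            # Aggregated step: minimize the average of the quadratic lower bounds
            # f_i(z_i) + g_i' * (x - z_i) + (mu/2) * ||x - z_i||^2, which gives
            # x = mean(z_i) - (1 / mu) * mean(g_i).
            x = zbar .- gbar ./ mu
            # Refresh one randomly chosen table entry and the running averages.
            j = rand(1:n)
            gj = grad_fi(x, A[j, :], b[j], mu)
            zbar .+= (x .- z[j]) ./ n
            gbar .+= (gj .- g[j]) ./ n
            z[j] = copy(x)
            g[j] = gj
        end
        return x
    end

    # Usage on synthetic data: the full gradient at the result should be near zero.
    A = randn(100, 10); b = randn(100)
    x = finito_sketch(A, b)
    println(norm(A' * (A * x .- b) ./ 100 .+ 1.0 .* x))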
Related Items (26)
Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Proximal average approximated incremental gradient descent for composite penalty regularized empirical risk minimization
Unnamed Item
Cocoercivity, smoothness and bias in variance-reduced stochastic gradient methods
Linear convergence of cyclic SAGA
Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
Variance reduction for root-finding problems
A Smooth Inexact Penalty Reformulation of Convex Problems with Linear Constraints
Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
Unnamed Item
Unnamed Item
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate
Analysis of biased stochastic gradient descent using sequential semidefinite programs
Forward-Backward-Half Forward Algorithm for Solving Monotone Inclusions
Stochastic sub-sampled Newton method with variance reduction
Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
A Distributed Flexible Delay-Tolerant Proximal Gradient Algorithm
Stochastic DCA for minimizing a large sum of DC functions with application to multi-class logistic regression
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
Unnamed Item
Unnamed Item
Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
A hybrid stochastic optimization framework for composite nonconvex optimization