The effect of deterministic noise in subgradient methods
From MaRDI portal
Publication:1960191
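The work indexed here analyzes subgradient methods whose subgradients are computed with bounded deterministic errors. As a minimal illustrative sketch of that setting (not the paper's algorithm; the objective, error level, and stepsize rule below are assumptions chosen for demonstration), consider minimizing f(x) = |x| with an ε-perturbed subgradient and a diminishing stepsize:

```python
# Illustrative sketch only: subgradient descent on f(x) = |x|, where each
# subgradient is corrupted by a bounded deterministic error eps. With the
# diminishing stepsize 1/(k+1), the iterates approach a neighborhood of
# the minimizer whose size scales with eps.

def noisy_subgradient_method(x0, eps, iters=2000):
    x = x0
    for k in range(iters):
        g = 1.0 if x > 0 else -1.0   # a subgradient of |x|
        g_noisy = g + eps            # deterministic, bounded perturbation
        x -= g_noisy / (k + 1)       # diminishing stepsize
    return x

x_final = noisy_subgradient_method(x0=5.0, eps=0.1)
```

Starting from x0 = 5.0, the divergent stepsize sum drives the iterate into a small neighborhood of the minimizer x* = 0, despite the persistent error in every subgradient evaluation.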
Recommendations
- Incremental subgradient methods for nondifferentiable optimization
- Inexact subgradient methods for quasi-convex optimization problems
- Convergence rate of incremental subgradient algorithms
- On convergence of the stochastic subgradient method with on-line stepsize rules
- Incremental stochastic subgradient algorithms for convex optimization
Cites work
- Scientific article; zbMATH DE number 1818892 (no title available)
- Scientific article; zbMATH DE number 4164577 (no title available)
- Scientific article; zbMATH DE number 4091201 (no title available)
- Scientific article; zbMATH DE number 2121575 (no title available)
- Scientific article; zbMATH DE number 3333703 (no title available)
- A Proximal Bundle Method with Approximate Subgradient Linearizations
- An Incremental Method for Solving Convex Finite Min-Max Problems
- Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization
- Convergence of a simple subgradient level method
- Decomposition into functions in the minimization problem
- Distributed Average Consensus With Dithered Quantization
- Distributed asynchronous incremental subgradient methods
- Error bounds in mathematical programming
- Error stability properties of generalized gradient-type algorithms
- Incremental subgradient methods for nondifferentiable optimization
- Nonlinear programming methods in the presence of noise
- Quantized consensus
- The ordered subsets mirror descent optimization method with applications to tomography
- Weak Sharp Minima in Mathematical Programming
- Stochastic quasigradient methods and their application to system optimization
Cited in (31)
- On the convergence of gradient-like flows with noisy gradient input
- Convergence of inexact quasisubgradient methods with extrapolation
- Incremental stochastic subgradient algorithms for convex optimization
- Zero-convex functions, perturbation resilience, and subgradient projections for feasibility-seeking methods
- Generalised gossip-based subgradient method for distributed optimisation
- Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization
- Subgradient method with feasible inexact projections for constrained convex optimization problems
- Distributed optimization with inexact oracle
- A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling
- Fault tolerant distributed portfolio optimization in smart grids
- A subgradient method with non-monotone line search
- A study on distributed optimization over large-scale networked systems
- Weak subgradient method for solving nonsmooth nonconvex optimization problems
- Incremental proximal methods for large scale convex optimization
- Faster subgradient methods for functions with Hölderian growth
- Minimizing Piecewise-Concave Functions Over Polyhedra
- A redistributed proximal bundle method for nonsmooth nonconvex functions with inexact information
- On finite termination of an inexact proximal point algorithm
- Stochastic derivative-free optimization using a trust region framework
- Adaptive Bundle Methods for Nonlinear Robust Optimization
- A splitting bundle approach for non-smooth non-convex minimization
- First-order methods of smooth convex optimization with inexact oracle
- A proximal bundle method for constrained nonsmooth nonconvex optimization with inexact information
- Abstract convergence theorem for quasi-convex optimization problems with applications
- Zeroth-order regularized optimization (ZORO): approximately sparse gradients and adaptive sampling
- Bundle method for non-convex minimization with inexact subgradients and function values
- Incremental quasi-subgradient methods for minimizing the sum of quasi-convex functions
- Stochastic mirror descent method for distributed multi-agent optimization
- A proximal bundle method for nonsmooth nonconvex functions with inexact information
- Inexact subgradient methods for quasi-convex optimization problems
- An infeasible-point subgradient method using adaptive approximate projections
This page was built for publication: The effect of deterministic noise in subgradient methods