Pages that link to "Item:Q4979860"
From MaRDI portal
The following pages link to On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging (Q4979860):
Displaying 26 items.
- New results on subgradient methods for strongly convex optimization problems with a unified analysis (Q316174)
- Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures (Q526834)
- Stochastic mirror descent method for distributed multi-agent optimization (Q1670526)
- On smoothing, regularization, and averaging in stochastic approximation methods for stochastic variational inequality problems (Q1680973)
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes (Q1711086)
- Fastest rates for stochastic mirror descent methods (Q2044496)
- Inexact stochastic subgradient projection method for stochastic equilibrium problems with nonmonotone bifunctions: application to expected risk minimization in machine learning (Q2045021)
- A stochastic primal-dual method for optimization with conditional value at risk constraints (Q2046691)
- Stochastic approximation method using diagonal positive-definite matrices for convex optimization with fixed point constraints (Q2138441)
- Algorithms for stochastic optimization with function or expectation constraints (Q2181600)
- Exploiting problem structure in optimization under uncertainty via online convex optimization (Q2316616)
- A family of subgradient-based methods for convex optimization problems in a unifying framework (Q2829570)
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization (Q2954396)
- String-averaging incremental stochastic subgradient algorithms (Q4631774)
- On Stochastic and Deterministic Quasi-Newton Methods for Nonstrongly Convex Optimization: Asymptotic Convergence and Rate Analysis (Q5107212)
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity (Q5231668)
- Generalised gossip-based subgradient method for distributed optimisation (Q5382986)
- On the Convergence of Mirror Descent beyond Stochastic Convex Programming (Q5853716)
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity (Q5869813)
- Probabilistic robustness estimates for feed-forward neural networks (Q6079061)
- A framework of convergence analysis of mini-batch stochastic projected gradient methods (Q6097385)
- Stochastic mirror descent method for linear ill-posed problems in Banach spaces (Q6101039)
- Stochastic approximation with discontinuous dynamics, differential inclusions, and applications (Q6103982)
- Federated learning for minimizing nonsmooth convex loss functions (Q6112869)
- Stochastic incremental mirror descent algorithms with Nesterov smoothing (Q6145577)
- SPIRAL: a superlinearly convergent incremental proximal algorithm for nonconvex finite sum minimization (Q6498409)