Stochastic incremental mirror descent algorithms with Nesterov smoothing (Q6145577): Difference between revisions

From MaRDI portal
Set OpenAlex properties.
ReferenceBot (talk | contribs)
Changed an Item
Property / cites work
    Optimal subgradient methods: computational properties for large-scale linear inverse problems
    Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
    Duality Between Subgradient and Conditional Gradient Methods
    First-Order Methods in Optimization
    Mirror descent and nonlinear projected subgradient methods for convex optimization
    Smoothing and First Order Methods: A Unified Framework
    The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
    Variable smoothing for convex optimization problems using stochastic gradients
    Ergodic Mirror Descent
    A proximal method for solving nonlinear minmax location problems with perturbed minimal time functions via conjugate duality
    Inexact stochastic mirror descent for two-stage nonlinear stochastic programs
    Fastest rates for stochastic mirror descent methods
    Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
    MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
    Unifying mirror descent and dual averaging
    Stochastic mirror descent dynamics and their convergence in monotone variational inequalities
    On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging
    Q3889672
    Q3967358
    Primal-dual subgradient methods for convex problems
    Smooth minimization of non-smooth functions
    Lectures on convex optimization
    Smoothing techniques and difference of convex functions algorithms for image reconstructions
    Adaptive smoothing algorithms for nonsmooth composite convex minimization
    Convex Analysis
    A version of the mirror descent method to solve variational inequalities
    Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
    On the Convergence of Mirror Descent beyond Stochastic Convex Programming

Revision as of 15:21, 22 August 2024

scientific article; zbMATH DE number 7785651
Label (English): Stochastic incremental mirror descent algorithms with Nesterov smoothing

    Statements

    Stochastic incremental mirror descent algorithms with Nesterov smoothing (English)
    9 January 2024
    This paper presents two incremental stochastic mirror descent algorithms for minimizing a sum of finitely many non-smooth convex functions over a convex set. Instead of subgradients of the non-smooth summands, the algorithms use gradients of their Nesterov smoothings; the summands need not be Lipschitz continuous or differentiable. Numerical experiments are reported for three applications: classifying images with support vector machines, a problem in tomography, and a continuous location problem.
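    The idea reviewed above can be illustrated with a minimal sketch (not the paper's actual algorithms): stochastic incremental mirror descent with the entropic mirror map on the simplex, where each non-smooth summand |a_i^T x - b_i| is replaced by its Nesterov (Huber) smoothing so a gradient exists everywhere. All names, the toy problem, and the parameter choices are illustrative assumptions.

```python
import numpy as np

def huber_grad(r, mu):
    # Gradient of the Nesterov smoothing of |r| with parameter mu
    # (the Huber function): equals clip(r / mu, -1, 1).
    return np.clip(r / mu, -1.0, 1.0)

def smd_nesterov(A, b, mu=0.1, steps=2000, step=0.05, seed=0):
    """Sketch of stochastic incremental entropic mirror descent for
    min_{x in simplex} sum_i |a_i^T x - b_i|, with every absolute
    value replaced by its Nesterov (Huber) smoothing."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.full(n, 1.0 / n)          # start at the simplex centre
    x_avg = np.zeros(n)
    for _ in range(steps):
        i = rng.integers(m)          # incremental: one summand per step
        g = huber_grad(A[i] @ x - b[i], mu) * A[i]
        x = x * np.exp(-step * g)    # entropic mirror step, i.e.
        x /= x.sum()                 # exponentiated-gradient update
        x_avg += x
    return x_avg / steps             # ergodic average of the iterates

# Toy data: recover a point on the simplex from exact measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
x_true = np.array([0.7, 0.3, 0.0, 0.0, 0.0])
b = A @ x_true
x_hat = smd_nesterov(A, b)
obj = np.abs(A @ x_hat - b).sum()
```

    The multiplicative update followed by normalization is exactly the mirror step induced by the negative-entropy distance-generating function, which keeps the iterates strictly inside the simplex without any projection.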
    stochastic
    mirror descent
    Nesterov smoothing

    Identifiers