Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization

From MaRDI portal
Publication:2954396

DOI: 10.1137/130936361
zbMath: 1353.90095
arXiv: 1309.2249
OpenAlex: W1975768153
MaRDI QID: Q2954396

Guanghui Lan, Cong D. Dang

Publication date: 13 January 2017

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1309.2249
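For context, a minimal sketch of the stochastic block mirror descent iteration named in the title, assuming a Euclidean distance-generating function (so each prox-mapping reduces to a plain gradient step on the sampled block) and uniform block sampling. The toy least-squares problem, the function names, and all parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sbmd(grad_oracle, x0, blocks, steps, n_iters, rng):
    """Stochastic block mirror descent, Euclidean-prox sketch.

    Each iteration samples one block of coordinates uniformly at
    random, queries a stochastic gradient restricted to that block,
    and updates only those coordinates. Returns the averaged iterate.
    """
    x = x0.copy()
    avg = np.zeros_like(x)
    for k in range(n_iters):
        idx = blocks[rng.integers(len(blocks))]   # sample a block
        g = grad_oracle(x, idx)                   # stochastic block gradient
        x[idx] -= steps[k] * g                    # Euclidean mirror step
        avg += x
    return avg / n_iters

# Illustrative problem: min_x E || A x - (b + noise) ||^2 / 2
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_star = rng.standard_normal(10)
b = A @ x_star

def grad_oracle(x, idx):
    # Noisy residual models the stochastic first-order oracle.
    noise = 0.1 * rng.standard_normal(A.shape[0])
    r = A @ x - (b + noise)
    return A[:, idx].T @ r

blocks = [np.arange(0, 5), np.arange(5, 10)]
steps = [1e-3] * 20000
x_hat = sbmd(grad_oracle, np.zeros(10), blocks, steps, 20000, rng)
```

Under these assumptions the averaged iterate `x_hat` approaches the least-squares solution `x_star`; a non-Euclidean distance-generating function would replace the subtraction step with a block prox-mapping.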




Related Items (30)

Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes
Block coordinate type methods for optimization and learning
A fully stochastic second-order trust region method
On optimal probabilities in stochastic coordinate descent methods
Distributed constraint-coupled optimization via primal decomposition over random time-varying graphs
Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
Block Policy Mirror Descent
On the convergence of asynchronous parallel iteration with unbounded delays
A stochastic variance reduction algorithm with Bregman distances for structured composite problems
Block mirror stochastic gradient method for stochastic optimization
Unifying framework for accelerated randomized methods in convex optimization
Penalty methods with stochastic approximation for stochastic nonlinear programming
On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
Optimization-Based Calibration of Simulation Input Models
Conditional gradient type methods for composite nonlinear and stochastic optimization
Recent Theoretical Advances in Non-Convex Optimization
A Method with Convergence Rates for Optimization Problems with Variational Inequality Constraints
Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems
Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
Point process estimation with Mirror Prox algorithms
An accelerated directional derivative method for smooth stochastic convex optimization
A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods
Fastest rates for stochastic mirror descent methods
Markov chain block coordinate descent
Randomized primal-dual proximal block coordinate updates
Accelerated Stochastic Algorithms for Nonconvex Finite-Sum and Multiblock Optimization
A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
Accelerated gradient methods for nonconvex nonlinear and stochastic programming


