Algorithms of robust stochastic optimization based on mirror descent method
From MaRDI portal
Abstract: We propose an approach to the construction of robust non-Euclidean iterative algorithms for convex composite stochastic optimization, based on truncation of the stochastic gradients. For such algorithms, we establish sub-Gaussian confidence bounds under weak assumptions on the tails of the noise distribution, in both the convex and the strongly convex settings. Robust estimates of the accuracy of general stochastic algorithms are also proposed.
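The abstract's core idea (mirror descent whose stochastic gradients are truncated to tame heavy-tailed noise) can be illustrated with a short sketch. This is not the paper's exact algorithm: the truncation level `lam`, the step size `eta`, and the clipping rule `g * min(1, lam / ||g||_inf)` are illustrative choices, shown here for entropic mirror descent on the probability simplex.

```python
import numpy as np

def truncated_mirror_descent(grad_oracle, x0, steps, eta, lam):
    """Entropic mirror descent on the probability simplex with
    truncated (norm-clipped) stochastic gradients.

    Illustrative sketch only: `lam` is a hypothetical truncation level,
    and rescaling g by min(1, lam / ||g||_inf) is one common way to
    truncate heavy-tailed gradient noise.
    """
    x = x0.copy()
    for _ in range(steps):
        g = grad_oracle(x)
        norm = np.max(np.abs(g))        # sup-norm: the dual norm for the simplex setup
        if norm > lam:                  # truncate overly large stochastic gradients
            g = g * (lam / norm)
        w = x * np.exp(-eta * g)        # entropic (multiplicative-weights) update
        x = w / w.sum()                 # Bregman projection back onto the simplex
    return x

# Toy example: minimize E[<a + noise, x>] over the simplex with
# heavy-tailed (Student-t, 2 degrees of freedom) gradient noise.
# The minimizer concentrates on the smallest coordinate of a.
rng = np.random.default_rng(0)
a = np.array([0.9, 0.1, 0.5])
oracle = lambda x: a + rng.standard_t(df=2, size=3)
x_hat = truncated_mirror_descent(oracle, np.ones(3) / 3,
                                 steps=2000, eta=0.05, lam=1.0)
```

Because the clipping rescales the whole gradient vector by a common factor, the relative ordering of the coordinates within each sample is preserved, which is why the iterates still drift toward the true minimizer despite the heavy tails.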
Recommendations
- Algorithms of inertial mirror descent in convex problems of stochastic optimization
- Algorithms of inertial mirror descent in stochastic convex optimization problems
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Primal-dual mirror descent method for constrained stochastic optimization problems
- Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures
Cites work
- scientific article; zbMATH DE number 3924653 (no title available)
- scientific article; zbMATH DE number 3790207 (no title available)
- scientific article; zbMATH DE number 1110192 (no title available)
- scientific article; zbMATH DE number 3320125 (no title available)
- Accuracy Guarantees for $\ell_1$-Recovery
- Adaptive estimation algorithms (convergence, optimality, stability)
- An optimal method for stochastic composite optimization
- Analysis of robust stochastic approximation algorithms for process identification
- Bandits With Heavy Tail
- Challenging the empirical mean and empirical variance: a deviation study
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Convergence and robustness of the Robbins-Monro algorithm truncated at randomly varying bounds
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Geometric median and robust estimation in Banach spaces
- Loss minimization and parameter estimation with heavy tails
- On tail probabilities for martingales
- Online estimation of the geometric median in Hilbert spaces: nonasymptotic confidence balls
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Optimal and robust kernel algorithms for passive stochastic approximation
- Risk minimization by median-of-means tournaments
- Robust Estimation of a Location Parameter
- Robust Statistics
- Robust Stochastic Approximation Approach to Stochastic Programming
- Robust classification via MOM minimization
- Robust estimation using the Robbins-Monro stochastic approximation algorithm
- Robust estimation via stochastic approximation
- Robust identification
- Robust linear least squares regression
- Robust pseudogradient adaptation algorithms
- Robustness analysis for stochastic approximation algorithms
- Stochastic approximation for multivariate and functional median
- Sub-Gaussian estimators of the mean of a random vector
- Sub-Gaussian mean estimators
- The 1972 Wald Lecture Robust Statistics: A Review
Cited in (13)
- ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
- Validation analysis of mirror descent stochastic approximation method
- scientific article; zbMATH DE number 7370566 (no title available)
- Algorithms of inertial mirror descent in convex problems of stochastic optimization
- Algorithms of inertial mirror descent in stochastic convex optimization problems
- Nonlinear Gradient Mappings and Stochastic Optimization: A General Framework with Applications to Heavy-Tail Noise
- Saddle point mirror descent algorithm for the robust PageRank problem
- Optimal robust mean and location estimation via convex programs with respect to any pseudo-norms
- High probability bounds for stochastic subgradient schemes with heavy tailed noise
- Algorithms with gradient clipping for stochastic optimization with heavy-tailed noise
- High-probability complexity bounds for non-smooth stochastic convex optimization with heavy-tailed noise
- scientific article; zbMATH DE number 7625199 (no title available)
- Gradient-free methods for non-smooth convex stochastic optimization with heavy-tailed noise on convex compact
This page was built for publication: Algorithms of robust stochastic optimization based on mirror descent method
MaRDI item Q2289049