Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures (Q526834)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures | scientific article |
Statements
Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures (English)
15 May 2017
The author considers the convex stochastic optimization problem \[ \min_{x \in X} f(x) := \mathcal{R}\left[ g(x,\xi) \right], \tag{1} \] where \(\xi \in L_{p}\left( \Omega, \mathcal{F}, \mathbb{P}; \mathbb{R}^{s}\right)\) is a random vector with support \(\Xi\), \(g : E \times \mathbb{R}^{s} \rightarrow \mathbb{R}\) is a Borel function that is convex in \(x\) for every \(\xi\) and \(\mathbb{P}\)-summable in \(\xi\) for every \(x\), \(X\) is a closed and bounded convex set in a Euclidean space \(E\), and \(\mathcal{R}\) is an extended polyhedral risk measure. The author obtains nonasymptotic confidence intervals for the optimal value of (1) that can be computed online, using variants of the stochastic mirror descent (SMD) algorithm as estimators of the optimal value. When the objective function is uniformly convex, he also proposes a multistep extension of the SMD algorithm and obtains confidence intervals on both the optimal value and optimal solutions. On two stochastic optimization problems, the author shows that the multistep SMD algorithm can obtain ``good'' solutions much faster than the plain SMD algorithm.
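The paper's risk-averse multistep variant is not reproduced here; the following is only a minimal sketch of a single-stage stochastic mirror descent step with the entropy mirror map on the probability simplex. The objective \(\mathbb{E}\,\|x-\xi\|^2\), the step-size schedule, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def smd_entropy_simplex(sample_xi, n, n_iters=5000, step0=0.5, rng=None):
    """Sketch of stochastic mirror descent (SMD) on the probability simplex.

    Illustrative objective: f(x) = E[ ||x - xi||^2 ]  (NOT the paper's
    risk-averse objective); `sample_xi(rng)` draws one realisation of xi.
    Returns the averaged iterate, the usual SMD estimator of a solution.
    """
    rng = rng or np.random.default_rng(0)
    x = np.full(n, 1.0 / n)           # start at the barycentre of the simplex
    x_avg = np.zeros(n)
    for t in range(1, n_iters + 1):
        xi = sample_xi(rng)
        grad = 2.0 * (x - xi)         # stochastic subgradient of ||x - xi||^2
        eta = step0 / np.sqrt(t)      # O(1/sqrt(t)) step-size schedule
        x = x * np.exp(-eta * grad)   # entropic (multiplicative) mirror step
        x /= x.sum()                  # Bregman projection back onto the simplex
        x_avg += (x - x_avg) / t      # running average of iterates
    return x_avg

if __name__ == "__main__":
    n = 5
    target = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
    draw = lambda rng: target + 0.05 * rng.standard_normal(n)
    print(smd_entropy_simplex(draw, n))
```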
stochastic optimization
risk measures
multistep stochastic mirror descent
robust stochastic optimization