Pages that link to "Item:Q403646"
From MaRDI portal
The following pages link to Subgradient methods for huge-scale optimization problems (Q403646):
Displaying 14 items.
- Parallel coordinate descent methods for big data optimization (Q263212)
- Nesterov's smoothing and excessive gap methods for an optimization problem in VLSI placement (Q489145)
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization (Q519779)
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix (Q1677473)
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization (Q2020608)
- Numerical study of high-dimensional optimization problems using a modification of Polyak's method (Q2048784)
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum (Q2287166)
- On the efficiency of a randomized mirror descent algorithm in online optimization problems (Q2354481)
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions (Q2359139)
- A subgradient method with non-monotone line search (Q2696908)
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization (Q2954396)
- Adaptive subgradient methods for mathematical programming problems with quasiconvex functions (Q6154812)
- Faster first-order primal-dual methods for linear programming using restarts and sharpness (Q6165583)
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming (Q6188510)