Subgradient methods for huge-scale optimization problems
Recommendations
- Primal-dual subgradient method for huge-scale linear conic problems
- Convergence rate of incremental subgradient algorithms
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Non-Euclidean restricted memory level method for large-scale convex optimization
- Incremental subgradient methods for nondifferentiable optimization
Cites work
- scientific article; zbMATH DE number 4164577
- scientific article; zbMATH DE number 4123531
- scientific article; zbMATH DE number 729680
- scientific article; zbMATH DE number 3894826
- Characterizations of linear suboptimality for mathematical programs with equilibrium constraints
- Efficiency of coordinate descent methods on huge-scale optimization problems
- First-order algorithm with \(\mathcal{O}(\ln(1/\epsilon))\) convergence for \(\epsilon\)-equilibrium in two-person zero-sum games
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Smooth minimization of non-smooth functions
Cited in (29)
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- An acceleration procedure for optimal first-order methods
- On stochastic accelerated gradient with convergence rate of regression learning
- Subgradient method with feasible inexact projections for constrained convex optimization problems
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- A subgradient method with non-monotone line search
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Generalized stochastic Frank-Wolfe algorithm with stochastic ``substitute'' gradient for structured convex optimization
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- On solving the densest \(k\)-subgraph problem on large graphs
- Nesterov's smoothing and excessive gap methods for an optimization problem in VLSI placement
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- Stochastic block mirror descent methods for nonsmooth and stochastic optimization
- On convergence of a \(q\)-random coordinate constrained algorithm for non-convex problems
- The substitution secant/finite difference method for large scale sparse unconstrained optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
- Efficient numerical methods to solve sparse linear equations with application to PageRank
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- Control analysis and design via randomised coordinate polynomial minimisation
- On the efficiency of a randomized mirror descent algorithm in online optimization problems
- Non-Euclidean restricted memory level method for large-scale convex optimization
- Adaptive subgradient methods for mathematical programming problems with quasiconvex functions
- Parallel coordinate descent methods for big data optimization
- scientific article; zbMATH DE number 1264398
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Numerical study of high-dimensional optimization problems using a modification of Polyak's method
- Primal-dual subgradient method for huge-scale linear conic problems
This page was built for publication: Subgradient methods for huge-scale optimization problems