Lagrangian relaxations on networks by \(\varepsilon \)-subgradient methods
From MaRDI portal
Publication:2429403
DOI: 10.1007/s10957-011-9881-8
zbMath: 1254.90233
OpenAlex: W2082862030
Wikidata: Q57397209 (Scholia: Q57397209)
MaRDI QID: Q2429403
Publication date: 27 April 2012
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-011-9881-8
Cites Work
- An efficient method for nonlinearly constrained networks
- On large scale nonlinear network optimization
- Convergence of a simple subgradient level method
- Convergence of some algorithms for convex minimization
- An implementation of Newton-like methods on nonlinearly constrained networks
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- Approximate subgradient methods for nonlinearly constrained network flow problems
- Incremental Subgradient Methods for Nondifferentiable Optimization
- On the Efficiency of the ε-Subgradient Methods Over Nonlinearly Constrained Networks
- Approximate Primal Solutions and Rate Analysis for Dual Subgradient Methods
- Large-scale linearly constrained optimization
- Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization
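To illustrate the topic of the work above, here is a minimal sketch of dual ascent with ε-subgradients, the general technique the title names (this is an illustrative toy, not the paper's algorithm). We take the problem min ‖x‖² subject to Ax = b, whose Lagrangian L(x, λ) = ‖x‖² + λᵀ(Ax − b) has the exact inner minimizer x(λ) = −Aᵀλ/2; solving that subproblem only approximately yields an ε-subgradient Ax_ε − b of the dual function. The data A, b, the perturbation scale, and the stepsize rule are all assumptions chosen for the example.

```python
import numpy as np

# Toy problem: min ||x||^2  s.t.  Ax = b  (solution x = (0.5, 0.5), lam = -1)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

lam = np.zeros(1)
rng = np.random.default_rng(0)
for k in range(1, 500):
    x_exact = -A.T @ lam / 2.0
    # Inexact inner solve: a perturbation shrinking with k models the
    # epsilon-accuracy of the Lagrangian subproblem.
    x_eps = x_exact + rng.normal(scale=1e-3 / k, size=x_exact.shape)
    g = A @ x_eps - b            # epsilon-subgradient of the dual at lam
    lam = lam + (1.0 / k) * g    # diminishing-stepsize dual ascent

x = -A.T @ lam / 2.0             # primal recovery from the final multiplier
print(lam, x)                    # lam near [-1], x near [0.5, 0.5]
```

With a diminishing stepsize and vanishing inner-solve error, the multipliers converge to the dual optimum and the recovered primal point satisfies the constraint in the limit, which is the convergence behavior the ε-subgradient literature cited above analyzes in far greater generality.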