Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
Publication: 1949275
DOI: 10.1007/s10107-012-0541-z
zbMath: 1297.90118
OpenAlex: W2134410028
MaRDI QID: Q1949275
Elizabeth W. Karas, Clóvis C. Gonzaga
Publication date: 6 May 2013
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-012-0541-z
Mathematics Subject Classification: Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25)
Related Items
- A secant-based Nesterov method for convex functions
- An adaptive accelerated first-order method for convex optimization
- OSGA: a fast subgradient algorithm with optimal complexity
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- IFORS' Operational Research Hall of Fame: Clóvis Caesar Gonzaga
- Hardy-type results on the average of the lattice point error term over long intervals
- A Barzilai-Borwein type method for minimizing composite functions
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Performance of first-order methods for smooth convex minimization: a novel approach
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
Cites Work
- Smooth minimization of non-smooth functions
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Smoothing technique and its applications in semidefinite optimization
- Introductory lectures on convex optimization. A basic course.
- Cubic regularization of Newton method and its global performance
- Improved Algorithms for Convex Minimization in Relative Scale
- Numerical Optimization
- Line search algorithms with guaranteed sufficient decrease