Pages that link to "Item:Q1949275"
From MaRDI portal
The following pages link to Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming (Q1949275):
Displayed 15 items.
- An adaptive accelerated first-order method for convex optimization (Q276852)
- OSGA: a fast subgradient algorithm with optimal complexity (Q304218)
- A Barzilai-Borwein type method for minimizing composite functions (Q494671)
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix (Q1677473)
- Optimal subgradient algorithms for large-scale convex optimization in simple domains (Q1689457)
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\) (Q1752352)
- Performance of first-order methods for smooth convex minimization: a novel approach (Q2248759)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123)
- Optimal subgradient methods: computational properties for large-scale linear inverse problems (Q2315075)
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization (Q2352420)
- A secant-based Nesterov method for convex functions (Q2361131)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems (Q2415906)
- Empirical risk minimization: probabilistic complexity and stepsize strategy (Q2419551)
- Hardy-type results on the average of the lattice point error term over long intervals (Q4604403)
- A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors (Q5346620)