Pages that link to "Item:Q5525645"
From MaRDI portal
The following pages link to Some methods of speeding up the convergence of iteration methods (Q5525645):
- On the convergence analysis of the optimized gradient method (Q511969)
- A proximal difference-of-convex algorithm with extrapolation (Q1744881)
- Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition (Q2185767)
- Proximal extrapolated gradient methods for variational inequalities (Q4638915)
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems (Q4687235)
- Subsampled Hessian Newton Methods for Supervised Learning (Q5380307)