The following pages link to Anton Rodomanov (Q2031937):
Displaying 10 items.
- New results on superlinear convergence of classical quasi-Newton methods (Q2031938)
- Rates of superlinear convergence for classical quasi-Newton methods (Q2149549)
- Smoothness parameter of power of Euclidean norm (Q2178876)
- A Randomized Coordinate Descent Method with Volume Sampling (Q3300772)
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence (Q5853572)
- Subgradient ellipsoid method for nonsmooth convex problems (Q6038646)
- Universal Gradient Methods for Stochastic Convex Optimization (Q6520435)
- Global Complexity Analysis of BFGS (Q6531761)
- Universality of AdaGrad Stepsizes for Stochastic Optimization: Inexact Oracle, Acceleration and Variance Reduction (Q6732032)
- Optimizing $(L_0, L_1)$-Smooth Functions by Gradient Methods (Q6748803)