Pages that link to "Item:Q5085262"
From MaRDI portal
The following pages link to Primal–dual accelerated gradient methods with small-dimensional relaxation oracle (Q5085262):
Displaying 14 items.
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions (Q2026726)
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach (Q2031939)
- Improved exploitation of higher order smoothness in derivative-free optimization (Q2162687)
- Stochastic saddle-point optimization for the Wasserstein barycenter problem (Q2162697)
- Alternating minimization methods for strongly convex optimization (Q2232092)
- Network Utility Maximization by Updating Individual Transmission Rates (Q5054167)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- Inexact model: a framework for optimization and variational inequalities (Q5865338)
- Accelerated methods for weakly-quasi-convex optimization problems (Q6134309)
- Multistage transportation model and sufficient conditions for its potentiality (Q6153312)
- Smooth monotone stochastic variational inequalities and saddle point problems: a survey (Q6160072)
- Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems (Q6601206)
- Geodesic convexity of the symmetric eigenvalue problem and convergence of steepest descent (Q6636815)
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods (Q6644994)