Pages that link to "Item:Q403634"
From MaRDI portal
The following pages link to First-order methods of smooth convex optimization with inexact oracle (Q403634):
Displaying 50 items.
- An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints (Q255080)
- An adaptive accelerated first-order method for convex optimization (Q276852)
- OSGA: a fast subgradient algorithm with optimal complexity (Q304218)
- Inexact coordinate descent: complexity and preconditioning (Q306308)
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517)
- New results on subgradient methods for strongly convex optimization problems with a unified analysis (Q316174)
- Universal gradient methods for convex optimization problems (Q494332)
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization (Q519779)
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle (Q727222)
- On the resolution of misspecified convex optimization and monotone variational inequality problems (Q782913)
- A flexible coordinate descent method (Q1639710)
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models (Q1646566)
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization (Q1670100)
- Optimal subgradient algorithms for large-scale convex optimization in simple domains (Q1689457)
- A primal majorized semismooth Newton-CG augmented Lagrangian method for large-scale linearly constrained convex programming (Q1694389)
- Conditional gradient type methods for composite nonlinear and stochastic optimization (Q1717236)
- Management of a hydropower system via convex duality (Q1731594)
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis (Q1734769)
- Universal method for stochastic composite optimization problems (Q1746349)
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\) (Q1752352)
- Flexible low-rank statistical modeling with missing data and side information (Q1799348)
- Rate of convergence analysis of discretization and smoothing algorithms for semiinfinite minimax problems (Q1935267)
- Composite convex optimization with global and local inexact oracles (Q1986105)
- Analysis of biased stochastic gradient descent using sequential semidefinite programs (Q2020610)
- Bounds for the tracking error of first-order online optimization methods (Q2032000)
- General convergence analysis of stochastic first-order methods for composite optimization (Q2032020)
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems (Q2042418)
- A frequency-domain analysis of inexact gradient methods (Q2149575)
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle (Q2159456)
- Inexact first-order primal-dual algorithms (Q2181598)
- Efficient first-order methods for convex minimization: a constructive approach (Q2205976)
- A heuristic adaptive fast gradient method in stochastic optimization problems (Q2207619)
- Augmented Lagrangian optimization under fixed-point arithmetic (Q2208540)
- Accelerated methods for saddle-point problem (Q2214606)
- HT-AWGM: a hierarchical Tucker-adaptive wavelet Galerkin method for high-dimensional elliptic problems (Q2216606)
- Optimization for deep learning: an overview (Q2218095)
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems (Q2220653)
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods (Q2240671)
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point (Q2278192)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123)
- Generalized uniformly optimal methods for nonlinear programming (Q2316202)
- Efficiency of minimizing compositions of convex functions and smooth maps (Q2330660)
- Certification aspects of the fast gradient method for solving the dual of parametric convex programs (Q2392808)
- Distributed optimal coordination for multiple heterogeneous Euler-Lagrangian systems (Q2409345)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems (Q2415906)
- Convergence rates of accelerated proximal gradient algorithms under independent noise (Q2420162)
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization (Q2420797)
- Accelerated gradient boosting (Q2425242)
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization (Q2515032)
- Stochastic intermediate gradient method for convex optimization problems (Q2631196)