NESUN

From MaRDI portal

Software:40447



swMATH: 28733
MaRDI QID: Q40447


No author found.





Related Items (50)

On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
New results on subgradient methods for strongly convex optimization problems with a unified analysis
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
Zeroth-order methods for noisy Hölder-gradient functions
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
Smoothness parameter of power of Euclidean norm
Accelerated schemes for a class of variational inequalities
Fast gradient methods for uniformly convex and weakly smooth problems
An optimal subgradient algorithm with subspace search for costly convex optimization problems
Empirical risk minimization: probabilistic complexity and stepsize strategy
Optimal subgradient algorithms for large-scale convex optimization in simple domains
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
On the Adaptivity of Stochastic Gradient-Based Optimization
Optimal Affine-Invariant Smooth Minimization Algorithms
Accelerated first-order methods for hyperbolic programming
Stochastic Model-Based Minimization of Weakly Convex Functions
Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
A universal modification of the linear coupling method
Implementable tensor methods in unconstrained convex optimization
Universal gradient methods for convex optimization problems
On the quality of first-order approximation of functions with Hölder continuous gradient
Universal method for stochastic composite optimization problems
Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
An accelerated directional derivative method for smooth stochastic convex optimization
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
Regularized nonlinear acceleration
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
Sharpness, Restart, and Acceleration
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Universal method of searching for equilibria and stochastic equilibria in transportation networks
Optimal subgradient methods: computational properties for large-scale linear inverse problems
A Subgradient Method for Free Material Design
Quasi-convex feasibility problems: subgradient methods and convergence rates
Efficiency of minimizing compositions of convex functions and smooth maps
An adaptive proximal method for variational inequalities
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
The method of codifferential descent for convex and global piecewise affine optimization
A dual approach for optimal algorithms in distributed optimization over networks
Inexact model: a framework for optimization and variational inequalities
Universal intermediate gradient method for convex problems with inexact oracle
Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems


This page was built for software: NESUN