Relatively Smooth Convex Optimization by First-Order Methods, and Applications

DOI: 10.1137/16M1099546
zbMath: 1392.90090
arXiv: 1610.05708
OpenAlex: W2535496140
MaRDI QID: Q4603043

Robert M. Freund, Haihao Lu, Yu. E. Nesterov

Publication date: 14 February 2018

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1610.05708



Related Items

Block Bregman Majorization Minimization with Extrapolation
Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
Tensor methods for finding approximate stationary points of convex functions
Bregman proximal gradient algorithms for deep matrix factorization
Inexact basic tensor methods for some classes of convex optimization problems
Gradient methods with memory
Exact gradient methods with memory
Revisiting linearized Bregman iterations under Lipschitz-like convexity condition
Optimal complexity and certification of Bregman first-order methods
Global convergence of model function based Bregman proximal minimization algorithms
Extragradient and extrapolation methods with generalized Bregman distances for saddle point problems
Distributed Optimization Based on Gradient Tracking Revisited: Enhancing Convergence Rate via Surrogation
Proximal gradient algorithms under local Lipschitz gradient continuity. A convergence and robustness analysis of PANOC
Screening for a reweighted penalized conditional gradient method
Finitely determined functions
Continuous-Time Convergence Rates in Potential and Monotone Games
Superfast second-order methods for unconstrained convex optimization
On the linear convergence of a Bregman proximal point algorithm
Bregman Proximal Point Algorithm Revisited: A New Inexact Version and Its Inertial Variant
Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
Convergence of the exponentiated gradient method with Armijo line search
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Affine Invariant Convergence Rates of the Conditional Gradient Method
Dualities for Non-Euclidean Smoothness and Strong Convexity under the Light of Generalized Conjugacy
Numerical methods for some classes of variational inequalities with relatively strongly monotone operators
A stochastic variance reduction algorithm with Bregman distances for structured composite problems
Stochastic composition optimization of functions without Lipschitz continuous gradient
Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
No-regret algorithms in on-line learning, games and convex optimization
No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
Super-Universal Regularized Newton Method
Affine-invariant contracting-point methods for convex optimization
Perturbed Fenchel duality and first-order methods
Convergence rate analysis of the multiplicative gradient method for PET-type problems
An alternating structure-adapted Bregman proximal gradient descent algorithm for constrained nonconvex nonsmooth optimization problems and its inertial variant
Provable Phase Retrieval with Mirror Descent
Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
First-order methods for convex optimization
Adaptive Third-Order Methods for Composite Convex Optimization
Data-Driven Mirror Descent with Input-Convex Neural Networks
Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Inexact accelerated high-order proximal-point methods
Convergence Analysis for Bregman Iterations in Minimizing a Class of Landau Free Energy Functionals
A simple nearly optimal restart scheme for speeding up first-order methods
Golden ratio algorithms for variational inequalities
Bregman three-operator splitting methods
Contracting Proximal Methods for Smooth Convex Optimization
Implementable tensor methods in unconstrained convex optimization
Nonlinear Forward-Backward Splitting with Projection Correction
Point process estimation with Mirror Prox algorithms
New characterizations of Hoffman constants for systems of linear constraints
Generalized stochastic Frank-Wolfe algorithm with stochastic ``substitute'' gradient for structured convex optimization
Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
Algorithms for nonnegative matrix factorization with the Kullback-Leibler divergence
Quartic first-order methods for low-rank minimization
The condition number of a function relative to a set
Sharpness, Restart, and Acceleration
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
A Laplacian approach to \(\ell_1\)-norm minimization
Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
Bregman proximal mappings and Bregman-Moreau envelopes under relative prox-regularity
An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
Cyclic coordinate descent in the Hölder smooth setting
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
A dual Bregman proximal gradient method for relatively-strongly convex optimization
Dual Space Preconditioning for Gradient Descent
On inexact solution of auxiliary problems in tensor methods for convex optimization
Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
Inexact model: a framework for optimization and variational inequalities
Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
An Optimal High-Order Tensor Method for Convex Optimization


Uses Software


Cites Work