A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
Publication: 5131958
DOI: 10.1137/19M1286025
zbMATH: 1453.90121
arXiv: 1811.02427
MaRDI QID: Q5131958
Bo Jiang, Shu-Zhong Zhang, Tian-Yi Lin
Publication date: 9 November 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1811.02427
Mathematics Subject Classification:
- Convex programming (90C25)
- Abstract computational complexity for mathematical programming problems (90C60)
- Approximation methods and heuristics in mathematical programming (90C59)
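
Background note (a sketch for orientation, not part of the original record): composite convex optimization, as named in the title, conventionally means minimizing F(x) = f(x) + \psi(x) with f convex and smooth and \psi convex but possibly nonsmooth. A p-th order tensor step in the sense of the cited work "Implementable tensor methods in unconstrained convex optimization" minimizes a regularized Taylor model; the notation below follows common usage and is an assumption, not a quotation from the paper:

\[
x_{k+1} \in \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \; \sum_{i=0}^{p} \frac{1}{i!}\, D^i f(x_k)[x - x_k]^i \;+\; \frac{M}{(p+1)!}\,\|x - x_k\|^{p+1} \;+\; \psi(x),
\]

where D^i f(x_k) denotes the i-th derivative tensor of f at x_k and M > 0 is a regularization constant that adaptive schemes of this kind tune at run time.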
Related Items
- Cubic regularized Newton method for the saddle point models: a global and local convergence analysis
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Adaptive Third-Order Methods for Composite Convex Optimization
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- A control-theoretic perspective on optimal high-order optimization
- An Optimal High-Order Tensor Method for Convex Optimization
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An adaptive accelerated first-order method for convex optimization
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Gradient methods for minimizing composite functions
- Fast first-order methods for composite convex optimization with backtracking
- On cones of nonnegative quartic forms
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- NP-hardness of deciding convexity of quartic polynomials and related problems
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Linear and nonlinear programming
- Cubic regularization of Newton method and its global performance
- Evaluation Complexity for Nonlinear Constrained Optimization Using Unscaled KKT Conditions and High-Order Models
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- Proximal Newton-Type Methods for Minimizing Composite Functions
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- An Optimal High-Order Tensor Method for Convex Optimization
- An investigation of Newton-Sketch and subsampled Newton methods
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Backtracking Strategies for Accelerated Descent Methods with Smooth Composite Objectives
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- Exact and inexact subsampled Newton methods for optimization
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization