Fast first-order methods for composite convex optimization with backtracking
From MaRDI portal
Publication: 404292
DOI: 10.1007/s10208-014-9189-9
zbMath: 1304.90161
OpenAlex: W1997616626
MaRDI QID: Q404292
Xi Bai, Katya Scheinberg, Donald Goldfarb
Publication date: 4 September 2014
Published in: Foundations of Computational Mathematics
Full work available at URL: https://doi.org/10.1007/s10208-014-9189-9
Related Items
- Additive Schwarz methods for convex optimization with backtracking
- A scaled and adaptive FISTA algorithm for signal-dependent sparse image super-resolution problems
- Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization
- An accelerated forward-backward algorithm with a new linesearch for convex minimization problems and its applications
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Exact gradient methods with memory
- Accelerated Bregman operator splitting with backtracking
- Fast alternating linearization methods for minimizing the sum of two convex functions
- Convergence rate of inertial proximal algorithms with general extrapolation and proximal coefficients
- A variable metric and Nesterov extrapolated proximal DCA with backtracking for a composite DC program
- Accelerated linearized Bregman method
- An abstract convergence framework with application to inertial inexact forward-backward methods
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Proximal extrapolated gradient methods for variational inequalities
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Inexact proximal Newton methods for self-concordant functions
- On the interplay between acceleration and identification for the proximal gradient algorithm
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Network classification with applications to brain connectomics
- Backtracking Strategies for Accelerated Descent Methods with Smooth Composite Objectives
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- Scaled, Inexact, and Adaptive Generalized FISTA for Strongly Convex Optimization
- Mirror Prox algorithm for multi-term composite minimization and semi-separable problems
- An Optimal High-Order Tensor Method for Convex Optimization
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Fast alternating linearization methods for minimizing the sum of two convex functions
- Introductory lectures on convex optimization. A basic course.
- Model Selection and Estimation in Regression with Grouped Variables