Fast first-order methods for composite convex optimization with backtracking
From MaRDI portal
Recommendations
- Fast first-order methods for minimizing convex composite functions
- Exact worst-case performance of first-order methods for composite convex optimization
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- An adaptive accelerated first-order method for convex optimization
- A first-order splitting method for solving a large-scale composite convex optimization problem
- First-order methods for convex optimization
- Efficient first-order methods for convex minimization: a constructive approach
- Optimized first-order methods for smooth convex minimization
Cites work
- scientific article; zbMATH DE number 845714
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Compressive sampling
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Fast alternating linearization methods for minimizing the sum of two convex functions
- Gradient methods for minimizing composite functions
- Introductory lectures on convex optimization. A basic course.
- Model Selection and Estimation in Regression with Grouped Variables
- Smooth minimization of non-smooth functions
Cited in (30)
- A scaled and adaptive FISTA algorithm for signal-dependent sparse image super-resolution problems
- An optimal high-order tensor method for convex optimization
- Parameter-free accelerated gradient descent for nonconvex minimization
- Scaled, inexact, and adaptive generalized FISTA for strongly convex optimization
- Accelerated linearized Bregman method
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Additive Schwarz methods for convex optimization with backtracking
- Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization
- Network classification with applications to brain connectomics
- On the interplay between acceleration and identification for the proximal gradient algorithm
- Accelerated Bregman operator splitting with backtracking
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- Fast first-order methods for minimizing convex composite functions
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Inexact proximal Newton methods for self-concordant functions
- A variable metric and Nesterov extrapolated proximal DCA with backtracking for a composite DC program
- Parameter-free FISTA by adaptive restart and backtracking
- Complementary composite minimization, small gradients in general norms, and applications
- Fast alternating linearization methods for minimizing the sum of two convex functions
- An abstract convergence framework with application to inertial inexact forward-backward methods
- An accelerated forward-backward algorithm with a new linesearch for convex minimization problems and its applications
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- Mirror Prox algorithm for multi-term composite minimization and semi-separable problems
- Proximal extrapolated gradient methods for variational inequalities
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
- An improved parameterized fast iterative shrinkage-thresholding algorithm with adaptive step size and its applications
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- Convergence rate of inertial proximal algorithms with general extrapolation and proximal coefficients
- Exact gradient methods with memory
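The method named in the title — an accelerated proximal gradient (FISTA-type) iteration that estimates the smooth part's Lipschitz constant by backtracking — can be sketched as below. This is an illustrative implementation of the standard Beck–Teboulle backtracking scheme, not code from the publication itself; all function names and parameter defaults are assumptions.

```python
import numpy as np

def fista_backtracking(f, grad_f, prox_g, x0, L0=1.0, eta=2.0, iters=100):
    """Minimize f(x) + g(x), f smooth convex, g convex with cheap prox.

    Backtracking: at each step, increase the local Lipschitz estimate L
    (by factor eta) until the quadratic upper bound on f holds at the
    candidate point, then take the accelerated (momentum) step.
    """
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    L = L0
    for _ in range(iters):
        gy = grad_f(y)
        while True:
            # Proximal gradient step with step size 1/L.
            x_new = prox_g(y - gy / L, 1.0 / L)
            d = x_new - y
            # Sufficient-decrease test: quadratic model upper-bounds f.
            if f(x_new) <= f(y) + gy @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

As a usage example, for the lasso-type problem min 0.5‖x − b‖² + λ‖x‖₁ the prox of λ‖·‖₁ is soft-thresholding, and the minimizer is the soft-thresholded vector `b`.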
This page was built for publication: Fast first-order methods for composite convex optimization with backtracking
MaRDI item Q404292