Gradient methods for minimizing composite functions
From MaRDI portal
Publication: 359630
DOI: 10.1007/s10107-012-0629-5 · zbMATH Open: 1287.90067 · OpenAlex: W2030161963 · MaRDI QID: Q359630
Authors: Yuri Nesterov
Publication date: 12 August 2013
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-012-0629-5
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Fractional programming (90C32)
- Nonlinear programming (90C30)
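The paper this record describes studies gradient methods for minimizing composite objectives of the form f(x) + Ψ(x), with f smooth and Ψ simple but possibly nonsmooth. As a minimal sketch (not the paper's full method — the function names and the lasso test problem below are illustrative assumptions), one iteration of the basic composite gradient step with Ψ = λ‖·‖₁ reduces to soft thresholding:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def composite_gradient_step(x, grad_f, L, lam):
    # One composite gradient step for min_x f(x) + lam*||x||_1,
    # where grad_f is L-Lipschitz:  x+ = prox_{(lam/L)||.||_1}(x - grad_f(x)/L).
    return soft_threshold(x - grad_f(x) / L, lam / L)

# Toy problem (assumed for illustration): f(x) = 0.5*||A x - b||^2 (lasso).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2          # sigma_max(A)^2, Lipschitz constant of grad f
lam = 0.1
x = np.zeros(5)
for _ in range(500):
    x = composite_gradient_step(x, lambda y: A.T @ (A @ y - b), L, lam)
print(np.round(x, 2))
```

The iterate converges to a sparse approximation of the planted coefficient vector; the paper's accelerated variant attains a faster O(1/k²) rate on the same problem class.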
Cites Work
- Title not available
- Atomic Decomposition by Basis Pursuit
- Title not available
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Just relax: convex programming methods for identifying sparse signals in noise
- Title not available
- Title not available
- A generalized proximal point algorithm for certain non-convex minimization problems
- Rounding of convex sets and efficient gradient methods for linear programming problems
- Linear Inversion of Band-Limited Reflection Seismograms
- Accelerating the cubic regularization of Newton's method on convex problems
Cited In (only showing first 100 items)
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- Stochastic multilevel composition optimization algorithms with level-independent convergence rates
- An extrapolated iteratively reweighted \(\ell_1\) method with complexity analysis
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Generalized self-concordant functions: a recipe for Newton-type methods
- Generalized conditional gradient for sparse estimation
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Majorization-minimization generalized Krylov subspace methods for \({\ell _p}\)-\({\ell _q}\) optimization applied to image restoration
- On the convergence of the forward-backward splitting method with linesearches
- A level-set method for convex optimization with a feasible solution path
- A proximal difference-of-convex algorithm with extrapolation
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- Nesterov's smoothing technique and minimizing differences of convex functions for hierarchical clustering
- Decomposition in derivative-free optimization
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Stochastic model-based minimization of weakly convex functions
- Point process estimation with Mirror Prox algorithms
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
- Accelerated regularized Newton methods for minimizing composite convex functions
- Sharpness, restart, and acceleration
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Accelerated stochastic algorithms for convex-concave saddle-point problems
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Gradient methods for problems with inexact model of the objective
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- The condition number of a function relative to a set
- Iteratively reweighted \(\ell _1\) algorithms with extrapolation
- Efficiency of minimizing compositions of convex functions and smooth maps
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Inexact primal-dual gradient projection methods for nonlinear optimization on convex set
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Accelerated differential inclusion for convex optimization
- Optimal Transport Approximation of 2-Dimensional Measures
- Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm
- Complexity bounds for primal-dual methods minimizing the model of objective function
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- A single-phase, proximal path-following framework
- Iterative positive thresholding algorithm for non-negative sparse optimization
- The landscape of empirical risk for nonconvex losses
- MAGMA: multilevel accelerated gradient mirror descent algorithm for large-scale convex composite minimization
- On the quality of first-order approximation of functions with Hölder continuous gradient
- Stochastic proximal splitting algorithm for composite minimization
- A modified strictly contractive Peaceman-Rachford splitting method for multi-block separable convex programming
- Accelerated first-order methods for hyperbolic programming
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Randomized block proximal damped Newton method for composite self-concordant minimization
- Title not available
- Learnable descent algorithm for nonsmooth nonconvex image reconstruction
- A relaxed parameter condition for the primal-dual hybrid gradient method for saddle-point problem
- Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics
- Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems
- Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping
- Scaled, inexact, and adaptive generalized FISTA for strongly convex optimization
- Inexact model: a framework for optimization and variational inequalities
- On stochastic accelerated gradient with convergence rate
- Contracting proximal methods for smooth convex optimization
- Accelerated proximal envelopes: application to componentwise methods
- Nesterov perturbations and projection methods applied to IMRT
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- Additive Schwarz methods for convex optimization as gradient methods
- Fast inertial dynamic algorithm with smoothing method for nonsmooth convex optimization
- Title not available
- Additive Schwarz methods for convex optimization with backtracking
- An accelerated communication-efficient primal-dual optimization framework for structured machine learning
- Complex-Valued Imaging with Total Variation Regularization: An Application to Full-Waveform Inversion in Visco-acoustic Media
- An alternating direction method of multipliers with a worst-case \(O(1/n^2)\) convergence rate
- A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm
- Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization
- Title not available
- Composite convex minimization involving self-concordant-like cost functions
- A high-dimensional M-estimator framework for bi-level variable selection
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Title not available
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Proximal methods avoid active strict saddles of weakly convex functions
- Efficient learning with a family of nonconvex regularizers by redistributing nonconvexity
- Variational image regularization with Euler's elastica using a discrete gradient scheme
- An average curvature accelerated composite gradient method for nonconvex smooth composite optimization problems
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems
- Optimal combination of tensor optimization methods
- Gradient method for optimization on Riemannian manifolds with lower bounded curvature
- Accelerated and unaccelerated stochastic gradient descent in model generality
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity separation in convex optimization
- A scalable estimator of sets of integral operators
- A dual approach for optimal algorithms in distributed optimization over networks
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- On the linear convergence rates of exchange and continuous methods for total variation minimization
This page was built for publication: Gradient methods for minimizing composite functions