Gradient methods for minimizing composite functions
From MaRDI portal
Publication: Q359630
DOI: 10.1007/s10107-012-0629-5
zbMATH Open: 1287.90067
OpenAlex: W2030161963
MaRDI QID: Q359630
Authors: Yuri Nesterov
Publication date: 12 August 2013
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-012-0629-5
MSC Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Fractional programming (90C32)
- Nonlinear programming (90C30)
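The publication concerns first-order methods for composite objectives of the form f(x) + Ψ(x), where f is smooth and Ψ is a simple (possibly nonsmooth) regularizer handled through its proximal operator. As a rough orientation only, the following is a minimal proximal-gradient (ISTA-style) sketch for the ℓ1-regularized least-squares instance; the function names, step-size choice, and problem data below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(A, b, lam, L, iters=500):
    """Basic proximal-gradient iteration for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    with constant step 1/L, where L bounds the largest
    eigenvalue of A^T A (the gradient Lipschitz constant)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # prox step on Psi
    return x

# Tiny synthetic illustration with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.0]) + 0.01 * rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2             # spectral norm squared
x = prox_gradient(A, b, lam=0.1, L=L)
```

With step size 1/L this iteration monotonically decreases the composite objective; the accelerated variants analyzed in the paper improve the sublinear rate of this basic scheme.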
Cites Work
- Title not available
- Atomic Decomposition by Basis Pursuit
- Title not available
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Just relax: convex programming methods for identifying sparse signals in noise
- Title not available
- Title not available
- A generalized proximal point algorithm for certain non-convex minimization problems
- Rounding of convex sets and efficient gradient methods for linear programming problems
- Linear Inversion of Band-Limited Reflection Seismograms
- Accelerating the cubic regularization of Newton's method on convex problems
Cited In (first 100 items shown)
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- Stochastic multilevel composition optimization algorithms with level-independent convergence rates
- An extrapolated iteratively reweighted \(\ell_1\) method with complexity analysis
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Generalized self-concordant functions: a recipe for Newton-type methods
- Generalized conditional gradient for sparse estimation
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Majorization-minimization generalized Krylov subspace methods for \({\ell _p}\)-\({\ell _q}\) optimization applied to image restoration
- On the convergence of the forward-backward splitting method with linesearches
- A level-set method for convex optimization with a feasible solution path
- A proximal difference-of-convex algorithm with extrapolation
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- Nesterov's smoothing technique and minimizing differences of convex functions for hierarchical clustering
- Decomposition in derivative-free optimization
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Stochastic model-based minimization of weakly convex functions
- Point process estimation with Mirror Prox algorithms
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
- Accelerated regularized Newton methods for minimizing composite convex functions
- Sharpness, restart, and acceleration
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Accelerated stochastic algorithms for convex-concave saddle-point problems
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Gradient methods for problems with inexact model of the objective
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- The condition number of a function relative to a set
- Iteratively reweighted \(\ell _1\) algorithms with extrapolation
- Efficiency of minimizing compositions of convex functions and smooth maps
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Inexact primal-dual gradient projection methods for nonlinear optimization on convex set
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Accelerated differential inclusion for convex optimization
- Optimal Transport Approximation of 2-Dimensional Measures
- Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm
- Complexity bounds for primal-dual methods minimizing the model of objective function
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- A single-phase, proximal path-following framework
- Iterative positive thresholding algorithm for non-negative sparse optimization
- The landscape of empirical risk for nonconvex losses
- MAGMA: multilevel accelerated gradient mirror descent algorithm for large-scale convex composite minimization
- On the quality of first-order approximation of functions with Hölder continuous gradient
- Stochastic proximal splitting algorithm for composite minimization
- A modified strictly contractive Peaceman-Rachford splitting method for multi-block separable convex programming
- Accelerated first-order methods for hyperbolic programming
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Randomized block proximal damped Newton method for composite self-concordant minimization
- Title not available
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- Complexity of a quadratic penalty accelerated inexact proximal point method for solving linearly constrained nonconvex composite programs
- Proximal Newton-type methods for minimizing composite functions
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- Title not available
- Block-wise ADMM with a relaxation factor for multiple-block convex programming
- PPA-like contraction methods for convex optimization: a framework using variational inequality approach
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Parallel coordinate descent methods for big data optimization
- Accelerated directional search with non-Euclidean prox-structure
- Block-simultaneous direction method of multipliers: a proximal primal-dual splitting algorithm for nonconvex problems with multiple constraints
- Inexact coordinate descent: complexity and preconditioning
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- Analysis and design of optimization algorithms via integral quadratic constraints
- Linear coupling: an ultimate unification of gradient and mirror descent
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Hierarchical sparse modeling: a choice of two group Lasso formulations
- First-order methods for convex optimization
- General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Proximal alternating penalty algorithms for nonsmooth constrained convex optimization
- Accelerated gradient boosting
- Distributed proximal-gradient method for convex optimization with inequality constraints
- Exact worst-case performance of first-order methods for composite convex optimization
- Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- Matrix completion via max-norm constrained optimization
- OSGA: a fast subgradient algorithm with optimal complexity
- Fast first-order methods for minimizing convex composite functions
- Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- A Trust-region Method for Nonsmooth Nonconvex Optimization
- Universal gradient methods for convex optimization problems
- A dual method for minimizing a nonsmooth objective over one smooth inequality constraint
- Gradient sliding for composite optimization
- Optimized first-order methods for smooth convex minimization
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Efficient first-order methods for convex minimization: a constructive approach
- Subsampled nonmonotone spectral gradient methods
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- On the complexity analysis of randomized block-coordinate descent methods
This page was built for publication: Gradient methods for minimizing composite functions