Gradient methods for minimizing composite functions
From MaRDI portal
Publication: Q359630
DOI: 10.1007/s10107-012-0629-5 · zbMATH Open: 1287.90067 · OpenAlex: W2030161963 · MaRDI QID: Q359630
Authors: Yuri Nesterov
Publication date: 12 August 2013
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-012-0629-5
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Fractional programming (90C32)
- Nonlinear programming (90C30)
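As context for the title: the paper studies minimization of a composite objective \(\phi(x) = f(x) + \Psi(x)\), where \(f\) is smooth and \(\Psi\) is simple but possibly nonsmooth, via gradient methods built on the proximal step. A minimal one-dimensional sketch of the basic (unaccelerated) proximal-gradient iteration, with an illustrative choice of \(f\) and \(\Psi\) that is not taken from the paper:

```python
# Illustrative sketch of the composite setting phi(x) = f(x) + Psi(x).
# Here f(x) = 0.5*(x - a)^2 (smooth) and Psi(x) = lam*|x| (nonsmooth but
# "simple": its proximal operator is soft-thresholding). All names are
# this example's own, not the paper's notation.

def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def proximal_gradient(a, lam, step=1.0, iters=100):
    """Basic proximal-gradient iteration: x+ = prox_{step*Psi}(x - step*f'(x))."""
    x = 0.0
    for _ in range(iters):
        grad = x - a                      # f'(x) for f(x) = 0.5*(x - a)^2
        x = soft_threshold(x - step * grad, step * lam)
    return x

# For this 1-D problem the minimizer is soft_threshold(a, lam):
print(proximal_gradient(3.0, 1.0))  # converges to 2.0
```

With step size 1 this toy problem converges in one step; the paper's contribution concerns such gradient schemes (and their accelerated variants) for general composite objectives.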
Cites Work
- Atomic Decomposition by Basis Pursuit
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Just relax: convex programming methods for identifying sparse signals in noise
- A generalized proximal point algorithm for certain non-convex minimization problems
- Rounding of convex sets and efficient gradient methods for linear programming problems
- Linear Inversion of Band-Limited Reflection Seismograms
- Accelerating the cubic regularization of Newton's method on convex problems
Cited In (only the first 100 items are shown)
- Learnable descent algorithm for nonsmooth nonconvex image reconstruction
- A relaxed parameter condition for the primal-dual hybrid gradient method for saddle-point problem
- Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics
- Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems
- Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping
- Scaled, inexact, and adaptive generalized FISTA for strongly convex optimization
- Inexact model: a framework for optimization and variational inequalities
- On stochastic accelerated gradient with convergence rate
- Contracting proximal methods for smooth convex optimization
- Accelerated proximal envelopes: application to componentwise methods
- Nesterov perturbations and projection methods applied to IMRT
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- Additive Schwarz methods for convex optimization as gradient methods
- Fast inertial dynamic algorithm with smoothing method for nonsmooth convex optimization
- Additive Schwarz methods for convex optimization with backtracking
- An accelerated communication-efficient primal-dual optimization framework for structured machine learning
- Complex-Valued Imaging with Total Variation Regularization: An Application to Full-Waveform Inversion in Visco-acoustic Media
- An alternating direction method of multipliers with a worst-case \(O(1/n^2)\) convergence rate
- A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm
- Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization
- Composite convex minimization involving self-concordant-like cost functions
- A high-dimensional M-estimator framework for bi-level variable selection
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Proximal methods avoid active strict saddles of weakly convex functions
- Efficient learning with a family of nonconvex regularizers by redistributing nonconvexity
- Variational image regularization with Euler's elastica using a discrete gradient scheme
- An average curvature accelerated composite gradient method for nonconvex smooth composite optimization problems
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems
- Optimal combination of tensor optimization methods
- Gradient method for optimization on Riemannian manifolds with lower bounded curvature
- Accelerated and unaccelerated stochastic gradient descent in model generality
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity separation in convex optimization
- A scalable estimator of sets of integral operators
- A dual approach for optimal algorithms in distributed optimization over networks
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- On the linear convergence rates of exchange and continuous methods for total variation minimization
- Solving large-scale optimization problems with a convergence rate independent of grid size
- Iteration complexity of generalized complementarity problems
- A piecewise conservative method for unconstrained convex optimization
- Solving convex min-min problems with smoothness and strong convexity in one group of variables and low dimension in the other
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Finite element approximation of source term identification with TV-regularization
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- Scheduled restart momentum for accelerated stochastic gradient descent
- Perturbed Fenchel duality and first-order methods
- Mining events with declassified diplomatic documents
- High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- The common-directions method for regularized empirical risk minimization
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
- Multilevel composite stochastic optimization via nested variance reduction
- Strong Convergence of Trajectories via Inertial Dynamics Combining Hessian-Driven Damping and Tikhonov Regularization for General Convex Minimizations
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- Accelerated proximal point method for maximally monotone operators
- Nonregular and minimax estimation of individualized thresholds in high dimension with binary responses
- Accelerated gradient sliding for structured convex optimization
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- Stochastic multilevel composition optimization algorithms with level-independent convergence rates
- An extrapolated iteratively reweighted \(\ell_1\) method with complexity analysis
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Generalized self-concordant functions: a recipe for Newton-type methods
- Generalized conditional gradient for sparse estimation
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Majorization-minimization generalized Krylov subspace methods for \({\ell _p}\)-\({\ell _q}\) optimization applied to image restoration
- On the convergence of the forward-backward splitting method with linesearches
- A level-set method for convex optimization with a feasible solution path
- A proximal difference-of-convex algorithm with extrapolation
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- Nesterov's smoothing technique and minimizing differences of convex functions for hierarchical clustering
- Decomposition in derivative-free optimization
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Stochastic model-based minimization of weakly convex functions
- Point process estimation with Mirror Prox algorithms
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
- Accelerated regularized Newton methods for minimizing composite convex functions
- Sharpness, restart, and acceleration
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Accelerated stochastic algorithms for convex-concave saddle-point problems
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Gradient methods for problems with inexact model of the objective
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- The condition number of a function relative to a set
- Iteratively reweighted \(\ell _1\) algorithms with extrapolation