Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
From MaRDI portal
Publication:1016411
DOI: 10.1007/S10957-008-9458-3
zbMath: 1190.90279
OpenAlex: W2072265605
MaRDI QID: Q1016411
Publication date: 5 May 2009
Published in: Journal of Optimization Theory and Applications
Full work available at URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.476.7692
Keywords: global convergence; support vector machines; complexity bound; bilevel optimization; \(\ell_{1}\)-regularization; linear convergence rate
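The keywords point to coordinate-wise gradient methods for \(\ell_{1}\)-regularized problems. As an illustrative sketch only (not the paper's linearly constrained algorithm), the following shows plain cyclic block-coordinate gradient descent with block size one on \(\ell_{1}\)-regularized least squares, where each one-dimensional subproblem is solved exactly by soft-thresholding; all function names here are made up for the example.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal map of t*|.|: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cd_lasso(A, b, lam, n_sweeps=200):
    """Cyclic coordinate descent for
        min_x 0.5*||A x - b||^2 + lam*||x||_1.
    Each sweep updates every coordinate by exactly minimizing
    the one-dimensional subproblem via soft-thresholding."""
    m, n = A.shape
    x = np.zeros(n)
    r = b.copy()                      # residual b - A @ x
    col_sq = (A ** 2).sum(axis=0)     # per-column Lipschitz constants
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            old = x[j]
            # gradient step in coordinate j, then prox of lam*|.|
            z = old + A[:, j] @ r / col_sq[j]
            x[j] = soft_threshold(z, lam / col_sq[j])
            if x[j] != old:
                r -= A[:, j] * (x[j] - old)
    return x
```

With `lam = 0` this reduces to coordinate-wise least squares; larger `lam` drives coordinates exactly to zero, which is the sparsity mechanism the \(\ell_{1}\) keyword refers to.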
Related Items (35)
- Effects of depth, width, and initialization: A convergence analysis of layer-wise training for deep linear neural networks
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- A flexible coordinate descent method
- Model Selection for Cox Models with Time-Varying Coefficients
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Monotonicity and market equilibrium
- Random block coordinate descent methods for linearly constrained optimization over networks
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- An alternating maximization method for approximating the hump of the matrix exponential
- On Stochastic and Deterministic Quasi-Newton Methods for Nonstrongly Convex Optimization: Asymptotic Convergence and Rate Analysis
- Newton-MR: inexact Newton method with minimum residual sub-problem solver
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Local linear convergence of proximal coordinate descent algorithm
- Hybrid Jacobian and Gauss--Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Blocks of coordinates, stochastic programming, and markets
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- A pseudo-heuristic parameter selection rule for \(l^1\)-regularized minimization problems
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Iteration complexity analysis of block coordinate descent methods
- An introduction to continuous optimization for imaging
- Block Coordinate Descent Methods for Semidefinite Programming
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
- A compressed sensing framework of frequency-sparse signals through chaotic system
- A generic coordinate descent solver for non-smooth convex optimisation
- Bilateral exchange and competitive equilibrium
- On the convergence of inexact block coordinate descent methods for constrained optimization
Uses Software
Cites Work
- Linear time algorithms for some separable quadratic programming problems
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- An O(n) algorithm for quadratic knapsack problems
- A coordinate gradient descent method for nonsmooth separable minimization
- Decomposition algorithm model for singly linearly-constrained problems subject to lower and upper bounds
- A method for minimizing the sum of a convex function and a continuously differentiable function
- A minimization method for the sum of a convex function and a continuously differentiable function
- A successive quadratic programming method for a class of constrained nonsmooth optimization problems
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Polynomial-time decomposition algorithms for support vector machines
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Pathwise coordinate optimization
- On linear-time algorithms for the continuous quadratic Knapsack problem
- Atomic Decomposition by Basis Pursuit
- Exact Regularization of Convex Programs
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The Group Lasso for Logistic Regression
- A generalized proximal point algorithm for certain non-convex minimization problems
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Numerical Optimization
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Ideal spatial adaptation by wavelet shrinkage
- Iterative Solution of Nonlinear Equations in Several Variables
- Learning Theory
- Convex Analysis
- An ϵ-Out-of-Kilter Method for Monotropic Programming
- Convergence of a block coordinate descent method for nondifferentiable minimization