Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization

From MaRDI portal

Publication:1016411

DOI: 10.1007/S10957-008-9458-3 · zbMath: 1190.90279 · OpenAlex: W2072265605 · MaRDI QID: Q1016411

Authors: Sangwoon Yun, Paul Tseng

Publication date: 5 May 2009

Published in: Journal of Optimization Theory and Applications

Full work available at URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.476.7692
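
For orientation only, the sketch below illustrates the general idea named in the title: block-coordinate gradient steps applied to a smooth loss plus a separable nonsmooth term, here on a toy unconstrained l1-regularized least-squares instance. The problem data (Q, c, lam), the cyclic block rule, and the fixed 1/L stepsize are assumptions of this sketch, not the paper's algorithm, which additionally handles linear constraints.

```python
import numpy as np

def soft_threshold(z, t):
    """Entrywise soft-thresholding: the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def block_coordinate_prox_gradient(Q, c, lam, block_size=2, max_iter=500):
    """Cyclic block-coordinate proximal gradient steps for
    0.5 * ||Q x - c||^2 + lam * ||x||_1  (toy, unconstrained instance)."""
    n = Q.shape[1]
    x = np.zeros(n)
    # 1/L stepsize, with L the Lipschitz constant of the gradient of the smooth part
    step = 1.0 / np.linalg.norm(Q, 2) ** 2
    blocks = [list(range(i, min(i + block_size, n))) for i in range(0, n, block_size)]
    for _ in range(max_iter):
        for idx in blocks:
            grad_blk = Q[:, idx].T @ (Q @ x - c)   # partial gradient on this block
            # gradient step on the block, followed by the block's proximal map
            x[idx] = soft_threshold(x[idx] - step * grad_blk, step * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((20, 10))
    c = rng.standard_normal(20)
    print(block_coordinate_prox_gradient(Q, c, lam=0.1))
```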




Related Items (35)

Effects of depth, width, and initialization: A convergence analysis of layer-wise training for deep linear neural networks
Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
A flexible coordinate descent method
Model Selection for Cox Models with Time-Varying Coefficients
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
Monotonicity and market equilibrium
Random block coordinate descent methods for linearly constrained optimization over networks
Approximation accuracy, gradient methods, and error bound for structured convex optimization
An alternating maximization method for approximating the hump of the matrix exponential
On Stochastic and Deterministic Quasi-Newton Methods for Nonstrongly Convex Optimization: Asymptotic Convergence and Rate Analysis
Newton-MR: inexact Newton method with minimum residual sub-problem solver
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Local linear convergence of proximal coordinate descent algorithm
Hybrid Jacobian and Gauss--Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
Unnamed Item
Blocks of coordinates, stochastic programming, and markets
Stochastic block-coordinate gradient projection algorithms for submodular maximization
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
On the complexity analysis of randomized block-coordinate descent methods
A pseudo-heuristic parameter selection rule for \(l^1\)-regularized minimization problems
RSG: Beating Subgradient Method without Smoothness and Strong Convexity
Iteration complexity analysis of block coordinate descent methods
An introduction to continuous optimization for imaging
Block Coordinate Descent Methods for Semidefinite Programming
A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
A COMPRESSED SENSING FRAMEWORK OF FREQUENCY-SPARSE SIGNALS THROUGH CHAOTIC SYSTEM
A generic coordinate descent solver for non-smooth convex optimisation
Bilateral exchange and competitive equilibrium
On the convergence of inexact block coordinate descent methods for constrained optimization


Uses Software



Cites Work




This page was built for publication: Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization