Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems

From MaRDI portal

Publication: 2910875

DOI: 10.1137/100802001 · zbMath: 1257.90073 · OpenAlex: W2095984592 · MaRDI QID: Q2910875

Author: Yurii Nesterov

Publication date: 12 September 2012

Published in: SIAM Journal on Optimization

Full work available at URL: https://doi.org/10.1137/100802001




Related Items (only showing first 100 items)

Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
Block coordinate type methods for optimization and learning
Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
A unified analysis of variational inequality methods: variance reduction, sampling, quantization, and coordinate descent
Block-cyclic stochastic coordinate descent for deep neural networks
Cyclic Coordinate Dual Averaging with Extrapolation
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Unified analysis of stochastic gradient methods for composite convex and smooth optimization
Parameter estimation in a 3-parameter p-star random graph model
Cluster-based gradient method for stochastic optimal control problems with elliptic partial differential equation constraint
Adaptive coordinate sampling for stochastic primal-dual optimization
Block Policy Mirror Descent
Derivation of coordinate descent algorithms from optimal control theory
Stochastic mirror descent method for linear ill-posed problems in Banach spaces
Faster randomized block sparse Kaczmarz by averaging
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Conjugate gradients acceleration of coordinate descent for linear systems
Importance sampling in signal processing applications
The method of randomized Bregman projections for stochastic feasibility problems
On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
Block mirror stochastic gradient method for stochastic optimization
Global optimization using random embeddings
Convergence of gradient algorithms for nonconvex \(C^{1+ \alpha}\) cost functions
First-order methods for convex optimization
Robust supervised learning with coordinate gradient descent
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Local linear convergence of proximal coordinate descent algorithm
A bicomposition of conical projections
Unifying framework for accelerated randomized methods in convex optimization
Rows versus Columns: Randomized Kaczmarz or Gauss--Seidel for Ridge Regression
Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
Adaptive Catalyst for Smooth Convex Optimization
An introduction to continuous optimization for imaging
An acceleration procedure for optimal first-order methods
Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
Worst case complexity of direct search under convexity
Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
Stochastic subspace correction methods and fault tolerance
Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme
An alternating minimization method for robust principal component analysis
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
Direct Search Based on Probabilistic Descent
The Supporting Halfspace--Quadratic Programming Strategy for the Dual of the Best Approximation Problem
On the Convergence of Projected-Gradient Methods with Low-Rank Projections for Smooth Convex Minimization over Trace-Norm Balls and Related Problems
On Solving Large-Scale Polynomial Convex Problems by Randomized First-Order Algorithms
Convergence Analysis of Inexact Randomized Iterative Methods
On solving the densest \(k\)-subgraph problem on large graphs
A generic coordinate descent solver for non-smooth convex optimisation
Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
Parallel block coordinate minimization with application to group regularized regression
A derivative-free affine scaling trust region methods based on probabilistic models with new nonmonotone line search technique for linear inequality constrained minimization without strict complementarity
Unsupervised learning of pharmacokinetic responses
On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes
Efficient numerical methods to solve sparse linear equations with application to PageRank
Accelerated, Parallel, and Proximal Coordinate Descent
The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
An accelerated randomized Kaczmarz algorithm
Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Convergence Properties of the Randomized Extended Gauss--Seidel and Kaczmarz Methods
A proximal block minimization method of multipliers with a substitution procedure
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
Iterative positive thresholding algorithm for non-negative sparse optimization
Efficient block-coordinate descent algorithms for the group Lasso
Random gradient-free minimization of convex functions
Unnamed Item
Random block coordinate descent methods for linearly constrained optimization over networks
On the Convergence of Stochastic Primal-Dual Hybrid Gradient
Separable approximations and decomposition methods for the augmented Lagrangian
New method for solving Ivanov regularization-based support vector machine learning
Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
slimTrain---A Stochastic Approximation Method for Training Separable Deep Neural Networks
Nearly linear-time packing and covering LP solvers, achieving width-independence and \(O(1/\varepsilon)\)-convergence
A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
Accelerating block coordinate descent methods with identification strategies
An almost cyclic 2-coordinate descent method for singly linearly constrained problems
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
On the convergence of asynchronous parallel iteration with unbounded delays
On Synchronous, Asynchronous, and Randomized Best-Response Schemes for Stochastic Nash Games
On the Efficiency of Random Permutation for ADMM and Coordinate Descent
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
Unnamed Item
Analysis of the Block Coordinate Descent Method for Linear Ill-Posed Problems
On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems
Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
Analyzing random permutations for cyclic coordinate descent
Convergence analysis of the Fast Subspace Descent method for convex optimization problems
Batched Stochastic Gradient Descent with Weighted Sampling
Avoiding Communication in Primal and Dual Block Coordinate Descent Methods




This page was built for publication: Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems