Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
From MaRDI portal
Publication:2910875
DOI: 10.1137/100802001 · zbMath: 1257.90073 · OpenAlex: W2095984592 · MaRDI QID: Q2910875
Author: Yurii Nesterov
Publication date: 12 September 2012
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100802001
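For context, the publication analyzes randomized coordinate descent methods, which minimize a smooth convex function by updating one randomly chosen coordinate per iteration with a step size of one over that coordinate's Lipschitz constant. Below is a minimal illustrative sketch for a convex quadratic, not the paper's implementation: the uniform coordinate sampling corresponds to the \(\alpha = 0\) case of the paper's \(L_i^\alpha\)-proportional sampling, and the test matrix and iteration count are assumptions chosen for the example.

```python
import random

def rcd_quadratic(A, b, iters=2000, seed=0):
    """Randomized coordinate descent for f(x) = 1/2 x^T A x - b^T x,
    with A symmetric positive definite. Coordinate i has Lipschitz
    constant L_i = A[i][i]; each step moves coordinate i by -grad_i / L_i.
    """
    n = len(b)
    rng = random.Random(seed)
    x = [0.0] * n
    for _ in range(iters):
        i = rng.randrange(n)  # uniform coordinate choice (alpha = 0 case)
        # i-th partial derivative: (A x)_i - b_i
        grad_i = sum(A[i][j] * x[j] for j in range(n)) - b[i]
        x[i] -= grad_i / A[i][i]  # step 1/L_i; exact coordinate minimizer here
    return x

# Tiny positive-definite example; the minimizer solves A x = b, i.e. x = (1, 1).
A = [[2.0, 1.0], [1.0, 3.0]]
b = [3.0, 4.0]
x = rcd_quadratic(A, b)
```

For a quadratic, the step `-grad_i / A[i][i]` is exactly the one-dimensional minimizer along coordinate `i`, so each iteration costs one row-vector product rather than a full gradient evaluation, which is the source of the method's advantage on huge-scale problems.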
Related Items (first 100 shown)
Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems ⋮ Block coordinate type methods for optimization and learning ⋮ Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications ⋮ A unified analysis of variational inequality methods: variance reduction, sampling, quantization, and coordinate descent ⋮ Block-cyclic stochastic coordinate descent for deep neural networks ⋮ Cyclic Coordinate Dual Averaging with Extrapolation ⋮ Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods ⋮ Unified analysis of stochastic gradient methods for composite convex and smooth optimization ⋮ Parameter estimation in a 3‐parameter p‐star random graph model ⋮ Cluster‐based gradient method for stochastic optimal control problems with elliptic partial differential equation constraint ⋮ Adaptive coordinate sampling for stochastic primal–dual optimization ⋮ Block Policy Mirror Descent ⋮ Derivation of coordinate descent algorithms from optimal control theory ⋮ Stochastic mirror descent method for linear ill-posed problems in Banach spaces ⋮ Faster randomized block sparse Kaczmarz by averaging ⋮ Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization ⋮ Conjugate gradients acceleration of coordinate descent for linear systems ⋮ Importance sampling in signal processing applications ⋮ The method of randomized Bregman projections for stochastic feasibility problems ⋮ On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization ⋮ Block mirror stochastic gradient method for stochastic optimization ⋮ Global optimization using random embeddings ⋮ Convergence of gradient algorithms for nonconvex \(C^{1+ \alpha}\) cost functions ⋮ First-order methods for convex optimization ⋮ Robust supervised learning with coordinate gradient descent ⋮ Random Coordinate Descent Methods for Nonseparable Composite Optimization ⋮ Local linear convergence of proximal 
coordinate descent algorithm ⋮ A bicomposition of conical projections ⋮ Unifying framework for accelerated randomized methods in convex optimization ⋮ Rows versus Columns: Randomized Kaczmarz or Gauss--Seidel for Ridge Regression ⋮ Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems ⋮ Adaptive Catalyst for Smooth Convex Optimization ⋮ An introduction to continuous optimization for imaging ⋮ An acceleration procedure for optimal first-order methods ⋮ Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization ⋮ Worst case complexity of direct search under convexity ⋮ Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm ⋮ An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration ⋮ Stochastic subspace correction methods and fault tolerance ⋮ Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme ⋮ An alternating minimization method for robust principal component analysis ⋮ Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone ⋮ Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping ⋮ Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization ⋮ Direct Search Based on Probabilistic Descent ⋮ The Supporting Halfspace--Quadratic Programming Strategy for the Dual of the Best Approximation Problem ⋮ On the Convergence of Projected-Gradient Methods with Low-Rank Projections for Smooth Convex Minimization over Trace-Norm Balls and Related Problems ⋮ On Solving Large-Scale Polynomial Convex Problems by Randomized First-Order Algorithms ⋮ Convergence Analysis of Inexact Randomized Iterative Methods ⋮ On solving the densest \(k\)-subgraph problem on large graphs ⋮ A generic coordinate descent solver for non-smooth convex optimisation ⋮ Bregman Finito/MISO for Nonconvex Regularized Finite Sum
Minimization without Lipschitz Gradient Continuity ⋮ Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems ⋮ Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization ⋮ Parallel block coordinate minimization with application to group regularized regression ⋮ A derivative-free affine scaling trust region methods based on probabilistic models with new nonmonotone line search technique for linear inequality constrained minimization without strict complementarity ⋮ Unsupervised learning of pharmacokinetic responses ⋮ On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions ⋮ Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes ⋮ Efficient numerical methods to solve sparse linear equations with application to PageRank ⋮ Accelerated, Parallel, and Proximal Coordinate Descent ⋮ The Cyclic Block Conditional Gradient Method for Convex Optimization Problems ⋮ An accelerated randomized Kaczmarz algorithm ⋮ Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems ⋮ On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent ⋮ An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization ⋮ Convergence Properties of the Randomized Extended Gauss--Seidel and Kaczmarz Methods ⋮ A proximal block minimization method of multipliers with a substitution procedure ⋮ Distributed Block Coordinate Descent for Minimizing Partially Separable Functions ⋮ Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints ⋮ Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds ⋮ Iterative positive thresholding algorithm for non-negative sparse optimization ⋮ Efficient block-coordinate descent 
algorithms for the group Lasso ⋮ Random gradient-free minimization of convex functions ⋮ Random block coordinate descent methods for linearly constrained optimization over networks ⋮ On the Convergence of Stochastic Primal-Dual Hybrid Gradient ⋮ Separable approximations and decomposition methods for the augmented Lagrangian ⋮ New method for solving Ivanov regularization-based support vector machine learning ⋮ Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization ⋮ slimTrain---A Stochastic Approximation Method for Training Separable Deep Neural Networks ⋮ Nearly linear-time packing and covering LP solvers, achieving width-independence and \(O(1/\varepsilon)\)-convergence ⋮ A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications ⋮ Accelerating block coordinate descent methods with identification strategies ⋮ An almost cyclic 2-coordinate descent method for singly linearly constrained problems ⋮ Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties ⋮ Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization ⋮ On the convergence of asynchronous parallel iteration with unbounded delays ⋮ On Synchronous, Asynchronous, and Randomized Best-Response Schemes for Stochastic Nash Games ⋮ On the Efficiency of Random Permutation for ADMM and Coordinate Descent ⋮ Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems ⋮ A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming ⋮ Analysis of the Block Coordinate Descent Method for Linear Ill-Posed Problems ⋮ On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems ⋮ Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory ⋮ Analyzing random permutations for cyclic
coordinate descent ⋮ Convergence analysis of the Fast Subspace Descent method for convex optimization problems ⋮ Batched Stochastic Gradient Descent with Weighted Sampling ⋮ Avoiding Communication in Primal and Dual Block Coordinate Descent Methods