Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
Publication: 2910875
DOI: 10.1137/100802001 · zbMath: 1257.90073 · OpenAlex: W2095984592 · MaRDI QID: Q2910875
Author: Yurii Nesterov
Publication date: 12 September 2012
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100802001
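About the paper: it introduces and analyzes randomized coordinate descent methods in which coordinate \(i\) is sampled with probability proportional to \(L_i^\alpha\), where \(L_i\) is the coordinate-wise Lipschitz constant of the corresponding partial gradient, and the chosen coordinate is updated by a gradient step of size \(1/L_i\). The Python sketch below illustrates that sampling-and-update scheme on a toy quadratic; it is a minimal illustration, not code from the paper, and the names (`rcdm`, `grad_i`) and the example problem are assumptions made here.

```python
import numpy as np

def rcdm(grad_i, L, x0, n_iters=5000, alpha=1.0, rng=None):
    """Minimal randomized coordinate descent sketch.

    grad_i(x, i): i-th partial derivative of the smooth objective.
    L[i]:         coordinate-wise Lipschitz constant of that derivative.
    alpha:        coordinates sampled with probability ~ L[i]**alpha
                  (alpha = 0 gives uniform sampling).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    p = L**alpha / np.sum(L**alpha)       # sampling distribution over coordinates
    for _ in range(n_iters):
        i = rng.choice(len(x), p=p)       # draw one coordinate at random
        x[i] -= grad_i(x, i) / L[i]       # 1/L_i gradient step on coordinate i
    return x

# Toy usage: minimize f(x) = 0.5 x^T A x - b^T x, a strongly convex quadratic.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.diag(A).copy()                     # here L_i = A_ii
x = rcdm(lambda x, i: A[i] @ x - b[i], L, np.zeros(2))
# x approaches np.linalg.solve(A, b) = [0.2, 0.4]
```

With alpha = 0 the sampling is uniform; alpha = 1 weights coordinates by their Lipschitz constants.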
Related Items (first 100 shown)
Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems ⋮ Block coordinate type methods for optimization and learning ⋮ Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications ⋮ A unified analysis of variational inequality methods: variance reduction, sampling, quantization, and coordinate descent ⋮ Block-cyclic stochastic coordinate descent for deep neural networks ⋮ Cyclic Coordinate Dual Averaging with Extrapolation ⋮ Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods ⋮ Unified analysis of stochastic gradient methods for composite convex and smooth optimization ⋮ Parameter estimation in a 3‐parameter p‐star random graph model ⋮ Cluster‐based gradient method for stochastic optimal control problems with elliptic partial differential equation constraint ⋮ Adaptive coordinate sampling for stochastic primal–dual optimization ⋮ Block Policy Mirror Descent ⋮ Derivation of coordinate descent algorithms from optimal control theory ⋮ Stochastic mirror descent method for linear ill-posed problems in Banach spaces ⋮ Faster randomized block sparse Kaczmarz by averaging ⋮ Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization ⋮ Conjugate gradients acceleration of coordinate descent for linear systems ⋮ Importance sampling in signal processing applications ⋮ The method of randomized Bregman projections for stochastic feasibility problems ⋮ On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization ⋮ Block mirror stochastic gradient method for stochastic optimization ⋮ Global optimization using random embeddings ⋮ Convergence of gradient algorithms for nonconvex \(C^{1+ \alpha}\) cost functions ⋮ First-order methods for convex optimization ⋮ Robust supervised learning with coordinate gradient descent ⋮ Random Coordinate Descent Methods for Nonseparable Composite Optimization ⋮ Local linear convergence of proximal coordinate descent algorithm ⋮ A bicomposition of conical projections ⋮ Unifying framework for accelerated randomized methods in convex optimization ⋮ Rows versus Columns: Randomized Kaczmarz or Gauss–Seidel for Ridge Regression ⋮ Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems ⋮ Adaptive Catalyst for Smooth Convex Optimization ⋮ An introduction to continuous optimization for imaging ⋮ An acceleration procedure for optimal first-order methods ⋮ Laplacian-based semi-supervised learning in multilayer hypergraphs by coordinate descent ⋮ Coordinate descent methods beyond smoothness and separability ⋮ Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization ⋮ Worst case complexity of direct search under convexity ⋮ Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm ⋮ An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration ⋮ Stochastic subspace correction methods and fault tolerance ⋮ Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme ⋮ An alternating minimization method for robust principal component analysis ⋮ Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone ⋮ Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping ⋮ Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization ⋮ Direct Search Based on Probabilistic Descent ⋮ 
The Supporting Halfspace–Quadratic Programming Strategy for the Dual of the Best Approximation Problem ⋮ On the Convergence of Projected-Gradient Methods with Low-Rank Projections for Smooth Convex Minimization over Trace-Norm Balls and Related Problems ⋮ On Solving Large-Scale Polynomial Convex Problems by Randomized First-Order Algorithms ⋮ Convergence Analysis of Inexact Randomized Iterative Methods ⋮ On solving the densest \(k\)-subgraph problem on large graphs ⋮ A generic coordinate descent solver for non-smooth convex optimisation ⋮ Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity ⋮ Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems ⋮ Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization ⋮ On the optimal order of worst case complexity of direct search ⋮ An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems ⋮ Solving norm constrained portfolio optimization via coordinate-wise descent algorithms ⋮ On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems ⋮ Stochastic accelerated alternating direction method of multipliers with importance sampling ⋮ On obtaining sparse semantic solutions for inverse problems, control, and neural network training ⋮ Using neural networks to accelerate the solution of the Boltzmann equation ⋮ Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems ⋮ Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis ⋮ An accelerated coordinate gradient descent algorithm for non-separable composite optimization ⋮ Oracle complexity separation in convex optimization ⋮ Inexact coordinate descent: complexity and preconditioning ⋮ A flexible coordinate descent method ⋮ On optimal probabilities in stochastic coordinate descent methods ⋮ Phase-only transmit beampattern design for large phased array antennas with multi-point nulling ⋮ Empirical likelihood confidence tubes for functional parameters in plug-in estimation ⋮ Sampling Kaczmarz-Motzkin method for linear feasibility problems: generalization and acceleration ⋮ Extended randomized Kaczmarz method for sparse least squares and impulsive noise problems ⋮ Exact worst-case convergence rates of the proximal gradient method for composite convex minimization ⋮ Block layer decomposition schemes for training deep neural networks ⋮ Accelerated sampling Kaczmarz Motzkin algorithm for the linear feasibility problem ⋮ Accelerated parallel and distributed algorithm using limited internal memory for nonnegative matrix factorization ⋮ On N. Z. Shor's three scientific ideas ⋮ A globally convergent algorithm for nonconvex optimization based on block coordinate update ⋮ iPiasco: inertial proximal algorithm for strongly convex optimization ⋮ Subgradient methods for huge-scale optimization problems ⋮ Randomness and permutations in coordinate descent methods ⋮ On relaxed greedy randomized coordinate descent methods for solving large linear least-squares problems ⋮ Subspace correction methods in algebraic multi-level frames ⋮ Duality and nonlinear graph Laplacians ⋮ Synchronous parallel block coordinate descent method for nonsmooth convex function minimization ⋮ On the relation between the randomized extended Kaczmarz algorithm and coordinate descent ⋮ Lower bounds for finding stationary points I ⋮ Efficient first-order methods for convex minimization: a constructive approach ⋮ Emergence of price-taking behavior ⋮ On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes ⋮ Generalization of a result of Fabian on the asymptotic normality of stochastic approximation ⋮ Linear convergence of the randomized sparse Kaczmarz method ⋮ A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints ⋮ Optimization for deep learning: an overview ⋮ The 2-coordinate descent method for solving double-sided simplex constrained minimization problems ⋮ Primal-dual block-proximal splitting for a class of non-convex problems ⋮ Blocks of coordinates, stochastic programming, and markets ⋮ Stochastic block-coordinate gradient projection algorithms for submodular maximization