DOI: 10.1007/s10107-012-0614-z
zbMATH: 1301.65051
arXiv: 1107.2848
OpenAlex: W2117686388
Wikidata: Q56481383 (Scholia: Q56481383)
MaRDI QID: Q2452370
Martin Takáč, Peter Richtárik
Publication date: 2 June 2014
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1107.2848

Related Items:
Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes,
Block coordinate type methods for optimization and learning,
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent,
A Fast Block Coordinate Descent Method for Solving Linear Least-Squares Problems,
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization,
On the Convergence of Stochastic Primal-Dual Hybrid Gradient,
A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure,
slimTrain---A Stochastic Approximation Method for Training Separable Deep Neural Networks,
Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications,
Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists,
Block-cyclic stochastic coordinate descent for deep neural networks,
Non-smooth setting of stochastic decentralized convex optimization problem over time-varying graphs,
Adaptive coordinate sampling for stochastic primal–dual optimization,
Block Policy Mirror Descent,
Stochastic mirror descent method for linear ill-posed problems in Banach spaces,
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization,
On Synchronous, Asynchronous, and Randomized Best-Response Schemes for Stochastic Nash Games,
Importance sampling in signal processing applications,
On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization,
Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory,
Random Coordinate Descent Methods for Nonseparable Composite Optimization,
Local linear convergence of proximal coordinate descent algorithm,
A Sampling Kaczmarz--Motzkin Algorithm for Linear Feasibility,
Rows versus Columns: Randomized Kaczmarz or Gauss--Seidel for Ridge Regression,
A Multilevel Proximal Gradient Algorithm for a Class of Composite Optimization Problems,
Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems,
Asynchronous Schemes for Stochastic and Misspecified Potential Games and Nonconvex Optimization,
Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model,
On the complexity of parallel coordinate descent,
A Method with Convergence Rates for Optimization Problems with Variational Inequality Constraints,
A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions,
Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version,
Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications,
Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization,
Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm,
Randomized Gradient Boosting Machine,
Coordinatewise Descent Methods for Leading Eigenvalue Problem,
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration,
Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization,
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone,
On Adaptive Sketch-and-Project for Solving Linear Systems,
Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping,
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization,
Direct Search Based on Probabilistic Descent,
Convergence Analysis of Inexact Randomized Iterative Methods,
On solving the densest \(k\)-subgraph problem on large graphs,
Unnamed Item,
A generic coordinate descent solver for non-smooth convex optimisation,
Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems,
Signal Decomposition Using Masked Proximal Operators,
Proximal Gradient Methods with Adaptive Subspace Sampling,
Control analysis and design via randomised coordinate polynomial minimisation,
Parallel block coordinate minimization with application to group regularized regression,
Greedy randomized and maximal weighted residual Kaczmarz methods with oblique projection,
Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems,
Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis,
Accelerated, Parallel, and Proximal Coordinate Descent,
The Cyclic Block Conditional Gradient Method for Convex Optimization Problems,
Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems,
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization,
Inexact coordinate descent: complexity and preconditioning,
A flexible coordinate descent method,
On optimal probabilities in stochastic coordinate descent methods,
Convergence Properties of the Randomized Extended Gauss--Seidel and Kaczmarz Methods,
Stochastic optimization using a trust-region method and random models,
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions,
Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee,
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds,
Non-convex low-rank representation combined with rank-one matrix sum for subspace clustering,
Practical inexact proximal quasi-Newton method with global complexity analysis,
Random block coordinate descent methods for linearly constrained optimization over networks,
Distributed constraint-coupled optimization via primal decomposition over random time-varying graphs,
A globally convergent algorithm for nonconvex optimization based on block coordinate update,
Poisoning finite-horizon Markov decision processes at design time,
Separable approximations and decomposition methods for the augmented Lagrangian,
Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization,
Multi-label optimal margin distribution machine,
A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications,
Adaptive block coordinate DIRECT algorithm,
Block coordinate descent algorithms for large-scale sparse multiclass classification,
Accelerating block coordinate descent methods with identification strategies,
A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression,
Subgradient methods for huge-scale optimization problems,
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties,
Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization,
On the convergence of asynchronous parallel iteration with unbounded delays,
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming,
First-order methods almost always avoid strict saddle points,
On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems,
Synchronous parallel block coordinate descent method for nonsmooth convex function minimization,
Greedy and randomized versions of the multiplicative Schwarz method,
On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes,
Avoiding Communication in Primal and Dual Block Coordinate Descent Methods,
Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights,
Linear convergence of the randomized sparse Kaczmarz method,
Stochastic block-coordinate gradient projection algorithms for submodular maximization,
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version,
Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization,
Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs,
A Randomized Coordinate Descent Method with Volume Sampling,
On the complexity analysis of randomized block-coordinate descent methods,
Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping. II: Mean-square and linear convergence,
Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup,
Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates,
Pathwise coordinate optimization for sparse learning: algorithm and theory,
Decomposable norm minimization with proximal-gradient homotopy algorithm,
Multi-label Lagrangian support vector machine with random block coordinate descent method,
Matrix completion under interval uncertainty,
Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization,
On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization,
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice,
An extragradient-based alternating direction method for convex minimization,
Iteration complexity analysis of block coordinate descent methods,
On Motzkin's method for inconsistent linear systems,
Schwarz iterative methods: infinite space splittings,
Point process estimation with Mirror Prox algorithms,
Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization,
Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods,
A block coordinate variable metric forward-backward algorithm,
A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares,
An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems,
Stochastic quasi-gradient methods: variance reduction via Jacobian sketching,
A parallel line search subspace correction method for composite convex optimization,
On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems,
A stochastic subspace approach to gradient-free optimization in high dimensions,
Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences,
Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization,
Fastest rates for stochastic mirror descent methods,
Computational Methods for Solving Nonconvex Block-Separable Constrained Quadratic Problems,
Restarting the accelerated coordinate descent method with a rough strong convexity estimate,
A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction,
A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization,
Iterative Proportional Scaling Revisited: A Modern Optimization Perspective,
Gauss-Seidel method with oblique direction,
Variant of greedy randomized Kaczmarz for ridge regression,
Asynchronous networked aggregative games,
Coordinate descent with arbitrary sampling I: algorithms and complexity,
Coordinate descent with arbitrary sampling II: expected separable overapproximation,
Randomized primal-dual proximal block coordinate updates,
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent,
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems,
Randomized and fault-tolerant method of subspace corrections,
On relaxed greedy randomized iterative methods for the solution of factorized linear systems,
Cyclic coordinate descent in the Hölder smooth setting,
Convergence rate of block-coordinate maximization Burer-Monteiro method for solving large SDPs,
Stochastic cluster embedding,
Convergence analysis for Kaczmarz-type methods in a Hilbert space framework,
Coordinate descent algorithms,
Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity,
A second-order method for strongly convex \(\ell_1\)-regularization problems,
Parallel coordinate descent methods for big data optimization