Coordinate descent algorithms
Publication: 2349114
DOI: 10.1007/s10107-015-0892-3
zbMath: 1317.49038
arXiv: 1502.04759
OpenAlex: W2000769684
MaRDI QID: Q2349114
Publication date: 19 June 2015
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1502.04759
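For context, the surveyed topic is coordinate descent: minimizing a function by updating one coordinate (or block of coordinates) at a time while the others are held fixed. The sketch below is a minimal, hypothetical illustration of randomized coordinate descent on a smooth quadratic objective; the function name, the quadratic test problem, and the 1/A_ii step size are illustrative assumptions, not the specific algorithms or analysis of the cited publication.

```python
import numpy as np

def randomized_coordinate_descent(A, b, num_iters=5000, seed=0):
    """Minimize f(x) = 0.5*x^T A x - b^T x (A symmetric positive definite)
    by exact minimization along one randomly chosen coordinate per iteration.
    Illustrative sketch only; not the algorithms analyzed in the paper."""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    diag = np.diag(A).copy()          # coordinate curvatures: L_i = A_ii
    for _ in range(num_iters):
        i = rng.integers(n)           # pick a coordinate uniformly at random
        grad_i = A[i] @ x - b[i]      # i-th partial derivative of f at x
        x[i] -= grad_i / diag[i]      # exact line minimization along e_i
    return x

# Small self-check on a random SPD system: the minimizer of f solves A x = b.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
b = rng.standard_normal(5)
x = randomized_coordinate_descent(A, b)
print(np.linalg.norm(A @ x - b))      # residual should be close to zero
```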
Related Items
- An Optimal Scheduled Learning Rate for a Randomized Kaczmarz Algorithm
- Forecasting Multiple Time Series With One-Sided Dynamic Principal Components
- A Fast Block Coordinate Descent Method for Solving Linear Least-Squares Problems
- Exploiting Problem Structure in Derivative Free Optimization
- A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
- Optimal regularizations for data generation with probabilistic graphical models
- slimTrain---A Stochastic Approximation Method for Training Separable Deep Neural Networks
- RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression
- Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Zeroth-order optimization with orthogonal random directions
- A unified analysis of variational inequality methods: variance reduction, sampling, quantization, and coordinate descent
- Sparse random feature maps for the item-multiset kernel
- Weighted Bayesian bootstrap for scalable posterior distributions
- Cyclic Coordinate Dual Averaging with Extrapolation
- Non-smooth setting of stochastic decentralized convex optimization problem over time-varying graphs
- On adaptive block coordinate descent methods for ridge regression
- A General Framework of Nonparametric Feature Selection in High-Dimensional Data
- Unified analysis of stochastic gradient methods for composite convex and smooth optimization
- A variable projection method for large-scale inverse problems with \(\ell^1\) regularization
- Parameter estimation in a 3‐parameter p‐star random graph model
- Management of the curb space allocation in urban transportation system
- Block Policy Mirror Descent
- Derivation of coordinate descent algorithms from optimal control theory
- Risk-averse optimization of reward-based coherent risk measures
- Convergence of Gradient-Based Block Coordinate Descent Algorithms for Nonorthogonal Joint Approximate Diagonalization of Matrices
- Scalable and efficient inference via CPE
- Fast deflation sparse principal component analysis via subspace projections
- Generalized Sparse Bayesian Learning and Application to Image Reconstruction
- On the Efficiency of Random Permutation for ADMM and Coordinate Descent
- Sorted \(L_1/L_2\) minimization for sparse signal recovery
- A novel method for optimizing spectral rotation embedding \(K\)-means with coordinate descent
- Analysis of the Block Coordinate Descent Method for Linear Ill-Posed Problems
- Conjugate gradients acceleration of coordinate descent for linear systems
- Multi-view \(K\)-means clustering algorithm based on redundant and sparse feature learning
- Quantifying the benefits of customized vaccination strategies: A network‐based optimization approach
- Integrated optimization algorithm: a metaheuristic approach for complicated optimization
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Global optimization using random embeddings
- Inducing sparsity via the horseshoe prior in imaging problems
- Reduced-order variational mode decomposition to reveal transient and non-stationary dynamics in fluid flows
- Robust supervised learning with coordinate gradient descent
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Scaled proximal gradient methods for sparse optimization problems
- Prevention in two‐period time and its extension health risk model
- A bicomposition of conical projections
- Partial correlation graphical LASSO
- Utility/privacy trade-off as regularized optimal transport
- GAPs for Shallow Implementation of Quantum Finite Automata
- Hybrid Jacobian and Gauss--Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
- Avoiding Communication in Primal and Dual Block Coordinate Descent Methods
- Blind Nonnegative Source Separation Using Biological Neural Networks
- Adaptive Catalyst for Smooth Convex Optimization
- An investigation of Newton-Sketch and subsampled Newton methods
- Interpolatory Methods for $$\mathcal{H}_{\infty }$$ Model Reduction of Multi-Input/Multi-Output Systems
- On the complexity of parallel coordinate descent
- Decomposition Methods for Computing Directional Stationary Solutions of a Class of Nonsmooth Nonconvex Optimization Problems
- A Distributed Framework for the Construction of Transport Maps
- ADDRESSING IMBALANCED INSURANCE DATA THROUGH ZERO-INFLATED POISSON REGRESSION WITH BOOSTING
- Random Batch Algorithms for Quantum Monte Carlo Simulations
- Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version
- Randomized Gradient Boosting Machine
- RECENT ADVANCES IN DOMAIN DECOMPOSITION METHODS FOR TOTAL VARIATION MINIMIZATION
- CoordinateWise Descent Methods for Leading Eigenvalue Problem
- Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- An alternating minimization method for robust principal component analysis
- Sparse additive machine with ramp loss
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
- Incremental CP Tensor Decomposition by Alternating Minimization Method
- Splitting proximal with penalization schemes for additive convex hierarchical minimization problems
- Regularized Kaczmarz Algorithms for Tensor Recovery
- A generic coordinate descent solver for non-smooth convex optimisation
- Distributed Stochastic Optimization with Large Delays
- Signal Decomposition Using Masked Proximal Operators
- Proximal Gradient Methods with Adaptive Subspace Sampling
- Control analysis and design via randomised coordinate polynomial minimisation
- Proximal Gradient Methods for Machine Learning and Imaging
- A block coordinate descent method for sensor network localization
- Unsupervised learning of pharmacokinetic responses
- Off-diagonal symmetric nonnegative matrix factorization
- Managing randomization in the multi-block alternating direction method of multipliers for quadratic optimization
- Greedy randomized and maximal weighted residual Kaczmarz methods with oblique projection
- On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems
- Weighted rank estimation for nonparametric transformation models with nonignorable missing data
- On obtaining sparse semantic solutions for inverse problems, control, and neural network training
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Inference for time-varying signals using locally stationary processes
- Simultaneous reconstruction and segmentation with the Mumford-Shah functional for electron tomography
- Revealing hidden dynamics from time-series data by ODENet
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Visualizing the effects of a changing distance on data using continuous embeddings
- Block coordinate descent for smooth nonconvex constrained minimization
- Classes of linear programs solvable by coordinate-wise minimization
- Optimal designs for dose-response models with linear effects of covariates
- Block layer decomposition schemes for training deep neural networks
- An approximation method of CP rank for third-order tensor completion
- A unified approach to error bounds for structured convex optimization problems
- On computing the distance to stability for matrices using linear dissipative Hamiltonian systems
- AIR tools II: algebraic iterative reconstruction methods, improved implementation
- Cost-to-travel functions: a new perspective on optimal and model predictive control
- Linear convergence of first order methods for non-strongly convex optimization
- Adaptive block coordinate DIRECT algorithm
- Accelerating block coordinate descent methods with identification strategies
- An almost cyclic 2-coordinate descent method for singly linearly constrained problems
- Convergence rates of accelerated proximal gradient algorithms under independent noise
- Randomness and permutations in coordinate descent methods
- An unexpected connection between Bayes \(A\)-optimal designs and the group Lasso
- Robust multicategory support matrix machines
- Distributed nonconvex constrained optimization over time-varying digraphs
- Near-linear convergence of the random Osborne algorithm for matrix balancing
- Testing and non-linear preconditioning of the proximal point method
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
- A safe reinforced feature screening strategy for Lasso based on feasible solutions
- Bregman Itoh-Abe methods for sparse optimisation
- Asynchronous parallel algorithms for nonconvex optimization
- Efficient first-order methods for convex minimization: a constructive approach
- Optimization of the first Dirichlet Laplacian eigenvalue with respect to a union of balls
- Use of projective coordinate descent in the Fekete problem
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
- Compressive sensing adaptation for polynomial chaos expansions
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Primal-dual block-proximal splitting for a class of non-convex problems
- Developing integer calibration weights for census of agriculture
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- Random batch methods (RBM) for interacting particle systems
- Coercing machine learning to output physically accurate results
- Bregman reweighted alternating minimization and its application to image deblurring
- Column-oriented algebraic iterative methods for nonnegative constrained least squares problems
- On convergence rate of the randomized Gauss-Seidel method
- Shape Analysis of Functional Data
- A Randomized Coordinate Descent Method with Volume Sampling
- A distributed asynchronous method of multipliers for constrained nonconvex optimization
- Optimized packing multidimensional hyperspheres: a unified approach
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- Block coordinate descent energy minimization for dynamic cohesive fracture
- Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex
- Asynchronous Lagrangian scenario decomposition
- Random reordering in SOR-type methods
- A geometric probability randomized Kaczmarz method for large scale linear systems
- Mixed-level column augmented uniform designs
- On shortest Dubins path via a circular boundary
- Low-rank matrix approximation in the infinity norm
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- Acceleration of primal-dual methods by preconditioning and simple subproblem procedures
- Shortest Dubins paths through three points
- Gesture-driven LEGO robots
- Algorithms for positive semidefinite factorization
- Quadratically regularized optimal transport
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Partially distributed outer approximation
- Imputation of clinical covariates in time series
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- Iterative Proportional Scaling Revisited: A Modern Optimization Perspective
- Maximum likelihood estimation for non-minimum-phase noise transfer function with Gaussian mixture noise distribution
- Local search with groups of step sizes
- Non-parametric learning of lifted restricted Boltzmann machines
- Gauss-Seidel method with oblique direction
- Faster algorithms for min-max-min robustness for combinatorial problems with budgeted uncertainty
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Linear support vector regression with linear constraints
- Two-level monotonic multistage recommender systems
- Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
- An inertial parallel and asynchronous forward-backward iteration for distributed convex optimization
- Block-proximal methods with spatially adapted acceleration
- Parallel subgradient algorithm with block dual decomposition for large-scale optimization
- Dualize, split, randomize: toward fast nonsmooth optimization algorithms
- Risk-averse policy optimization via risk-neutral policy optimization
- Curiosities and counterexamples in smooth convex optimization
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
- High-performance statistical computing in the computing environments of the 2020s
- On the convergence of a block-coordinate incremental gradient method
- Performance analysis of greedy algorithms for minimising a maximum mean discrepancy
- Four algorithms to solve symmetric multi-type non-negative matrix tri-factorization problem
- On the rate of convergence of alternating minimization for non-smooth non-strongly convex optimization in Banach spaces
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Sparse inverse covariance estimation with the graphical lasso
- Parallel coordinate descent methods for big data optimization
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- A coordinate gradient descent method for nonsmooth separable minimization
- A randomized Kaczmarz algorithm with exponential convergence
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Convex Analysis
- On search directions for minimization algorithms
- Convergence of a block coordinate descent method for nondifferentiable minimization