Pages that link to "Item:Q2910875"
From MaRDI portal
The following pages link to Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems (Q2910875):
Displaying 50 items.
- An accelerated directional derivative method for smooth stochastic convex optimization (Q2029381)
- A stochastic homotopy tracking algorithm for parametric systems of nonlinear equations (Q2031864)
- On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems (Q2041935)
- A stochastic subspace approach to gradient-free optimization in high dimensions (Q2044475)
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences (Q2044479)
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization (Q2044495)
- Fastest rates for stochastic mirror descent methods (Q2044496)
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization (Q2046565)
- Sparse group fused Lasso for model segmentation: a hybrid approach (Q2051576)
- An integrated stochastic model and algorithm for constrained multi-item newsvendor problems by two-stage decision-making approach (Q2060285)
- On the convergence of a randomized block coordinate descent algorithm for a matrix least squares problem (Q2060942)
- Asynchronous networked aggregative games (Q2063821)
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems (Q2070400)
- Linear support vector regression with linear constraints (Q2071335)
- Levenberg-Marquardt method based on probabilistic Jacobian models for nonlinear equations (Q2082542)
- Cyclic coordinate descent in the Hölder smooth setting (Q2084003)
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization (Q2089862)
- On the convergence of a block-coordinate incremental gradient method (Q2100401)
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity (Q2115253)
- Accelerated proximal envelopes: application to componentwise methods (Q2116598)
- On the computational efficiency of catalyst accelerated coordinate descent (Q2117631)
- On obtaining sparse semantic solutions for inverse problems, control, and neural network training (Q2132578)
- Using neural networks to accelerate the solution of the Boltzmann equation (Q2132591)
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems (Q2133414)
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis (Q2133415)
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization (Q2139254)
- Oracle complexity separation in convex optimization (Q2139268)
- Phase-only transmit beampattern design for large phased array antennas with multi-point nulling (Q2146523)
- Sampling Kaczmarz-Motzkin method for linear feasibility problems: generalization and acceleration (Q2149567)
- Extended randomized Kaczmarz method for sparse least squares and impulsive noise problems (Q2168919)
- Block layer decomposition schemes for training deep neural networks (Q2173515)
- Accelerated sampling Kaczmarz Motzkin algorithm for the linear feasibility problem (Q2176283)
- Randomness and permutations in coordinate descent methods (Q2189444)
- On relaxed greedy randomized coordinate descent methods for solving large linear least-squares problems (Q2192631)
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization (Q2200102)
- Lower bounds for finding stationary points I (Q2205972)
- Efficient first-order methods for convex minimization: a constructive approach (Q2205976)
- Emergence of price-taking behavior (Q2206009)
- Optimization for deep learning: an overview (Q2218095)
- Primal-dual block-proximal splitting for a class of non-convex problems (Q2218923)
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version (Q2220668)
- Random batch methods (RBM) for interacting particle systems (Q2222655)
- On convergence rate of the randomized Gauss-Seidel method (Q2228499)
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup (Q2235160)
- Asynchronous Lagrangian scenario decomposition (Q2246185)
- Accelerated directional search with non-Euclidean prox-structure (Q2290400)
- Markov chain block coordinate descent (Q2301127)
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128)
- Variant of greedy randomized Kaczmarz for ridge regression (Q2311776)
- Randomized primal-dual proximal block coordinate updates (Q2314059)