Pages that link to "Item:Q4580706"
From MaRDI portal
The following pages link to Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization (Q4580706):
Displaying 10 items.
- A flexible coordinate descent method (Q1639710)
- Asynchronous parallel algorithms for nonconvex optimization (Q2205974)
- A primal Douglas-Rachford splitting method for the constrained minimization problem in compressive sensing (Q2405990)
- A framework for parallel second order incremental optimization algorithms for solving partially separable problems (Q2419531)
- A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares (Q2796799)
- Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity (Q5000647)
- Decentralized Dictionary Learning Over Time-Varying Digraphs (Q5214231)
- Newton-like Method with Diagonal Correction for Distributed Optimization (Q5275293)
- \(\mathrm{L_1RIP}\)-based robust compressed sensing (Q6042575)
- A stochastic averaging gradient algorithm with multi-step communication for distributed optimization (Q6054701)