Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization
DOI: 10.1109/TSP.2015.2436357
zbMATH Open: 1394.94146
OpenAlex: W1580723439
MaRDI QID: Q4580706
FDO: Q4580706
Authors: Amir Daneshmand, Francisco Facchinei, Vyacheslav Kungurtsev, Gesualdo Scutari
Publication date: 22 August 2018
Published in: IEEE Transactions on Signal Processing
Full work available at URL: https://doi.org/10.1109/tsp.2015.2436357
Mathematics Subject Classification:
- Convex programming (90C25)
- Parallel numerical computation (65Y05)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Nonconvex programming, global optimization (90C26)
Cited In (11)
- Parallel coordinate descent methods for big data optimization
- \(\mathrm{L_1RIP}\)-based robust compressed sensing
- Newton-like method with diagonal correction for distributed optimization
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- A flexible coordinate descent method
- Asynchronous parallel algorithms for nonconvex optimization
- A primal Douglas-Rachford splitting method for the constrained minimization problem in compressive sensing
- Decentralized dictionary learning over time-varying digraphs
- A stochastic averaging gradient algorithm with multi‐step communication for distributed optimization
- Ghost penalties in nonconvex constrained optimization: diminishing stepsizes and iteration complexity
- A framework for parallel second order incremental optimization algorithms for solving partially separable problems