The following pages link to Martin Takáč (Q263211):
Displaying 44 items.
- Parallel coordinate descent methods for big data optimization (Q263212)
- On optimal probabilities in stochastic coordinate descent methods (Q315487)
- Projected semi-stochastic gradient descent method with mini-batch scheme under weak strong convexity assumption (Q1695084)
- Matrix completion under interval uncertainty (Q1752160)
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences (Q2044479)
- Alternating maximization: unifying framework for 8 sparse PCA formulations and efficient parallel codes (Q2129204)
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function (Q2452370)
- Distributed Coordinate Descent Method for Learning with Big Data (Q2810888)
- (Q2896264)
- (Q2953662)
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions (Q3462314)
- (Q4366596)
- (Q4558572)
- Distributed optimization with arbitrary local solvers (Q4594835)
- A low-rank coordinate-descent algorithm for semidefinite programming relaxations of optimal power flow (Q4594836)
- On the complexity of parallel coordinate descent (Q4638927)
- (Q4969198)
- A robust multi-batch L-BFGS method for machine learning (Q4972551)
- Quasi-Newton methods for machine learning: forget the past, just sample (Q5058389)
- Randomized sketch descent methods for non-separable linearly constrained optimization (Q5077024)
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory (Q5112239)
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design (Q5176277)
- New Convergence Aspects of Stochastic Gradient Algorithms (Q5214284)
- An accelerated communication-efficient primal-dual optimization framework for structured machine learning (Q5859008)
- Inexact SARAH algorithm for stochastic optimization (Q5859016)
- Preconditioning meets biased compression for efficient distributed optimization (Q6149587)
- Decentralized personalized federated learning: lower bounds and optimal algorithm for all personalization modes (Q6170035)
- Random-reshuffled SARAH does not need full gradient computations (Q6204201)
- Hybrid Methods in Solving Alternating-Current Optimal Power Flows (Q6266208)
- A Class of Parallel Doubly Stochastic Algorithms for Large-Scale Learning (Q6274669)
- A Coordinate-Descent Algorithm for Tracking Solutions in Time-Varying Optimal Power Flows (Q6292778)
- Dual Free Adaptive Minibatch SDCA for Empirical Risk Minimization (Q6296918)
- Entropy Penalized Semidefinite Programming (Q6297655)
- Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample (Q6313197)
- Inexact Tensor Methods and Their Application to Stochastic Convex Optimization (Q6357301)
- The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems (Q6389505)
- FLECS: A Federated Learning Second-Order Framework via Compression and Sketching (Q6401130)
- Effects of momentum scaling for SGD (Q6414661)
- A Damped Newton Method Achieves Global $O\left(\frac{1}{k^2}\right)$ and Local Quadratic Convergence Rate (Q6415719)
- Cubic Regularization is the Key! The First Accelerated Quasi-Newton Method with a Global Convergence Rate of $O(k^{-2})$ for Convex Functions (Q6426003)
- Hybrid Methods in Polynomial Optimisation (Q6437987)
- Convergence analysis of stochastic gradient descent with adaptive preconditioning for non-convex and convex functions (Q6448822)
- Advancing the lower bounds: An accelerated, stochastic, second-order method with optimal adaptation to inexactness (Q6449800)
- Stochastic Gradient Descent with Preconditioned Polyak Step-size (Q6453637)