Martin Takáč

From MaRDI portal

Person:263211

Available identifiers

zbMath Open: takac.martin
MaRDI QID: Q263211

List of research outcomes

Publication | Date of Publication | Type
Inexact tensor methods and their application to stochastic convex optimization | 2024-08-12 | Paper
Stochastic gradient methods with preconditioned updates | 2024-05-14 | Paper
Random-reshuffled SARAH does not need full gradient computations | 2024-03-27 | Paper
Preconditioning meets biased compression for efficient distributed optimization | 2024-02-06 | Paper
Stochastic Gradient Descent with Preconditioned Polyak Step-size | 2023-10-03 | Paper
Advancing the lower bounds: An accelerated, stochastic, second-order method with optimal adaptation to inexactness | 2023-09-04 | Paper
Convergence analysis of stochastic gradient descent with adaptive preconditioning for non-convex and convex functions | 2023-08-27 | Paper
Decentralized personalized federated learning: lower bounds and optimal algorithm for all personalization modes | 2023-07-12 | Paper
Hybrid Methods in Polynomial Optimisation | 2023-05-25 | Paper
Cubic Regularization is the Key! The First Accelerated Quasi-Newton Method with a Global Convergence Rate of $O(k^{-2})$ for Convex Functions | 2023-02-09 | Paper
Quasi-Newton methods for machine learning: forget the past, just sample | 2022-12-20 | Paper
A Damped Newton Method Achieves Global $O\left(\frac{1}{k^2}\right)$ and Local Quadratic Convergence Rate | 2022-10-31 | Paper
Effects of momentum scaling for SGD | 2022-10-21 | Paper
FLECS: A Federated Learning Second-Order Framework via Compression and Sketching | 2022-06-04 | Paper
Randomized sketch descent methods for non-separable linearly constrained optimization | 2022-05-17 | Paper
Alternating maximization: unifying framework for 8 sparse PCA formulations and efficient parallel codes | 2022-04-22 | Paper
The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems | 2022-01-28 | Paper
Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences | 2021-08-09 | Paper
An accelerated communication-efficient primal-dual optimization framework for structured machine learning | 2021-04-15 | Paper
Inexact SARAH algorithm for stochastic optimization | 2021-04-15 | Paper
Inexact Tensor Methods and Their Application to Stochastic Convex Optimization | 2020-12-31 | Paper
https://portal.mardi4nfdi.de/entity/Q4969198 | 2020-10-05 | Paper
Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory | 2020-05-28 | Paper
New Convergence Aspects of Stochastic Gradient Algorithms | 2020-02-07 | Paper
A robust multi-batch L-BFGS method for machine learning | 2019-11-25 | Paper
Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample | 2019-01-28 | Paper
https://portal.mardi4nfdi.de/entity/Q4558572 | 2018-11-22 | Paper
Matrix completion under interval uncertainty | 2018-05-24 | Paper
On the complexity of parallel coordinate descent | 2018-05-02 | Paper
Entropy Penalized Semidefinite Programming | 2018-02-12 | Paper
Projected semi-stochastic gradient descent method with mini-batch scheme under weak strong convexity assumption | 2018-02-06 | Paper
Dual Free Adaptive Minibatch SDCA for Empirical Risk Minimization | 2018-01-25 | Paper
Distributed optimization with arbitrary local solvers | 2017-11-24 | Paper
A low-rank coordinate-descent algorithm for semidefinite programming relaxations of optimal power flow | 2017-11-24 | Paper
A Coordinate-Descent Algorithm for Tracking Solutions in Time-Varying Optimal Power Flows | 2017-10-19 | Paper
https://portal.mardi4nfdi.de/entity/Q2953662 | 2017-01-05 | Paper
On optimal probabilities in stochastic coordinate descent methods | 2016-09-21 | Paper
A Class of Parallel Doubly Stochastic Algorithms for Large-Scale Learning | 2016-06-15 | Paper
Distributed Coordinate Descent Method for Learning with Big Data | 2016-06-06 | Paper
Parallel coordinate descent methods for big data optimization | 2016-04-04 | Paper
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions | 2016-01-05 | Paper
Hybrid Methods in Solving Alternating-Current Optimal Power Flows | 2015-10-07 | Paper
Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design | 2015-03-03 | Paper
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function | 2014-06-02 | Paper
https://portal.mardi4nfdi.de/entity/Q2896264 | 2012-07-16 | Paper
https://portal.mardi4nfdi.de/entity/Q4366596 | 1998-04-03 | Paper
Exploiting higher-order derivatives in convex optimization methods | N/A | Paper

Research outcomes over time

This page was built for person: Martin Takáč