The following pages link to Mert Gürbüzbalaban (Q654067):
Displayed 32 items.
- On Nesterov's nonsmooth Chebyshev-Rosenbrock functions (Q654068)
- Polynomial root radius optimization with affine constraints (Q1675252)
- A stochastic subgradient method for distributionally robust non-convex and non-smooth learning (Q2159458)
- Randomness and permutations in coordinate descent methods (Q2189444)
- Why random reshuffling beats stochastic gradient descent (Q2227529)
- A globally convergent incremental Newton method (Q2349125)
- Fast Approximation of the $H_\infty$ Norm via Optimization over Spectral Value Sets (Q2848631)
- Some Regularity Results for the Pseudospectral Abscissa and Pseudospectral Radius of a Matrix (Q2910872)
- Approximating the Real Structured Stability Radius with Frobenius-Norm Bounded Perturbations (Q4598332)
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods (Q4641660)
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate (Q4641666)
- (Q5053256)
- Global Convergence of Stochastic Gradient Hamiltonian Monte Carlo for Nonconvex Stochastic Optimization: Nonasymptotic Performance Bounds and Momentum-Based Acceleration (Q5058053)
- Differentially Private Accelerated Optimization Algorithms (Q5080503)
- Convergence Rate of Incremental Gradient and Incremental Newton Methods (Q5237308)
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms (Q5266533)
- Explicit Solutions for Root Optimization of a Polynomial Family With One Affine Constraint (Q5353038)
- Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions (Q5853717)
- Exit Time Analysis for Approximations of Gradient Descent Trajectories Around Saddle Points (Q6039758)
- Robust Accelerated Primal-Dual Methods for Computing Saddle Points (Q6130545)
- Boundary Conditions for Linear Exit Time Gradient Trajectories Around Saddle Points: Analysis and Algorithm (Q6153686)
- A Stronger Convergence Result on the Proximal Incremental Aggregated Gradient Method (Q6280102)
- Decentralized Computation of Effective Resistances and Acceleration of Consensus Algorithms (Q6290449)
- Global Convergence of Stochastic Gradient Hamiltonian Monte Carlo for Non-Convex Stochastic Optimization: Non-Asymptotic Performance Bounds and Momentum-Based Acceleration (Q6306642)
- DAve-QN: A Distributed Averaged Quasi-Newton Method with Local Superlinear Convergence Rate (Q6319821)
- Randomized Gossiping with Effective Resistance Weights: Performance Guarantees and Applications (Q6322905)
- On the Heavy-Tailed Theory of Stochastic Gradient Descent for Deep Neural Networks (Q6330127)
- Decentralized Stochastic Gradient Langevin Dynamics and Hamiltonian Monte Carlo (Q6344224)
- Entropic Risk-Averse Generalized Momentum Methods (Q6397327)
- SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems (Q6400555)
- High Probability and Risk-Averse Guarantees for a Stochastic Accelerated Primal-Dual Method (Q6431722)
- Robustly Stable Accelerated Momentum Methods With A Near-Optimal $L_2$ Gain and $H_\infty$ Performance (Q6451953)