Pages that link to "Item:Q2788401"
The following pages link to Global Convergence of Online Limited Memory BFGS (Q2788401):
Displaying 27 items.
- A Stochastic Quasi-Newton Method for Large-Scale Optimization (Q121136) (← links)
- Spectral projected gradient method for stochastic optimization (Q670658) (← links)
- An adaptive Hessian approximated stochastic gradient MCMC method (Q2128489) (← links)
- Limited-memory BFGS with displacement aggregation (Q2149548) (← links)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551) (← links)
- Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition (Q2696921) (← links)
- (Q4633012) (← links)
- (Q4969174) (← links)
- (Q4969198) (← links)
- A robust multi-batch L-BFGS method for machine learning (Q4972551) (← links)
- Sampled Tikhonov regularization for large linear inverse problems (Q4973539) (← links)
- Solving generalized inverse eigenvalue problems via L-BFGS-B method (Q4991546) (← links)
- Sampled limited memory methods for massive linear inverse problems (Q5000590) (← links)
- Quasi-Newton methods for machine learning: forget the past, just sample (Q5058389) (← links)
- QNG: A Quasi-Natural Gradient Method for Large-Scale Statistical Learning (Q5067429) (← links)
- A Variable Sample-Size Stochastic Quasi-Newton Method for Smooth and Nonsmooth Stochastic Convex Optimization (Q5076721) (← links)
- On Stochastic and Deterministic Quasi-Newton Methods for Nonstrongly Convex Optimization: Asymptotic Convergence and Rate Analysis (Q5107212) (← links)
- (Q5159435) (← links)
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration (Q5231671) (← links)
- Newton-like Method with Diagonal Correction for Distributed Optimization (Q5275293) (← links)
- Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning (Q5381126) (← links)
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization (Q5737735) (← links)
- IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate (Q5745078) (← links)
- LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums (Q5879118) (← links)
- An overview of stochastic quasi-Newton methods for large-scale machine learning (Q6097379) (← links)
- A non-monotone trust-region method with noisy oracles and additional sampling (Q6606856) (← links)
- On the inversion-free Newton's method and its applications (Q6612368) (← links)