Pages that link to "Item:Q4598334"
From MaRDI portal
The following pages link to Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms (Q4598334):
Displaying 16 items.
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods (Q2023684) (← links)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235) (← links)
- Rates of superlinear convergence for classical quasi-Newton methods (Q2149549) (← links)
- Sampling Kaczmarz-Motzkin method for linear feasibility problems: generalization and acceleration (Q2149567) (← links)
- Unifying relations between iterative linear equation solvers and explicit Euler approximations for associated parabolic regularized equations (Q2668183) (← links)
- Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms (Q4598334) (← links)
- Batched Stochastic Gradient Descent with Weighted Sampling (Q4609808) (← links)
- On Adaptive Sketch-and-Project for Solving Linear Systems (Q4997841) (← links)
- Sampled limited memory methods for massive linear inverse problems (Q5000590) (← links)
- Beyond the EM algorithm: constrained optimization methods for latent class model (Q5042122) (← links)
- slimTrain: A Stochastic Approximation Method for Training Separable Deep Neural Networks (Q5095499) (← links)
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory (Q5112239) (← links)
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence (Q5853572) (← links)
- Towards explicit superlinear convergence rate for SR1 (Q6038671) (← links)
- An overview of stochastic quasi-Newton methods for large-scale machine learning (Q6097379) (← links)
- Sharp Analysis of Sketch-and-Project Methods via a Connection to Randomized Singular Value Decomposition (Q6202283) (← links)