Pages that link to "Item:Q2967608"
The following pages link to Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence (Q2967608):
Displaying 23 items.
- Sub-sampled Newton methods (Q1739039)
- On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization (Q2082285)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109)
- Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections (Q2131141)
- Functional principal subspace sampling for large scale functional data analysis (Q2137809)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551)
- Randomized Newton's method for solving differential equations based on the neural network discretization (Q2161555)
- Sketch-based empirical natural gradient methods for deep learning (Q2162326)
- Inexact restoration with subsampled trust-region methods for finite-sum minimization (Q2191786)
- Random projections for quadratic programs (Q2196316)
- Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model (Q2233570)
- A non-Euclidean gradient descent method with sketching for unconstrained matrix minimization (Q2294351)
- Generalized self-concordant functions: a recipe for Newton-type methods (Q2330645)
- Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions (Q2693789)
- Scalable subspace methods for derivative-free nonlinear least-squares optimization (Q6038650)
- Convergence analysis of a subsampled Levenberg-Marquardt algorithm (Q6047687)
- M-IHS: an accelerated randomized preconditioning method avoiding costly matrix decompositions (Q6083984)
- An overview of stochastic quasi-Newton methods for large-scale machine learning (Q6097379)
- On maximum residual nonlinear Kaczmarz-type algorithms for large nonlinear systems of equations (Q6100585)
- Generalized linear models for massive data via doubly-sketching (Q6117016)
- Global optimization using random embeddings (Q6160282)
- Hessian averaging in stochastic Newton methods achieves superlinear convergence (Q6165593)
- Riemannian Natural Gradient Methods (Q6189169)