Subsampled Hessian Newton Methods for Supervised Learning
Publication: 5380307 (MaRDI item Q5380307)
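The record's subject is a subsampled Hessian Newton method. For orientation only, below is a minimal sketch of the generic idea this literature studies, assuming an L2-regularized logistic-regression objective: the gradient is computed on the full data, the Hessian is estimated from a random subsample, the Newton system is solved inexactly by conjugate gradient, and a backtracking line search sets the step. The loss, sampling fraction, CG tolerance, and Armijo rule here are illustrative assumptions, not the specific algorithm of the cited publication.

```python
# Minimal sketch of a generic subsampled Hessian Newton iteration for
# L2-regularized logistic regression. Illustrates the family of methods this
# record refers to; details below are assumptions, not the paper's algorithm.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def f_and_grad(w, X, y, lam):
    """Full objective (lam/2)||w||^2 + sum_i log(1 + exp(-y_i x_i^T w)) and its gradient."""
    m = y * (X @ w)                              # margins y_i x_i^T w, y_i in {-1, +1}
    f = 0.5 * lam * (w @ w) + np.logaddexp(0.0, -m).sum()
    g = lam * w - X.T @ (y * sigmoid(-m))
    return f, g


def sampled_hessian_vec(w, X_s, lam, scale, v):
    """Hessian-vector product using only the subsample X_s of the data:
    H_S v = lam*v + scale * X_s^T diag(sigma(1-sigma)) X_s v."""
    d = sigmoid(X_s @ w)
    d = d * (1.0 - d)                            # per-sample curvature weights
    return lam * v + scale * (X_s.T @ (d * (X_s @ v)))


def cg(hess_vec, b, tol=1e-2, max_iter=50):
    """Conjugate gradient for H p = b with H given only implicitly (inexact solve)."""
    p = np.zeros_like(b)
    b_norm = np.linalg.norm(b)
    if b_norm == 0.0:
        return p
    r = b.copy()                                 # residual b - H p, with p = 0
    d = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hd = hess_vec(d)
        alpha = rs / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_next = r @ r
        if np.sqrt(rs_next) <= tol * b_norm:
            break
        d = r + (rs_next / rs) * d
        rs = rs_next
    return p


def subsampled_newton(X, y, lam=1.0, sample_frac=0.1, iters=20, seed=0):
    """Newton iterations whose Hessian is estimated from a random data subsample."""
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    s = max(1, int(sample_frac * n))
    w = np.zeros(dim)
    for _ in range(iters):
        f, g = f_and_grad(w, X, y, lam)
        idx = rng.choice(n, size=s, replace=False)
        hv = lambda v: sampled_hessian_vec(w, X[idx], lam, n / s, v)
        p = cg(hv, -g)                           # inexact Newton direction from sampled Hessian
        t = 1.0                                  # Armijo backtracking on the full objective
        while f_and_grad(w + t * p, X, y, lam)[0] > f + 1e-4 * t * (g @ p) and t > 1e-8:
            t *= 0.5
        w = w + t * p
    return w
```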
Recommendations
- Sub-sampled Newton methods
- Subsampled inexact Newton methods for minimizing large sums of convex functions
- Exact and inexact subsampled Newton methods for optimization
- Adaptive iterative Hessian sketch via \(A\)-optimal subsampling
- Stochastic sub-sampled Newton method with variance reduction
- Subsampled nonmonotone spectral gradient methods
- Sublinear optimization for machine learning
- Subgradient and sampling algorithms for \(\ell_1\) regression
- Hessian averaging in stochastic Newton methods achieves superlinear convergence
- Optimal subsampling for softmax regression
Cites work
- scientific article; zbMATH DE number 5359577
- scientific article; zbMATH DE number 5430994
- scientific article; zbMATH DE number 3579922
- scientific article; zbMATH DE number 961607
- A finite Newton method for classification
- A modified finite Newton method for fast solution of large scale linear SVMs
- A simple automatic derivative evaluation program
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- Iterative scaling and coordinate descent methods for maximum entropy models
- LIBLINEAR: a library for large linear classification
- On the use of stochastic Hessian information in optimization methods for machine learning
- Performance of first-order methods for smooth convex minimization: a novel approach
- Some methods of speeding up the convergence of iteration methods
- Trust Region Methods
- Trust region Newton method for logistic regression
Cited in (10)
- Exact and inexact subsampled Newton methods for optimization
- On the inversion-free Newton's method and its applications
- Approximate Newton methods
- Adaptive iterative Hessian sketch via \(A\)-optimal subsampling
- Nesterov's acceleration for approximate Newton
- On the use of stochastic Hessian information in optimization methods for machine learning
- Distributed Newton Methods for Deep Neural Networks
- Stochastic sub-sampled Newton method with variance reduction
- Sub-sampled Newton methods
- Hessian averaging in stochastic Newton methods achieves superlinear convergence