Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent

Publication:3150014

DOI: 10.1162/08997660260028683
zbMath: 1037.68119
OpenAlex: W2130984546
Wikidata: Q52038024
Scholia: Q52038024
MaRDI QID: Q3150014

Nicol N. Schraudolph

Publication date: 2002

Published in: Neural Computation

Full work available at URL: https://doi.org/10.1162/08997660260028683
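The publication presents fast, matrix-free products of neural-network curvature matrices (Hessian, Gauss-Newton, Fisher) with arbitrary vectors via automatic differentiation. As a brief illustration only, and not the paper's own implementation, the following JAX sketch computes a Hessian-vector product without ever forming the Hessian; the loss function, data, and shapes are hypothetical placeholders.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # illustrative quadratic loss of a linear model (placeholder example)
    return 0.5 * jnp.sum((x @ w - y) ** 2)

def hvp(f, w, v, *args):
    # Hessian-vector product H @ v without materializing H:
    # forward-mode differentiation of the gradient of f along direction v
    return jax.jvp(lambda w_: jax.grad(f)(w_, *args), (w,), (v,))[1]

key_x, key_y = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(key_x, (32, 5))
y = jax.random.normal(key_y, (32,))
w = jnp.zeros(5)
v = jnp.ones(5)
print(hvp(loss, w, v, x, y))  # for this quadratic loss, equals (x.T @ x) @ v
```

Such matrix-vector products cost only a small constant multiple of one gradient evaluation, which is what makes Hessian-free, Gauss-Newton, and natural-gradient style second-order methods practical at scale.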




Related Items (21)

Efficient Calculation of the Gauss-Newton Approximation of the Hessian Matrix in Neural Networks
A survey on deep learning and its applications
Speeding up the scaled conjugate gradient algorithm and its application in neuro-fuzzy classifier training
Limited Stochastic Meta-Descent for Kernel-Based Online Learning
Accelerated Methods for NonConvex Optimization
SCORE: approximating curvature information under self-concordant regularization
A distributed optimisation framework combining natural gradient with Hessian-free for discriminative sequence training
An overview of stochastic quasi-Newton methods for large-scale machine learning
Efficient Natural Gradient Descent Methods for Large-Scale PDE-Based Optimization Problems
A \(J\)-symmetric quasi-Newton method for minimax problems
Efficient approximations of the fisher matrix in neural networks using kronecker product singular value decomposition
First-Order Methods for Nonconvex Quadratic Minimization
Riemannian Natural Gradient Methods
Subsampled Hessian Newton Methods for Supervised Learning
Optimization for deep learning: an overview
Distributed Newton Methods for Deep Neural Networks
A matrix-free line-search algorithm for nonconvex optimization
A survey on learning approaches for undirected graphical models. Application to scene object recognition
Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
A stochastic conjugate gradient method for the approximation of functions
Unnamed Item



Cites Work


This page was built for publication: Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent