Exact and inexact subsampled Newton methods for optimization
DOI: 10.1093/imanum/dry009
zbMATH: 1462.65077
arXiv: 1609.08502
OpenAlex: W2963397933
MaRDI QID: Q5854329
Raghu Bollapragada, Richard H. Byrd, Jorge Nocedal
Publication date: 16 March 2021
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1609.08502
MSC classification: Stochastic programming (90C15); Methods of quasi-Newton type (90C53); Numerical methods for mathematical programming, optimization and variational techniques (65K99)
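
For orientation, the sketch below illustrates the method family named in the title, under our own assumptions rather than the paper's exact algorithm: the sample sizes, step rule, and all identifiers are hypothetical, and the code is not the authors'. A subsampled Newton method for a finite-sum objective F(w) = (1/N) Σ_i f_i(w) estimates the gradient from a sample and the Hessian from a (typically smaller) sample; an exact variant solves the subsampled Newton system directly, while an inexact variant solves it only approximately, e.g. with a few conjugate-gradient iterations that need only Hessian-vector products. The paper analyzes how such sampling and inexactness choices affect convergence.

    # Hedged sketch (not the authors' implementation): inexact subsampled
    # Newton steps on L2-regularized logistic regression,
    #   F(w) = (1/N) sum_i log(1 + exp(-y_i * x_i^T w)) + (lam/2) ||w||^2.
    # All sample sizes and iteration counts below are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    N, d, lam = 1000, 20, 1e-3
    X = rng.standard_normal((N, d))
    y = np.sign(X @ rng.standard_normal(d) + 0.5 * rng.standard_normal(N))

    def gradient(w, idx):
        z = np.clip(y[idx] * (X[idx] @ w), -30.0, 30.0)
        s = -y[idx] / (1.0 + np.exp(z))      # derivative of log-loss wrt margin
        return X[idx].T @ s / len(idx) + lam * w

    def hess_vec(w, idx, v):
        z = np.clip(y[idx] * (X[idx] @ w), -30.0, 30.0)
        p = 1.0 / (1.0 + np.exp(-z))
        D = p * (1.0 - p)                    # per-sample curvature weights
        return X[idx].T @ (D * (X[idx] @ v)) / len(idx) + lam * v

    def cg(matvec, b, max_iter=20, rtol=1e-2):
        # A few CG iterations: the "inexact" solve of the Newton system.
        x = np.zeros_like(b)
        r, p = b.copy(), b.copy()
        rs = r @ r
        for _ in range(max_iter):
            Ap = matvec(p)
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) <= rtol * np.linalg.norm(b):
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    w = np.zeros(d)
    for k in range(10):
        g = gradient(w, rng.choice(N, size=500, replace=False))  # gradient sample
        S = rng.choice(N, size=50, replace=False)                # smaller Hessian sample
        w += cg(lambda v: hess_vec(w, S, v), -g)                 # inexact Newton step

Replacing the cg call with a direct solve of the explicitly formed subsampled Hessian would give the "exact" counterpart; the trade-off between the two solves is exactly the exact-versus-inexact distinction in the title.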
Related Items
A fully stochastic second-order trust region method
Quasi-Newton methods for machine learning: forget the past, just sample
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
Sketched Newton-Raphson
slimTrain: A Stochastic Approximation Method for Training Separable Deep Neural Networks
An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians
Convergence analysis of a subsampled Levenberg-Marquardt algorithm
A trust region method for noisy unconstrained optimization
An adaptive sampling augmented Lagrangian method for stochastic optimization with deterministic constraints
Statistically equivalent surrogate material models: impact of random imperfections on the elasto-plastic response
SVRG meets AdaGrad: painless variance reduction
An overview of stochastic quasi-Newton methods for large-scale machine learning
Inexact restoration with subsampled trust-region methods for finite-sum minimization
On maximum residual nonlinear Kaczmarz-type algorithms for large nonlinear systems of equations
Newton-MR: inexact Newton method with minimum residual sub-problem solver
Generalized linear models for massive data via doubly-sketching
Hessian averaging in stochastic Newton methods achieves superlinear convergence
Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization
On pseudoinverse-free block maximum residual nonlinear Kaczmarz method for solving large-scale nonlinear system of equations
Adversarial classification via distributional robustness with Wasserstein ambiguity
Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
An investigation of Newton-Sketch and subsampled Newton methods
Convergence of Newton-MR under Inexact Hessian Information
Sub-sampled Newton methods
Train Like a (Var)Pro: Efficient Training of Neural Networks with Variable Projection
A robust multi-batch L-BFGS method for machine learning
A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
Generalized self-concordant functions: a recipe for Newton-type methods
A hybrid stochastic optimization framework for composite nonconvex optimization
LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums