The following pages link to Sub-sampled Newton methods (Q1739039):
Displaying 48 items.
- An algorithm for the minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity (Q2020598) (← links)
- On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization (Q2082285) (← links)
- Linesearch Newton-CG methods for convex optimization with noise (Q2084588) (← links)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109) (← links)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551) (← links)
- Subsampled nonmonotone spectral gradient methods (Q2178981) (← links)
- Inexact restoration with subsampled trust-region methods for finite-sum minimization (Q2191786) (← links)
- Newton-type methods for non-convex optimization under inexact Hessian information (Q2205970) (← links)
- Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization (Q2302838) (← links)
- Generalized self-concordant functions: a recipe for Newton-type methods (Q2330645) (← links)
- Statistically equivalent surrogate material models: impact of random imperfections on the elasto-plastic response (Q2679290) (← links)
- Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions (Q2693789) (← links)
- Stable architectures for deep neural networks (Q4607800) (← links)
- Randomized Approach to Nonlinear Inversion Combining Random and Optimized Simultaneous Sources and Detectors (Q4631407) (← links)
- (Q4633014) (← links)
- (Q4633051) (← links)
- (Q4633055) (← links)
- (Q4633059) (← links)
- Optimization Methods for Large-Scale Machine Learning (Q4641709) (← links)
- (Q4998966) (← links)
- Quasi-Newton methods for machine learning: forget the past, just sample (Q5058389) (← links)
- Sketched Newton–Raphson (Q5093644) (← links)
- An investigation of Newton-Sketch and subsampled Newton methods (Q5135249) (← links)
- Convergence of Newton-MR under Inexact Hessian Information (Q5148404) (← links)
- (Q5159403) (← links)
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization (Q5244400) (← links)
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization (Q5244401) (← links)
- Scalable subspace methods for derivative-free nonlinear least-squares optimization (Q6038650) (← links)
- An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians (Q6038658) (← links)
- Convergence analysis of a subsampled Levenberg-Marquardt algorithm (Q6047687) (← links)
- SCORE: approximating curvature information under self-concordant regularization (Q6051307) (← links)
- An adaptive sampling augmented Lagrangian method for stochastic optimization with deterministic constraints (Q6072951) (← links)
- An overview of stochastic quasi-Newton methods for large-scale machine learning (Q6097379) (← links)
- Newton-MR: inexact Newton method with minimum residual sub-problem solver (Q6114941) (← links)
- Generalized linear models for massive data via doubly-sketching (Q6117016) (← links)
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization (Q6134379) (← links)
- Global optimization using random embeddings (Q6160282) (← links)
- Hessian averaging in stochastic Newton methods achieves superlinear convergence (Q6165593) (← links)
- Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization (Q6175706) (← links)
- Differentially private inference via noisy optimization (Q6183772) (← links)
- Riemannian Natural Gradient Methods (Q6189169) (← links)
- An adaptive covariance parameterization technique for the ensemble Gaussian mixture filter (Q6562378) (← links)
- Random projections for linear programming: an improved retrieval phase (Q6579780) (← links)
- Subsampled first-order optimization methods with applications in imaging (Q6606441) (← links)
- On the inversion-free Newton's method and its applications (Q6612368) (← links)
- A multilevel method for self-concordant minimization (Q6655798) (← links)
- SketchySGD: reliable stochastic optimization via randomized curvature estimates (Q6664471) (← links)
- Trust region-type method under inexact gradient and inexact Hessian with convergence analysis (Q6665208) (← links)