A Stochastic Line Search Method with Expected Complexity Analysis

From MaRDI portal
Publication:5215517

DOI: 10.1137/18M1216250 · zbMath: 1431.90153 · arXiv: 1807.07994 · OpenAlex: W3005484777 · MaRDI QID: Q5215517

Courtney Paquette, Katya Scheinberg

Publication date: 12 February 2020

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1807.07994

Related Items (35)

Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
A discussion on variational analysis in derivative-free optimization
A theoretical and empirical comparison of gradient approximations in derivative-free optimization
A Variable Sample-Size Stochastic Quasi-Newton Method for Smooth and Nonsmooth Stochastic Convex Optimization
Stochastic Trust-Region Methods with Trust-Region Radius Depending on Probabilistic Models
Global Linear Convergence of Evolution Strategies on More than Smooth Strongly Convex Functions
Parameter calibration in wake effect simulation model with stochastic gradient descent and stratified sampling
An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians
Zeroth-order optimization with orthogonal random directions
A stochastic gradient method for a class of nonlinear PDE-constrained optimal control problems under uncertainty
Inequality constrained stochastic nonlinear optimization via active-set sequential quadratic programming
A trust region method for noisy unconstrained optimization
Direct Search Based on Probabilistic Descent in Reduced Spaces
A simplified convergence theory for Byzantine resilient stochastic gradient descent
Adaptive step size rules for stochastic optimization in large-scale learning
Stochastic regularized Newton methods for nonlinear equations
A line search based proximal stochastic gradient algorithm with dynamical variance reduction
Constrained stochastic blackbox optimization using a progressive barrier and probabilistic estimates
Trust-region algorithms: probabilistic complexity and intrinsic noise with applications to subsampling techniques
Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
The impact of noise on evaluation complexity: the deterministic trust-region case
Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates
Analysis of the BFGS Method with Errors
A stochastic subspace approach to gradient-free optimization in high dimensions
Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
Derivative-free optimization methods
Expected complexity analysis of stochastic direct-search
Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
Linesearch Newton-CG methods for convex optimization with noise
Inexact SARAH algorithm for stochastic optimization
Optimization of Stochastic Blackboxes with Adaptive Precision
A stochastic first-order trust-region method with inexact restoration for finite-sum minimization
An inexact restoration-nonsmooth algorithm with variable accuracy for stochastic nonsmooth convex optimization problems in machine learning and stochastic linear complementarity problems
LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums


Uses Software


Cites Work


This page was built for publication: A Stochastic Line Search Method with Expected Complexity Analysis