Pages that link to "Item:Q5159400"
From MaRDI portal
The following pages link to An Inertial Newton Algorithm for Deep Learning (Q5159400):
Displaying 20 items.
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping (Q2106043) (← links)
- Asymptotic behavior of Newton-like inertial dynamics involving the sum of potential and nonpotential terms (Q2138450) (← links)
- Newton-type inertial algorithms for solving monotone equations governed by sums of potential and nonpotential operators (Q2674443) (← links)
- The rate of convergence of optimization algorithms obtained via discretizations of heavy ball dynamical systems for convex optimization problems (Q5054738) (← links)
- (Q5074079) (← links)
- Subgradient Sampling for Nonsmooth Nonconvex Minimization (Q6076858) (← links)
- An Improved Unconstrained Approach for Bilevel Optimization (Q6076870) (← links)
- A fast and simple modification of Newton's method avoiding saddle points (Q6086150) (← links)
- Inertial Newton algorithms avoiding strict saddle points (Q6145046) (← links)
- First order inertial optimization algorithms with threshold effects associated with dry friction (Q6146366) (← links)
- Conservative parametric optimality and the ridge method for tame min-max problems (Q6163857) (← links)
- Convergence of inertial dynamics driven by sums of potential and nonpotential operators with implicit Newton-like damping (Q6163952) (← links)
- Fast optimization via inertial dynamics with closed-loop damping (Q6172672) (← links)
- Continuous Newton-like Methods Featuring Inertia and Variable Mass (Q6188502) (← links)
- Nonsmooth nonconvex stochastic heavy ball (Q6536841) (← links)
- Convergence properties of stochastic proximal subgradient method in solving a class of composite optimization problems with cardinality regularizer (Q6536946) (← links)
- A Riemannian dimension-reduced second-order method with application in sensor network localization (Q6562381) (← links)
- Long term dynamics of the subgradient method for Lipschitz path differentiable functions (Q6566415) (← links)
- The backtrack Hölder gradient method with application to min-max and min-min problems (Q6569340) (← links)
- Extrapolated plug-and-play three-operator splitting methods for nonconvex optimization with applications to image restoration (Q6587639) (← links)