Pages that link to "Item:Q1887125"
The following pages link to "Steepest descent with momentum for quadratic functions is a version of the conjugate gradient method" (Q1887125):
Displaying 16 items.
- An online gradient method with momentum for two-layer feedforward neural networks (Q1026325)
- Sparse approximations of matrix functions via numerical integration of ODEs (Q1653963)
- An efficient primal-dual method for the obstacle problem (Q1685519)
- ASD+M: automatic parameter tuning in stochastic optimization and on-line learning (Q2179079)
- Dynamic search trajectory methods for global optimization (Q2294590)
- On parameter acceleration methods for saddle point problems (Q2349550)
- A secant-based Nesterov method for convex functions (Q2361131)
- Runge-Kutta-like scaling techniques for first-order methods in convex optimization (Q2400802)
- Comparison of ARIMA, neural networks and hybrid models in time series: tourist arrival forecasting (Q3432728)
- A vector regularization method to solve linear inverse problems (Q3465069)
- Near optimal step size and momentum in gradient descent for quadratic functions (Q4633246)
- A primal-dual optimization strategy for elliptic partial differential equations (Q5146606)
- Semistability of Steepest Descent with Momentum for Quadratic Functions (Q5378217)
- A Study on the Optimal Double Parameters for Steepest Descent with Momentum (Q5380232)
- Momentum acceleration-based matrix splitting method for solving generalized absolute value equation (Q6056603)
- Distributed solving of linear algebraic equations with switched fractional-order dynamics (Q6076824)