Pages that link to "Item:Q1970409"
The following pages link to "A class of gradient unconstrained minimization algorithms with adaptive stepsize" (Q1970409):
Displaying 36 items.
- A new nonmonotone spectral residual method for nonsmooth nonlinear equations (Q344250)
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method (Q508045)
- Modified nonmonotone Armijo line search for descent method (Q535246)
- A gradient-related algorithm with inexact line searches (Q596214)
- Convergence analysis of a modified BFGS method on convex minimizations (Q711385)
- Convergence of descent method with new line search (Q815995)
- Convergence of quasi-Newton method with new inexact line search (Q819030)
- Studying the performance of artificial neural networks on problems related to cryptography (Q867941)
- Discrete tomography with unknown intensity levels using higher-order statistics (Q892823)
- Multivariate spectral gradient method for unconstrained optimization (Q945298)
- A descent algorithm without line search for unconstrained optimization (Q1044422)
- Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery (Q1701931)
- A new descent algorithm with curve search rule (Q1764727)
- From linear to nonlinear iterative methods (Q1873166)
- Convergence of line search methods for unconstrained optimization (Q1881700)
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions (Q2163462)
- Global convergence of a modified Broyden family method for nonconvex functions (Q2171116)
- A survey of gradient methods for solving nonlinear optimization (Q2220680)
- New stepsizes for the gradient method (Q2228379)
- A modified nonmonotone BFGS algorithm for unconstrained optimization (Q2400759)
- Assessing the effectiveness of artificial neural networks on problems related to elliptic curve cryptography (Q2470209)
- On memory gradient method with trust region for unconstrained optimization (Q2492798)
- Convergence of nonmonotone line search method (Q2493941)
- A new gradient method with an optimal stepsize property (Q2506170)
- Determining the number of real roots of polynomials through neural networks (Q2507014)
- New line search methods for unconstrained optimization (Q2510603)
- Convergence of descent method without line search (Q2570691)
- A new super-memory gradient method with curve search rule (Q2571993)
- Iterative parameter estimation algorithms for dual-frequency signal models (Q2633182)
- STUDYING THE BASIN OF CONVERGENCE OF METHODS FOR COMPUTING PERIODIC ORBITS (Q3165821)
- A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems (Q3462304)
- ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH (Q3598837)
- Non Monotone Backtracking Inexact BFGS Method for Regression Analysis (Q5299071)
- Accelerated multiple step-size methods for solving unconstrained optimization problems (Q5865330)
- Artificial nonmonotonic neural networks (Q5958708)
- Accelerated gradient descent methods with line search (Q5961879)