A class of gradient unconstrained minimization algorithms with adaptive stepsize
From MaRDI portal
Publication: 1970409
DOI: 10.1016/S0377-0427(99)00276-9 · zbMath: 0958.65072 · Wikidata: Q126298369 · Scholia: Q126298369 · MaRDI QID: Q1970409
G. S. Androulakis, Michael N. Vrahatis, J. N. Lambrinos, George D. Magoulas
Publication date: 10 April 2001
Published in: Journal of Computational and Applied Mathematics
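The publication concerns gradient methods for unconstrained minimization in which the stepsize is adapted from iteration to iteration. As a rough illustration of that general idea only (not the specific class of rules introduced in the paper), the sketch below uses a Barzilai–Borwein-type stepsize; the function name, the initial stepsize, and the curvature guard are illustrative assumptions.

```python
import numpy as np

def adaptive_gradient_descent(grad, x0, step0=1e-3, tol=1e-8, max_iter=1000):
    """Gradient descent with a Barzilai-Borwein-type adaptive stepsize.

    Illustrative only: the paper defines its own class of adaptive
    stepsize rules, which this generic sketch does not reproduce.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    step = step0  # initial stepsize (an assumed default, not from the paper)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - step * g
        g_new = grad(x_new)
        s = x_new - x            # change in iterates
        y = g_new - g            # change in gradients
        sy = s @ y
        if sy > 1e-12:           # guard: keep the stepsize positive and finite
            step = (s @ s) / sy  # BB1 rule: <s, s> / <s, y>
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = adaptive_gradient_descent(lambda x: A @ x - b, np.zeros(2))
print(x_star, np.linalg.solve(A, b))  # the two should agree closely
```

The appeal of such adaptive rules, as several of the related items below explore, is that they avoid an explicit line search while still reacting to local curvature information gathered from successive iterates and gradients.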
Related Items (36)
- Convergence of quasi-Newton method with new inexact line search
- From linear to nonlinear iterative methods
- Convergence of line search methods for unconstrained optimization
- STUDYING THE BASIN OF CONVERGENCE OF METHODS FOR COMPUTING PERIODIC ORBITS
- Iterative parameter estimation algorithms for dual-frequency signal models
- Non Monotone Backtracking Inexact BFGS Method for Regression Analysis
- A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems
- A new nonmonotone spectral residual method for nonsmooth nonlinear equations
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Studying the performance of artificial neural networks on problems related to cryptography
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- A gradient-related algorithm with inexact line searches
- Global convergence of a modified Broyden family method for nonconvex functions
- Discrete tomography with unknown intensity levels using higher-order statistics
- Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery
- A survey of gradient methods for solving nonlinear optimization
- New stepsizes for the gradient method
- Multivariate spectral gradient method for unconstrained optimization
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- Assessing the effectiveness of artificial neural networks on problems related to elliptic curve cryptography
- Modified nonmonotone Armijo line search for descent method
- A new descent algorithm with curve search rule
- On memory gradient method with trust region for unconstrained optimization
- Convergence of nonmonotone line search method
- Convergence analysis of a modified BFGS method on convex minimizations
- ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH
- A new gradient method with an optimal stepsize property
- Determining the number of real roots of polynomials through neural networks
- Artificial nonmonotonic neural networks
- New line search methods for unconstrained optimization
- Accelerated gradient descent methods with line search
- A descent algorithm without line search for unconstrained optimization
- Convergence of descent method without line search
- A new super-memory gradient method with curve search rule
- Accelerated multiple step-size methods for solving unconstrained optimization problems
- Convergence of descent method with new line search
Uses Software
Cites Work
- Cauchy's method of minimization
- Nonlinear successive over-relaxation
- Optimization. Algorithms and consistent approximations
- A new unconstrained optimization method for imprecise function and gradient values
- A dimension-reducing method for unconstrained optimization
- OPTAC: A portable software package for analyzing and comparing optimization methods by visualization
- An effective algorithm for minimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Solution of Partial Differential Equations on Vector and Parallel Computers
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Survey of Parallel Algorithms in Numerical Linear Algebra
- On the acceleration of the backpropagation training method
- Iterative Solution Methods
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On Steepest Descent
- Rates of Convergence for a Class of Iterative Procedures
- Iterative Methods for Solving Partial Difference Equations of Elliptic Type
- The method of steepest descent for non-linear minimization problems