From linear to nonlinear iterative methods
From MaRDI portal
Publication: 1873166
DOI: 10.1016/S0168-9274(02)00235-0 · zbMath: 1022.65060 · MaRDI QID: Q1873166
Michael N. Vrahatis, George D. Magoulas, Vassilis P. Plagianakos
Publication date: 19 May 2003
Published in: Applied Numerical Mathematics
Keywords: algorithms; unconstrained optimization; convergence; numerical examples; successive overrelaxation; systems of nonlinear equations; Jacobi method; nonlinear Gauss-Seidel iteration
65K05: Numerical mathematical programming methods
90C30: Nonlinear programming
65H10: Numerical computation of solutions to systems of equations
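The keywords above name classical componentwise iterations (Jacobi, Gauss-Seidel, SOR) extended to systems of nonlinear equations. A minimal sketch of a nonlinear Gauss-Seidel sweep, not taken from the paper itself: each component is updated by a scalar Newton step on its own equation, with the other components held at their latest values. The finite-difference inner derivative and the test system are illustrative assumptions.

```python
import math

def nonlinear_gauss_seidel(F, x0, tol=1e-10, max_sweeps=100, h=1e-7):
    """Approximately solve F(x) = 0 componentwise.

    In each sweep, x[i] is corrected by one scalar Newton step on f_i,
    using the most recent values of the other components (Gauss-Seidel
    ordering).  d f_i / d x_i is approximated by a forward difference.
    """
    x = list(x0)
    n = len(x)
    for _ in range(max_sweeps):
        largest_step = 0.0
        for i in range(n):
            fi = F(x)[i]
            x[i] += h
            dfi = (F(x)[i] - fi) / h   # forward-difference d f_i / d x_i
            x[i] -= h
            step = fi / dfi            # scalar Newton correction
            x[i] -= step
            largest_step = max(largest_step, abs(step))
        if largest_step < tol:
            break
    return x

# Illustrative diagonally dominant test system (an assumption, not from
# the paper): 3x - cos(y) - 1 = 0,  3y - sin(x) - 1 = 0.
def F(v):
    return [3.0 * v[0] - math.cos(v[1]) - 1.0,
            3.0 * v[1] - math.sin(v[0]) - 1.0]

root = nonlinear_gauss_seidel(F, [0.0, 0.0])
```

Diagonal dominance of the test system keeps the off-diagonal coupling weak, which is the standard situation in which such componentwise sweeps converge.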
Related Items
- MICE: Multiple‐Peak Identification, Characterization, and Estimation
- Improved Newton's method without direct function evaluations
- A Schur-Newton-Krylov solver for steady-state aeroelastic analysis and design sensitivity analysis
- Improved sign-based learning algorithm derived by the composite nonlinear Jacobi process
- Determining the number of real roots of polynomials through neural networks
- ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH
Cites Work
- A rapid generalized method of bisection for solving systems of non-linear equations
- Bisection is optimal
- Optimization. Algorithms and consistent approximations
- A new unconstrained optimization method for imprecise function and gradient values
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A Short Proof and a Generalization of Miranda's Existence Theorem
- Algorithm 666: Chabis: a mathematical software package for locating and evaluating roots of systems of nonlinear equations
- Solving systems of nonlinear equations using the nonzero value of the topological degree
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Localization and Computation of Zeros of Bessel Functions
- Locating and Computing Zeros of Airy Functions
- Iterative Solution Methods
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- Convergence Conditions for Ascent Methods
- Minimizing a function without calculating derivatives
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Rates of Convergence for a Class of Iterative Procedures
- Iterative Methods for Solving Partial Difference Equations of Elliptic Type
- Locating and computing in parallel all the simple roots of special functions using PVM