Automatic differentiation: Reduced gradient and reduced Hessian matrix
Publication: 1803653
DOI: 10.1007/BF00249641
zbMath: 0788.65019
OpenAlex: W2092910248
MaRDI QID: Q1803653
Publication date: 29 June 1993
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/bf00249641
Cites Work
- Unnamed Item
- Unnamed Item
- Nonlinear least squares via automatic derivative evaluation
- A finite algorithm for the exact evaluation of higher order partial derivatives of functions of many variables
- Truncated Newton method for sparse unconstrained optimization using automatic differentiation
- Numerical derivatives and nonlinear analysis
- Fast method to compute the scalar product of gradient and given vector
- Automatic differentiation: techniques and applications
- A generalized Newton algorithm using higher-order derivatives
- Automatic differentiation of characterizing sequences
- Differentiation in PASCAL-SC: type GRADIENT
- Simultaneous computation of functions, partial derivatives and estimates of rounding errors —Complexity and practicality—
- Automatic differentiation of functions of derivatives
- The Arithmetic of Differentiation