On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
DOI: 10.1007/s11075-013-9767-y
zbMath: 1298.49050
OpenAlex: W2037586972
MaRDI QID: Q403098
Bilel Kchouk, Jean-Pierre Dussault
Publication date: 29 August 2014
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-013-9767-y
Keywords: optimization; automatic differentiation; algorithm complexity; banded Hessians; high order Chebyshev methods; sparse functions
MSC: Analysis of algorithms and problem complexity (68Q25); Abstract computational complexity for mathematical programming problems (90C60); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
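For context on the keywords above, the Chebyshev method is a third-order iteration that augments the Newton step with a tensor correction. A minimal sketch in standard notation (the paper's own variants may differ), for minimizing $f$ with gradient $\nabla f$, Hessian $\nabla^2 f$, and third-derivative tensor $\nabla^3 f$:

$$x_{k+1} = x_k - \left(I + \tfrac{1}{2} L(x_k)\right) \nabla^2 f(x_k)^{-1} \nabla f(x_k), \qquad L(x) = \nabla^2 f(x)^{-1}\, \nabla^3 f(x)\left[\nabla^2 f(x)^{-1} \nabla f(x)\right].$$

When $\nabla^2 f(x)$ is banded with half-bandwidth $b \ll n$, a factorization costs $O(nb^2)$ and each triangular solve $O(nb)$ instead of the dense $O(n^3)$ and $O(n^2)$, which is the kind of per-iteration saving the title refers to.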
Cites Work
- High-order Newton-penalty algorithms
- On the inverse of band matrices
- The complexity of partial derivatives
- Impact of partial separability on large-scale optimization
- Historical developments in convergence analysis for Newton's and Newton-like methods
- The Chebyshev-Shamanskii method for solving systems of nonlinear equations
- Implementation issues for high-order algorithms
- Symmetric decomposition of positive definite band matrices
- Sparsity in higher order methods for unconstrained optimization
- Efficient Computation of Sparse Hessians Using Coloring and Automatic Differentiation
- On the Halley class of methods for unconstrained optimization problems
- Inverses of Matrices $\{a_{ij}\}$ which Satisfy $a_{ij} = 0$ for $j > i+p$
- Evaluating Derivatives
- Computing sparse Hessians with automatic differentiation
- On large-scale unconstrained optimization problems and higher order methods
- Estimation of sparse Hessian matrices and graph coloring problems
- LAPACK Users' Guide
- On the Estimation of Sparse Hessian Matrices
- Testing Unconstrained Optimization Software
- Efficient computation of gradients and Jacobians by dynamic exploitation of sparsity in automatic differentiation
- Computing Gradients in Large-Scale Optimization Using Automatic Differentiation
- A Note on Position, Rotation and Scale Invariant Pattern Classification
- CUTE
- The Kantorovich Theorem for Newton's Method
- A new family of high-order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods
- A bibliography on semiseparable matrices
- An acceleration of Newton's method: Super-Halley method