On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
DOI: 10.1007/s11075-013-9767-y
zbMATH Open: 1298.49050
OpenAlex: W2037586972
MaRDI QID: Q403098
Authors: Bilel Kchouk, Jean-Pierre Dussault
Publication date: 29 August 2014
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-013-9767-y
Recommendations
- On the complexity of convergence for high order iterative methods
- On the efficient computation of sparsity patterns for Hessians
- Higher-order efficient class of Chebyshev-Halley type methods
- The Chebyshev accelerating method for progressive iterative approximation
- On the semilocal convergence of efficient Chebyshev-secant-type methods
- Composite convergence bounds based on Chebyshev polynomials and finite precision conjugate gradient computations
- Sparsity in higher order methods for unconstrained optimization
- When integration sparsification fails: banded Galerkin discretizations for Hermite functions, rational Chebyshev functions and sinh-mapped Fourier functions on an infinite domain, and Chebyshev methods for solutions with \(C^\infty\) endpoint singularities
- Chebyshev weighted norm least-squares spectral methods for the elliptic problem
Keywords: optimization; algorithm complexity; automatic differentiation; banded Hessians; high order Chebyshev methods; sparse functions
MSC classification: Analysis of algorithms and problem complexity (68Q25); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Abstract computational complexity for mathematical programming problems (90C60)
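The keywords point at the record's central theme: when the Hessian is banded, the linear algebra inside each Newton- or Chebyshev-type step can exploit the band structure. As a hypothetical illustration (not taken from the paper), a tridiagonal system, i.e. bandwidth 1, can be solved in O(n) operations by the Thomas algorithm instead of O(n^3) dense elimination, which is the kind of per-iteration saving such methods rely on:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system in O(n) via the Thomas algorithm.

    a: sub-diagonal (length n-1), b: diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n).
    Assumes the system is well-conditioned (e.g. diagonally dominant),
    as a banded positive definite Hessian would be.
    """
    n = len(b)
    cp = [0.0] * (n - 1)  # modified super-diagonal
    dp = [0.0] * n        # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: diagonally dominant tridiagonal system with known solution [1, 2, 3, 4].
x = thomas_solve([1.0, 1.0, 1.0], [4.0, 4.0, 4.0, 4.0],
                 [1.0, 1.0, 1.0], [6.0, 12.0, 18.0, 19.0])
# x is approximately [1.0, 2.0, 3.0, 4.0]
```

For general bandwidth m, band Cholesky factorization (as in the LAPACK banded routines cited below) gives an O(n m^2) solve, so the per-iteration cost stays linear in n for fixed bandwidth.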
Cites Work
- LAPACK Users' Guide
- Title not available
- Testing Unconstrained Optimization Software
- Computing sparse Hessians with automatic differentiation
- CUTE
- Title not available
- An unconstrained optimization test functions collection
- Evaluating Derivatives
- The complexity of partial derivatives
- Title not available
- An acceleration of Newton's method: Super-Halley method
- Efficient computation of sparse hessians using coloring and automatic differentiation
- Estimation of sparse hessian matrices and graph coloring problems
- Title not available
- On the inverse of band matrices
- Title not available
- Historical developments in convergence analysis for Newton's and Newton-like methods
- Title not available
- Title not available
- The Kantorovich Theorem for Newton's Method
- TAPENADE for C
- Impact of partial separability on large-scale optimization
- On the Estimation of Sparse Hessian Matrices
- Title not available
- The Chebyshev-Shamanskii method for solving systems of nonlinear equations
- Implementation issues for high-order algorithms
- Symmetric decomposition of positive definite band matrices
- Some bounds on the complexity of gradients, Jacobians, and Hessians
- Sparsity in higher order methods for unconstrained optimization
- On the Halley class of methods for unconstrained optimization problems
- Inverses of Matrices $\{a_{ij}\}$ which Satisfy $a_{ij} = 0$ for $j > i+p$.
- On large-scale unconstrained optimization problems and higher order methods
- Efficient computation of gradients and Jacobians by dynamic exploitation of sparsity in automatic differentiation
- Computing Gradients in Large-Scale Optimization Using Automatic Differentiation
- A Note on Position, Rotation and Scale Invariant Pattern Classification
- High-order Newton-penalty algorithms
- A new family of high-order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods
- Title not available
- A bibliography on semiseparable matrices
Cited In (2)
Uses Software