On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
From MaRDI portal
Recommendations
- On the complexity of convergence for high order iterative methods
- On the efficient computation of sparsity patterns for Hessians
- Higher-order efficient class of Chebyshev-Halley type methods
- The Chebyshev accelerating method for progressive iterative approximation
- On the semilocal convergence of efficient Chebyshev-secant-type methods
- scientific article; zbMATH DE number 279515
- Composite convergence bounds based on Chebyshev polynomials and finite precision conjugate gradient computations
- Sparsity in higher order methods for unconstrained optimization
- When integration sparsification fails: banded Galerkin discretizations for Hermite functions, rational Chebyshev functions and sinh-mapped Fourier functions on an infinite domain, and Chebyshev methods for solutions with \(C^\infty\) endpoint singularities
- Chebyshev weighted norm least-squares spectral methods for the elliptic problem
Cites work
- scientific article; zbMATH DE number 4141383
- scientific article; zbMATH DE number 3898623
- scientific article; zbMATH DE number 3476451
- scientific article; zbMATH DE number 1069181
- scientific article; zbMATH DE number 3454409
- scientific article; zbMATH DE number 3430031
- scientific article; zbMATH DE number 961607
- scientific article; zbMATH DE number 3198397
- scientific article; zbMATH DE number 3086336
- A Note on Position, Rotation and Scale Invariant Pattern Classification
- A bibliography on semiseparable matrices
- A new family of high-order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods
- An acceleration of Newton's method: Super-Halley method
- An unconstrained optimization test functions collection
- CUTE
- Computing Gradients in Large-Scale Optimization Using Automatic Differentiation
- Computing sparse Hessians with automatic differentiation
- Efficient computation of gradients and Jacobians by dynamic exploitation of sparsity in automatic differentiation
- Efficient computation of sparse Hessians using coloring and automatic differentiation
- Estimation of sparse Hessian matrices and graph coloring problems
- Evaluating Derivatives
- High-order Newton-penalty algorithms
- Historical developments in convergence analysis for Newton's and Newton-like methods
- Impact of partial separability on large-scale optimization
- Implementation issues for high-order algorithms
- Inverses of Matrices $\{a_{ij}\}$ which Satisfy $a_{ij} = 0$ for $j > i+p$.
- LAPACK Users' Guide
- On large-scale unconstrained optimization problems and higher order methods
- On the Estimation of Sparse Hessian Matrices
- On the Halley class of methods for unconstrained optimization problems
- On the inverse of band matrices
- Some bounds on the complexity of gradients, Jacobians, and Hessians
- Sparsity in higher order methods for unconstrained optimization
- Symmetric decomposition of positive definite band matrices
- TAPENADE for C
- Testing Unconstrained Optimization Software
- The Chebyshev-Shamanskii method for solving systems of nonlinear equations
- The Kantorovich Theorem for Newton's Method
- The complexity of partial derivatives
Cited in (2)
MaRDI item: Q403098