Higher Order Orthogonal Polynomials as Activation Functions in Artificial Neural Networks
DOI: 10.55630/SJC.2023.17.1-16
MaRDI QID: Q6136053
Authors:
Publication date: 28 August 2023
Published in: Serdica Journal of Computing
Recommendations
- scientific article; zbMATH DE number 1954154
- Properties and performance of orthogonal neural network in function approximation
- Orthogonal considerations in the design of neural networks for function approximation
- On function recovery by neural networks based on orthogonal expansions
- A functional equation with polynomial solutions and application to neural networks
- Hölder continuous activation functions in neural networks
- Simultaneous Approximations of Polynomials and Derivatives and Their Applications to Neural Networks
Keywords: artificial neural networks; activation function; Hermite orthogonal polynomials; Chebyshev orthogonal polynomials
MSC classifications: Computation of special functions and constants, construction of tables (65D20); Artificial neural networks and deep learning (68T07); Orthogonal polynomials and functions of hypergeometric type (Jacobi, Laguerre, Hermite, Askey scheme, etc.) (33C45)
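The keywords above name the paper's central idea: using higher-order orthogonal polynomials (Hermite, Chebyshev) in place of standard activation functions. As a minimal, hypothetical sketch of that idea rather than the paper's own implementation, an elementwise Chebyshev activation can be built from the three-term recurrence T_0(x) = 1, T_1(x) = x, T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x):

```python
import numpy as np

def chebyshev_activation(x, n=3):
    """Elementwise Chebyshev polynomial of the first kind, T_n(x),
    used as an activation function (illustrative sketch only).

    Inputs are clipped to [-1, 1], the interval on which the T_n are
    orthogonal with respect to the weight 1/sqrt(1 - x^2).
    """
    x = np.clip(x, -1.0, 1.0)
    t_prev, t_curr = np.ones_like(x), x          # T_0 and T_1
    for _ in range(n - 1):                        # recurrence up to T_n
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
    return t_curr if n >= 1 else t_prev
```

A Hermite variant would follow the same pattern with the recurrence H_k(x) = 2x H_{k-1}(x) - 2(k-1) H_{k-2}(x); the degree n plays the role of the "higher order" in the title.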