Parameter redundancy in neural networks: an application of Chebyshev polynomials
From MaRDI portal
Publication: 2468329
DOI: 10.1007/s10287-006-0009-9
zbMath: 1137.82017
OpenAlex: W1973639189
MaRDI QID: Q2468329
Publication date: 22 January 2008
Published in: Computational Management Science
Full work available at URL: https://doi.org/10.1007/s10287-006-0009-9
Keywords: neural network; Chebyshev polynomial; parameter redundancy; multilayer perceptron neural network (MLP NN) architecture
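The keywords center on Chebyshev polynomials of the first kind. As background for that keyword only (the paper's actual construction linking these polynomials to MLP parameter redundancy is not reproduced in this record), here is a minimal sketch of the standard three-term recurrence T_0(x) = 1, T_1(x) = x, T_{n+1}(x) = 2x·T_n(x) − T_{n−1}(x); the function name `chebyshev_t` is illustrative, not from the paper:

```python
def chebyshev_t(n: int, x: float) -> float:
    """Evaluate the Chebyshev polynomial T_n(x) of the first kind
    via the three-term recurrence (illustrative background only;
    not taken from the cited paper)."""
    if n == 0:
        return 1.0
    # Recurrence: T_{k+1}(x) = 2x * T_k(x) - T_{k-1}(x)
    t_prev, t_curr = 1.0, x
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
    return t_curr

# For example, T_2(x) = 2x^2 - 1, so T_2(0.5) = -0.5.
print(chebyshev_t(2, 0.5))  # -0.5
```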
Related Items (1)
Cites Work
- Some remarks on multivariate Chebyshev polynomials
- Testing for neglected nonlinearity in time series models. A comparison of neural network methods and alternative tests
- Multilayer feedforward networks are universal approximators
- Model selection in neural networks: some difficulties
- The Econometric Analysis of Seasonal Time Series
- Universal approximation bounds for superpositions of a sigmoidal function
- A Comparison of “Best” Polynomial Approximations with Truncated Chebyshev Series Expansions