Parameter redundancy in neural networks: an application of Chebyshev polynomials (Q2468329)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Parameter redundancy in neural networks: an application of Chebyshev polynomials | scientific article |
Statements
Parameter redundancy in neural networks: an application of Chebyshev polynomials (English)
22 January 2008
The paper considers multilayer perceptron (MLP) neural network architectures with a single hidden layer. The sigmoid activation function at each hidden node is approximated by Chebyshev polynomials, and the result is then substituted into the function modelling the network output. Identifiability and parameter-redundancy issues are discussed for some particular cases.
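The substitution step can be illustrated with a minimal NumPy sketch: fit a Chebyshev series to the sigmoid, then replace each hidden-node activation in a one-hidden-layer MLP output by that polynomial. The degree, approximation domain, and the three-unit network below are illustrative choices, not taken from the paper.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Chebyshev approximation of the sigmoid on an illustrative domain.
# Degree 15 and the domain [-4, 4] are arbitrary sketch parameters.
cheb = Chebyshev.interpolate(sigmoid, 15, domain=[-4.0, 4.0])

# One-hidden-layer MLP output: y(x) = sum_j v[j] * sigmoid(w[j]*x + b[j]).
# Substituting the polynomial turns each hidden unit into a polynomial
# in x, which is what makes parameter redundancy visible algebraically.
rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, size=3)  # hypothetical hidden-layer weights
b = rng.uniform(-1.0, 1.0, size=3)  # hypothetical biases
v = rng.uniform(-1.0, 1.0, size=3)  # hypothetical output weights

def mlp(x):
    return sum(v[j] * sigmoid(w[j] * x + b[j]) for j in range(3))

def mlp_cheb(x):
    return sum(v[j] * cheb(w[j] * x + b[j]) for j in range(3))

x = np.linspace(-1.0, 1.0, 201)
max_err = np.max(np.abs(mlp(x) - mlp_cheb(x)))
```

With the weights bounded as above, every hidden-node argument stays inside the fitting domain, so the polynomial network tracks the sigmoid network closely.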
neural network
multilayer perceptron neural network (MLP NN) architecture
Chebyshev polynomial
parameter redundancy