A note on the identifiability of the conditional expectation for the mixtures of neural networks
DOI: 10.1016/j.spl.2007.09.038
zbMATH Open: 1137.62368
OpenAlex: W2018373007
MaRDI QID: Q2483449
Authors: Jean-Pierre Stockis, Joseph Tadjuidje Kamgaing, Jürgen Franke
Publication date: 28 April 2008
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2007.09.038
Recommendations
- Mixture of experts architectures for neural networks as a special case of conditional expectation formula.
- Mixture Models Based on Neural Network Averaging
- Modeling nonlinearities with mixtures-of-experts of time series models
- Neural network identifiability for a family of sigmoidal nonlinearities
MSC Classification
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Neural nets and related approaches to inference from stochastic processes (62M45)
- Identification in stochastic control theory (93E12)
Cited In (6)
- Mixtures of nonparametric autoregressions
- Neural network identifiability for a family of sigmoidal nonlinearities
- Mixture Models Based on Neural Network Averaging
- On geometric ergodicity of CHARME models
- Autoregressive processes with data-driven regime switching
- Likelihood ratio of unidentifiable models and multilayer neural networks