Likelihood ratio of unidentifiable models and multilayer neural networks
Publication: Q1412367
DOI: 10.1214/aos/1056562464
zbMath: 1032.62020
OpenAlex: W2086236873
MaRDI QID: Q1412367
Publication date: 10 November 2003
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1056562464
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Point estimation (62F10)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (11)
- Relation between weight size and degree of over-fitting in neural network regression
- Parameter Identifiability in Statistical Machine Learning: A Review
- Difficulty of Singularity in Population Coding
- Asymptotics for regression models under loss of identifiability
- A goodness-of-fit test based on neural network sieve estimators
- Density estimation by the penalized combinatorial method
- Tutorial on brain-inspired computing. II: Multilayer perceptron and natural gradient learning
- Consistent estimation of the architecture of multilayer perceptrons
- Dynamics of Learning Near Singularities in Layered Networks
- Estimation and tests in finite mixture models of nonparametric densities
- Variational Bayes Solution of Linear Neural Networks and Its Generalization Performance
Cites Work
- Testing the order of a model using locally conic parametrization: Population mixtures and stationary ARMA processes
- Asymptotic distributions of likelihood ratios for overparametrized ARMA processes
- Testing in locally conic models, and application to mixture models
- The likelihood ratio test for the number of components in a mixture with Markov regime
- On the Assumptions Used to Prove Asymptotic Normality of Maximum Likelihood Estimates
- Note on the Consistency of the Maximum Likelihood Estimate