Hierarchical mixtures-of-experts for exponential family regression models: Approximation and maximum likelihood estimation
DOI: 10.1214/aos/1018031265 · zbMath: 0957.62032 · OpenAlex: W1581352608 · MaRDI QID: Q1568308
Wenxin Jiang, Martin A. Tanner
Publication date: 29 March 2001
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1018031265
Keywords: maximum likelihood estimation; Hellinger distance; mean square error; exponential family; Kullback-Leibler divergence; approximation rate; hierarchical mixtures-of-experts
MSC: Nonparametric regression and quantile regression (62G08); Generalized linear models (logistic models) (62J12); Rate of convergence, degree of approximation (41A25)
Related Items (35)
Cites Work
- Density estimation through convex combinations of densities: Approximation and estimation bounds
- Hierarchical mixtures-of-experts for exponential family regression models: Approximation and maximum likelihood estimation
- Bayesian Inference in Mixtures-of-Experts and Hierarchical Mixtures-of-Experts Models With an Application to Speech Recognition
- Error bounds for functional approximation and estimation using mixtures of experts
- Asymptotic Properties of Non-Linear Least Squares Estimators