Entropic risk minimization for nonparametric estimation of mixing distributions
From MaRDI portal
Publication:2347712
DOI: 10.1007/s10994-014-5467-7 · zbMath: 1320.62017 · OpenAlex: W2103594587 · Wikidata: Q59404347 · Scholia: Q59404347 · MaRDI QID: Q2347712
Publication date: 5 June 2015
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-014-5467-7
Mathematics Subject Classification:
- 62H30 — Classification and discrimination; cluster analysis (statistical aspects)
- 62G05 — Nonparametric estimation
- 62B10 — Statistical aspects of information-theoretic topics
Cites Work
- Projective power entropy and maximum Tsallis entropy distributions
- Entropy and divergence associated with power function and the statistical application
- Robust parameter estimation with a small bias against heavy contamination
- The geometry of mixture likelihoods: A general theory
- Introduction to Nonextensive Statistical Mechanics
- Robust and efficient estimation by minimising a density power divergence
- Guessing subject to distortion
- Information Geometry of U-Boost and Bregman Divergence
- A mapping approach to rate-distortion computation and analysis
- Bayesian Reasoning and Machine Learning