Integration of Stochastic Models by Minimizing α-Divergence
Publication:5441322
DOI: 10.1162/neco.2007.19.10.2780
zbMath: 1138.62002
Wikidata: Q80836076
Scholia: Q80836076
MaRDI QID: Q5441322
Publication date: 11 February 2008
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.10.2780
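The paper's central construction, α-integration, combines several stochastic models into one by taking their weighted α-mean, which minimizes the weighted average α-divergence to the component models. A minimal sketch of the α-mean, assuming NumPy and an illustrative function name `alpha_mean` (not from the paper), using the representation \(f_\alpha(u) = u^{(1-\alpha)/2}\) for \(\alpha \neq 1\) and \(f_1(u) = \log u\):

```python
import numpy as np

def alpha_mean(ps, weights, alpha):
    """Weighted alpha-mean (alpha-integration) of nonnegative arrays.

    Computes f_a^{-1}( sum_i w_i f_a(p_i) ) with f_a(u) = u**((1-a)/2)
    for a != 1 and f_1(u) = log(u). Special cases: a = -1 gives the
    arithmetic mean, a = 1 the geometric mean, a = 3 the harmonic mean.
    Note: the alpha-mean of normalized densities is in general not
    normalized; renormalize afterwards if a distribution is required.
    """
    ps = np.asarray(ps, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize the weights
    if alpha == 1:
        # geometric mean: exp of the weighted mean of logs
        return np.exp(np.tensordot(w, np.log(ps), axes=1))
    k = (1.0 - alpha) / 2.0
    return np.tensordot(w, ps ** k, axes=1) ** (1.0 / k)
```

For example, `alpha_mean([[0.2, 0.8], [0.6, 0.4]], [0.5, 0.5], -1)` returns the elementwise arithmetic mean `[0.4, 0.6]`, while `alpha=3` yields the harmonic mean.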
Classifications (MSC)
- 68T05: Learning and adaptive systems in artificial intelligence
- 91E40: Memory and learning in psychology
- 62B10: Statistical aspects of information-theoretic topics
Related Items
- CONFORMAL GEOMETRY OF ESCORT PROBABILITY AND ITS APPLICATIONS
- Multiclass Alpha Integration of Scores from Multiple Classifiers
- Parameter Estimation for α-GMM Based on Maximum Likelihood Criterion
- Parameter Learning for Alpha Integration
- Fusion of Scores in a Detection Context Based on Alpha Integration
- A Note on Divergences
- Non-linear canonical correlation analysis using alpha-beta divergence
- The information geometry of Bregman divergences and some applications in multi-expert reasoning
- On clustering histograms with \(k\)-means by using mixed \(\alpha\)-divergences
- A risk profile for information fusion algorithms
- Saddlepoint condition on a predictor to reconfirm the need for the assumption of a prior distribution
- Unsupervised weight parameter estimation method for ensemble learning
- Quantum Hellinger distances revisited
- An Estimation of Generalized Bradley-Terry Models Based on the em Algorithm
Cites Work
- Second order efficiency of minimum contrast estimators in a curved exponential family
- I-divergence geometry of probability distributions and minimization problems
- Possible generalization of Boltzmann-Gibbs statistics.
- Illumination-invariance of plateau's midgray
- Training Products of Experts by Minimizing Contrastive Divergence
- On asymptotic properties of predictive distributions
- On the local geometry of mixture models
- The α-EM algorithm: surrogate likelihood maximization using α-logarithmic information measures
- Stochastic Reasoning, Free Energy, and Information Geometry
- Divergence Function, Duality, and Convex Analysis
- Information Geometry of U-Boost and Bregman Divergence
- A Generalized Bayes Rule for Prediction
- Means of Positive Numbers and Matrices
- GENERALIZATION OF THE MEAN-FIELD METHOD FOR POWER-LAW DISTRIBUTIONS
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations