Integration of Stochastic Models by Minimizing α-Divergence
DOI: 10.1162/neco.2007.19.10.2780
zbMATH: 1138.62002
OpenAlex: W2030524974
Wikidata: Q80836076
Scholia: Q80836076
MaRDI QID: Q5441322
Publication date: 11 February 2008
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.10.2780
Mathematics Subject Classification (MSC):
- Learning and adaptive systems in artificial intelligence (68T05)
- Memory and learning in psychology (91E40)
- Statistical aspects of information-theoretic topics (62B10)
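As background, the α-divergence minimized in this publication is commonly written in Amari's parametrization as follows; note that sign and parameter conventions for α vary across the literature, so the exact form used should be checked against the paper itself:

$$
D_\alpha(p \,\|\, q) = \frac{4}{1-\alpha^2}\left(1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx\right), \qquad \alpha \neq \pm 1,
$$

with the Kullback-Leibler divergences $\mathrm{KL}(p\,\|\,q)$ and $\mathrm{KL}(q\,\|\,p)$ recovered in the limits $\alpha \to -1$ and $\alpha \to 1$, respectively.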
Related Items (16)
- Non-linear canonical correlation analysis using alpha-beta divergence
- The information geometry of Bregman divergences and some applications in multi-expert reasoning
- On clustering histograms with \(k\)-means by using mixed \(\alpha\)-divergences
- Parameter Estimation for α-GMM Based on Maximum Likelihood Criterion
- A risk profile for information fusion algorithms
- Quantum Hellinger distances revisited
- Saddlepoint condition on a predictor to reconfirm the need for the assumption of a prior distribution
- Quasi-arithmetic centers, quasi-arithmetic mixtures, and the Jensen-Shannon \(\nabla\)-divergences
- Parameter Learning for Alpha Integration
- Fusion of Scores in a Detection Context Based on Alpha Integration
- A Note on Divergences
- Unsupervised weight parameter estimation method for ensemble learning
- Multiclass Alpha Integration of Scores from Multiple Classifiers
- An Estimation of Generalized Bradley-Terry Models Based on the em Algorithm
- CONFORMAL GEOMETRY OF ESCORT PROBABILITY AND ITS APPLICATIONS
- A Unified Framework for Bayesian and Non-Bayesian Decision Making and Inference
Cites Work
- Second order efficiency of minimum contrast estimators in a curved exponential family
- I-divergence geometry of probability distributions and minimization problems
- Possible generalization of Boltzmann-Gibbs statistics
- Illumination-invariance of Plateau's midgray
- Training Products of Experts by Minimizing Contrastive Divergence
- On asymptotic properties of predictive distributions
- On the local geometry of mixture models
- The α-EM algorithm: surrogate likelihood maximization using α-logarithmic information measures
- Stochastic Reasoning, Free Energy, and Information Geometry
- Divergence Function, Duality, and Convex Analysis
- Information Geometry of U-Boost and Bregman Divergence
- A Generalized Bayes Rule for Prediction
- Means of Positive Numbers and Matrices
- GENERALIZATION OF THE MEAN-FIELD METHOD FOR POWER-LAW DISTRIBUTIONS
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations