Information Geometry of U-Boost and Bregman Divergence
From MaRDI portal
Publication: 4832497
DOI: 10.1162/089976604323057452 · zbMath: 1102.68489 · OpenAlex: W2106579555 · Wikidata: Q51670214 · Scholia: Q51670214 · MaRDI QID: Q4832497
Takashi Takenouchi, Noboru Murata, Takafumi Kanamori, Shinto Eguchi
Publication date: 4 January 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976604323057452
Computational learning theory (68Q32) · Learning and adaptive systems in artificial intelligence (68T05)
Related Items (41)
- Dually flat geometries of the deformed exponential family
- Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds
- Binary classification with a pseudo exponential model and its application for multi-task learning
- Deformed algebras and generalizations of independence on deformed exponential families
- Statistical analysis of distance estimators with density differences and density ratios
- Duality of maximum entropy and minimum divergence
- Interpreting Kullback--Leibler divergence with the Neyman-Pearson Lemma
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- A modified EM algorithm for mixture models based on Bregman divergence
- The geometry of proper scoring rules
- Graph-based composite local Bregman divergences on discrete sample spaces
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- A Multiclass Classification Method Based on Decoding of Binary Classifiers
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions
- Geometry of EM and related iterative algorithms
- Conformal mirror descent with logarithmic divergences
- Density estimation with minimization of \(U\)-divergence
- Normalized estimating equation for robust parameter estimation
- Re-examination of Bregman functions and new properties of their divergences
- A boosting method for maximization of the area under the ROC curve
- A Novel Parameter Estimation Method for Boltzmann Machines
- A Note on Divergences
- Boosting Method for Local Learning in Statistical Pattern Recognition
- Entropy and divergence associated with power function and the statistical application
- Affine invariant divergences associated with proper composite scoring rules and their applications
- A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties
- An Estimation of Generalized Bradley-Terry Models Based on the em Algorithm
- Robust parameter estimation with a small bias against heavy contamination
- A Bregman extension of quasi-Newton updates I: an information geometrical framework
- Robust Loss Functions for Boosting
- Integration of Stochastic Models by Minimizing α-Divergence
- Deformation of log-likelihood loss function for multiclass boosting
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- Improved estimators of Bregman divergence for model selection in small samples
- Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
- Information geometry approach to parameter estimation in hidden Markov model
- An Extension of the Receiver Operating Characteristic Curve and AUC-Optimal Classification
- Entropic risk minimization for nonparametric estimation of mixing distributions
- Hessian structures on deformed exponential families and their conformal structures
- Projections with logarithmic divergences
Cites Work
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Robust Blind Source Separation by Beta Divergence
- Universal approximation bounds for superpositions of a sigmoidal function
- Robustifying AdaBoost by Adding the Naive Error Rate