Information Geometry of U-Boost and Bregman Divergence
DOI: 10.1162/089976604323057452 · zbMATH Open: 1102.68489 · DBLP: journals/neco/MurataTKE04 · OpenAlex: W2106579555 · Wikidata: Q51670214 · Scholia: Q51670214 · MaRDI QID: Q4832497 · FDO: Q4832497
Authors: Noboru Murata, Takashi Takenouchi, Takafumi Kanamori, Shinto Eguchi
Publication date: 4 January 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976604323057452
Recommendations
- Information geometry and statistical pattern recognition
- Logistic regression, AdaBoost and Bregman distances
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Boosting. Foundations and algorithms.
- A Bregman extension of quasi-Newton updates I: An information geometrical framework
MSC Classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Computational learning theory (68Q32)
Cites Work
- Universal approximation bounds for superpositions of a sigmoidal function
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Robust Blind Source Separation by Beta Divergence
- Robustifying AdaBoost by Adding the Naive Error Rate
Cited In (48)
- Variational representations of annealing paths: Bregman information under monotonic embedding
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- A Bregman extension of quasi-Newton updates I: An information geometrical framework
- A Novel Parameter Estimation Method for Boltzmann Machines
- Conformal mirror descent with logarithmic divergences
- Geometry of EM and related iterative algorithms
- Information Divergence Geometry and the Application to Statistical Machine Learning
- Kernel density estimation by stagewise algorithm with a simple dictionary
- Density estimation with minimization of U-divergence
- Some universal insights on divergences for statistics, machine learning and artificial intelligence
- The geometry of proper scoring rules
- Projections with logarithmic divergences
- Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions
- Entropic risk minimization for nonparametric estimation of mixing distributions
- Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds
- Statistical learning for species distribution models in ecological studies
- A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties
- Robust parameter estimation with a small bias against heavy contamination
- Integration of Stochastic Models by Minimizing α-Divergence
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Normalized estimating equation for robust parameter estimation
- Binary classification with a pseudo exponential model and its application for multi-task learning
- Deformed algebras and generalizations of independence on deformed exponential families
- Duality of maximum entropy and minimum divergence
- Statistical analysis of distance estimators with density differences and density ratios
- Minimum Divergence Methods in Statistical Machine Learning
- Information geometry approach to parameter estimation in hidden Markov model
- Entropy and divergence associated with power function and the statistical application
- A Note on Divergences
- A modified EM algorithm for mixture models based on Bregman divergence
- Re-examination of Bregman functions and new properties of their divergences
- Information geometry and statistical pattern recognition
- Affine invariant divergences associated with proper composite scoring rules and their applications
- Boosting Method for Local Learning in Statistical Pattern Recognition
- Dually flat geometries of the deformed exponential family
- A new integrated discrimination improvement index via odds
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- Graph-based composite local Bregman divergences on discrete sample spaces
- An extension of the receiver operating characteristic curve and AUC-optimal classification
- Robust Loss Functions for Boosting
- Deformation of log-likelihood loss function for multiclass boosting
- Hessian structures on deformed exponential families and their conformal structures
- A Multiclass Classification Method Based on Decoding of Binary Classifiers
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Interpreting Kullback-Leibler divergence with the Neyman-Pearson Lemma
- An estimation of generalized Bradley-Terry models based on the EM algorithm
- Improved estimators of Bregman divergence for model selection in small samples
- A boosting method for maximization of the area under the ROC curve