Information Geometry of U-Boost and Bregman Divergence
From MaRDI portal
Publication:4832497
Recommendations
- Information geometry and statistical pattern recognition
- Logistic regression, AdaBoost and Bregman distances
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Boosting. Foundations and algorithms.
- A Bregman extension of quasi-Newton updates I: An information geometrical framework
Cites work
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Robust Blind Source Separation by Beta Divergence
- Robustifying AdaBoost by Adding the Naive Error Rate
- Universal approximation bounds for superpositions of a sigmoidal function
Cited in (48)
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Variational representations of annealing paths: Bregman information under monotonic embedding
- A Bregman extension of quasi-Newton updates I: An information geometrical framework
- A Novel Parameter Estimation Method for Boltzmann Machines
- Conformal mirror descent with logarithmic divergences
- Geometry of EM and related iterative algorithms
- Density estimation with minimization of U-divergence
- Information Divergence Geometry and the Application to Statistical Machine Learning
- Kernel density estimation by stagewise algorithm with a simple dictionary
- Some universal insights on divergences for statistics, machine learning and artificial intelligence
- The geometry of proper scoring rules
- Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions
- Projections with logarithmic divergences
- Entropic risk minimization for nonparametric estimation of mixing distributions
- Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds
- Robust parameter estimation with a small bias against heavy contamination
- A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties
- Statistical learning for species distribution models in ecological studies
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Integration of Stochastic Models by Minimizing α-Divergence
- Normalized estimating equation for robust parameter estimation
- Binary classification with a pseudo exponential model and its application for multi-task learning
- Deformed algebras and generalizations of independence on deformed exponential families
- Duality of maximum entropy and minimum divergence
- Statistical analysis of distance estimators with density differences and density ratios
- Information geometry approach to parameter estimation in hidden Markov model
- Minimum Divergence Methods in Statistical Machine Learning
- Entropy and divergence associated with power function and the statistical application
- A modified EM algorithm for mixture models based on Bregman divergence
- A Note on Divergences
- Re-examination of Bregman functions and new properties of their divergences
- Information geometry and statistical pattern recognition
- Affine invariant divergences associated with proper composite scoring rules and their applications
- Dually flat geometries of the deformed exponential family
- Boosting Method for Local Learning in Statistical Pattern Recognition
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- A new integrated discrimination improvement index via odds
- Graph-based composite local Bregman divergences on discrete sample spaces
- An extension of the receiver operating characteristic curve and AUC-optimal classification
- Deformation of log-likelihood loss function for multiclass boosting
- Robust Loss Functions for Boosting
- Hessian structures on deformed exponential families and their conformal structures
- A Multiclass Classification Method Based on Decoding of Binary Classifiers
- Interpreting Kullback-Leibler divergence with the Neyman-Pearson Lemma
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- An estimation of generalized Bradley-Terry models based on the EM algorithm
- Improved estimators of Bregman divergence for model selection in small samples
- A boosting method for maximization of the area under the ROC curve