Information Geometry of U-Boost and Bregman Divergence

From MaRDI portal
Publication:4832497

DOI: 10.1162/089976604323057452
zbMath: 1102.68489
OpenAlex: W2106579555
Wikidata: Q51670214
Scholia: Q51670214
MaRDI QID: Q4832497

Takashi Takenouchi, Noboru Murata, Takafumi Kanamori, Shinto Eguchi

Publication date: 4 January 2005

Published in: Neural Computation

Full work available at URL: https://doi.org/10.1162/089976604323057452




Related Items (41)

Dually flat geometries of the deformed exponential family
Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds
Binary classification with a pseudo exponential model and its application for multi-task learning
Deformed algebras and generalizations of independence on deformed exponential families
Statistical analysis of distance estimators with density differences and density ratios
Duality of maximum entropy and minimum divergence
Interpreting Kullback--Leibler divergence with the Neyman-Pearson Lemma
Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
A modified EM algorithm for mixture models based on Bregman divergence
The geometry of proper scoring rules
Graph-based composite local Bregman divergences on discrete sample spaces
Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
A Multiclass Classification Method Based on Decoding of Binary Classifiers
Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions
Geometry of EM and related iterative algorithms
Conformal mirror descent with logarithmic divergences
Density estimation with minimization of \(U\)-divergence
Normalized estimating equation for robust parameter estimation
Re-examination of Bregman functions and new properties of their divergences
A boosting method for maximization of the area under the ROC curve
A Novel Parameter Estimation Method for Boltzmann Machines
A Note on Divergences
Boosting Method for Local Learning in Statistical Pattern Recognition
Entropy and divergence associated with power function and the statistical application
Affine invariant divergences associated with proper composite scoring rules and their applications
A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties
An Estimation of Generalized Bradley-Terry Models Based on the em Algorithm
Robust parameter estimation with a small bias against heavy contamination
A Bregman extension of quasi-Newton updates I: an information geometrical framework
Robust Loss Functions for Boosting
Integration of Stochastic Models by Minimizing α-Divergence
Deformation of log-likelihood loss function for multiclass boosting
A boosting method with asymmetric mislabeling probabilities which depend on covariates
Improved estimators of Bregman divergence for model selection in small samples
Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
Information geometry approach to parameter estimation in hidden Markov model
An Extension of the Receiver Operating Characteristic Curve and AUC-Optimal Classification
Entropic risk minimization for nonparametric estimation of mixing distributions
Hessian structures on deformed exponential families and their conformal structures
Projections with logarithmic divergences





This page was built for publication: Information Geometry of U-Boost and Bregman Divergence