Information, Divergence and Risk for Binary Experiments
Publication: 5396625
zbMath: 1280.68192 · arXiv: 0901.0356 · MaRDI QID: Q5396625
Mark D. Reid, Robert C. Williamson
Publication date: 3 February 2014
Full work available at URL: https://arxiv.org/abs/0901.0356
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Surrogate regret bounds for generalized classification performance metrics
- Unnamed Item
- The Bregman Proximal Average
- Calibrating sufficiently
- A Proximal Average for Prox-Bounded Functions
- A historical perspective on Schützenberger-Pinsker inequalities
- Calibrated asymmetric surrogate losses
- Minimax optimal sequential hypothesis tests for Markov processes
- Divergence for \(s\)-concave and log concave functions
- Support vector machines based on convex risk functions and general norms
- Multiclass classification, information, divergence and surrogate risk
- Mixed f-divergence and inequalities for log-concave functions
- Unnamed Item
- A Steiner formula in the \(L_p\) Brunn-Minkowski theory
- Forecaster's dilemma: extreme events and forecast evaluation
- Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
- Generalized twin Gaussian processes using Sharma-Mittal divergence
- A Spectral Estimation Framework for Phase Retrieval via Bregman Divergence Minimization