One-pass AUC optimization
Publication: 286076
DOI: 10.1016/J.ARTINT.2016.03.003
zbMATH Open: 1357.68168
arXiv: 1305.1363
OpenAlex: W2299684943
MaRDI QID: Q286076
Authors: Wei Gao, Lu Wang, Rong Jin, Shenghuo Zhu, Zhi-Hua Zhou
Publication date: 19 May 2016
Published in: Artificial Intelligence
Abstract: AUC is an important performance measure, and many algorithms have been devoted to AUC optimization, mostly by minimizing a surrogate convex loss on a training data set. In this work, we focus on one-pass AUC optimization, which requires going through the training data only once without storing the entire training data set; conventional online learning algorithms cannot be applied directly because AUC is measured by a sum of losses defined over pairs of instances from different classes. We develop a regression-based algorithm which only needs to maintain the first- and second-order statistics of the training data in memory, resulting in a storage requirement independent of the size of the training data. To efficiently handle high-dimensional data, we develop a randomized algorithm that approximates the covariance matrices by low-rank matrices. We verify, both theoretically and empirically, the effectiveness of the proposed algorithm.
Full work available at URL: https://arxiv.org/abs/1305.1363
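As a rough illustration of the idea described in the abstract, the sketch below processes one instance at a time and keeps only per-class counts, means, and second moments, so memory does not grow with the number of instances. It assumes a squared pairwise surrogate loss (1 - w^T(x_pos - x_neg))^2 with L2 regularization; the class name OnePassAUCSketch and the hyperparameters eta and lam are illustrative choices, not taken from the paper.

```python
import numpy as np

class OnePassAUCSketch:
    """Illustrative sketch of one-pass AUC optimization with a squared
    pairwise surrogate loss.  Only per-class counts, means, and second
    moments are stored, so memory is independent of the data size.
    Names and default values are assumptions, not from the paper."""

    def __init__(self, dim, eta=0.01, lam=0.001):
        self.w = np.zeros(dim)                  # linear scoring function w^T x
        self.eta, self.lam = eta, lam           # step size, L2 regularization
        self.n = {+1: 0, -1: 0}                 # per-class instance counts
        self.mean = {+1: np.zeros(dim), -1: np.zeros(dim)}
        self.moment = {+1: np.zeros((dim, dim)),   # running E[x x^T] per class
                       -1: np.zeros((dim, dim))}

    def partial_fit(self, x, y):
        """Process one instance (x, y) with y in {+1, -1}.

        The average squared pairwise loss of x against all previously seen
        opposite-class instances depends on them only through their mean c
        and covariance S = E[x'x'^T] - c c^T; its gradient with respect to w is
            lam * w - y * (x - c) + ((x - c)(x - c)^T + S) @ w
        """
        # Update the running statistics of class y.
        self.n[y] += 1
        self.mean[y] += (x - self.mean[y]) / self.n[y]
        self.moment[y] += (np.outer(x, x) - self.moment[y]) / self.n[y]

        # Gradient step against the statistics of the opposite class.
        if self.n[-y] > 0:
            c = self.mean[-y]
            S = self.moment[-y] - np.outer(c, c)
            d = x - c
            grad = self.lam * self.w - y * d + (np.outer(d, d) + S) @ self.w
            self.w -= self.eta * grad

    def score(self, X):
        return X @ self.w
```

This sketch keeps full d-by-d matrices; the randomized low-rank approximation of the covariance matrices mentioned in the abstract for high-dimensional data is not reproduced here.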
Recommendations
- An online AUC formulation for binary classification
- Dual coordinate descent methods for solving AUC optimization problem
- Stochastic AUC optimization with general loss
- Maximization of AUC and buffered AUC in binary classification
- Support vector algorithms for optimizing the partial area under the ROC curve
Cites Work
- Pegasos: primal estimated sub-gradient solver for SVM
- Nonparametric and semiparametric estimation of the receiver operating characteristic curve
- Title not available
- Prediction, Learning, and Games
- Probability Inequalities for Sums of Bounded Random Variables
- Title not available
- Measuring classifier performance: a coherent alternative to the area under the ROC curve
- Title not available
- Robust classification for imprecise environments
- Ranking and empirical minimization of \(U\)-statistics
- Weighted sums of certain dependent random variables
- Generalization bounds for ranking algorithms via algorithmic stability
- Margin-based ranking and an equivalence between AdaBoost and RankBoost
- Generalization bounds for the area under the ROC curve
- 10.1162/1532443041827916
- Logarithmic Regret Algorithms for Online Convex Optimization
- Learning Theory
Cited In (8)
- Learning with mitigating random consistency from the accuracy measure
- An online AUC formulation for binary classification
- Semi-supervised AUC optimization based on positive-unlabeled learning
- Optimizing area under the ROC curve using semi-supervised learning
- Stability and optimization error of stochastic gradient descent for pairwise learning
- Stochastic AUC optimization with general loss
- Title not available
- Approximate reduction from AUC maximization to 1-norm soft margin optimization
Uses Software
This page was built for publication: One-pass AUC optimization