Generalization bounds for metric and similarity learning
Publication: 255367
DOI: 10.1007/S10994-015-5499-7 · zbMATH Open: 1345.68250 · arXiv: 1207.5437 · OpenAlex: W1490654344 · MaRDI QID: Q255367 · FDO: Q255367
Qiong Cao, Zheng-Chu Guo, Yiming Ying
Publication date: 9 March 2016
Published in: Machine Learning
Abstract: Recently, metric learning and similarity learning have attracted a large amount of interest. Many models and optimisation algorithms have been proposed. However, there is relatively little work on the generalization analysis of such methods. In this paper, we derive novel generalization bounds for metric and similarity learning. In particular, we first show that the generalization analysis reduces to the estimation of the Rademacher average over "sums-of-i.i.d." sample-blocks related to the specific matrix norm. Then, we derive generalization bounds for metric/similarity learning with different matrix-norm regularisers by estimating their specific Rademacher complexities. Our analysis indicates that sparse metric/similarity learning with \(L^1\)-norm regularisation could lead to significantly better bounds than those with Frobenius-norm regularisation. Our novel generalization analysis develops and refines the techniques of U-statistics and Rademacher complexity analysis.
Full work available at URL: https://arxiv.org/abs/1207.5437
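
For orientation, here is a minimal sketch of the pairwise formulation the abstract alludes to; the hinge loss, bias term \(b\), and sign function \(\tau\) below are standard assumptions in this literature rather than details stated on this page. With a Mahalanobis-type metric \(d_M(x,x') = (x-x')^\top M (x-x')\), regularised metric learning minimises

\[
\frac{1}{n(n-1)} \sum_{i \neq j} \bigl(1 + \tau(y_i, y_j)\,(d_M(x_i, x_j) - b)\bigr)_+ \; + \; \lambda \|M\|,
\]

where \(\tau(y,y') = 1\) if \(y = y'\) and \(-1\) otherwise, and \(\|\cdot\|\) is the chosen matrix norm (Frobenius, \(L^1\), trace, or a mixed norm). The empirical term is a second-order U-statistic, so Hoeffding's representation rewrites it as an average over permutations \(\pi\) of \(\{1,\dots,n\}\) of sums of \(\lfloor n/2 \rfloor\) i.i.d. terms:

\[
\frac{1}{n(n-1)} \sum_{i \neq j} q(z_i, z_j) = \frac{1}{n!} \sum_{\pi} \frac{1}{\lfloor n/2 \rfloor} \sum_{i=1}^{\lfloor n/2 \rfloor} q\bigl(z_{\pi(i)}, z_{\pi(i + \lfloor n/2 \rfloor)}\bigr).
\]

This is the "sums-of-i.i.d." sample-block reduction the abstract mentions: since a supremum of an average is at most the average of the suprema, the Rademacher average of the function class need only be estimated over a single block of \(\lfloor n/2 \rfloor\) independent pairs, with the matrix norm entering through the resulting block Rademacher complexity.
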
Cites Work
- 10.1162/153244302760200704
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Ranking and empirical minimization of \(U\)-statistics
- Large scale online learning of image similarity through ranking
- Rademacher Chaos Complexities for Learning the Kernel Problem
- 10.1162/153244303321897690
Cited In (22)
- Online pairwise learning algorithms with convex loss functions
- Online regularized learning with pairwise loss functions
- Uniform consistency and uniform in number of neighbors consistency for nonparametric regression estimates and conditional \(U\)-statistics involving functional data
- Self-supervised Metric Learning in Multi-View Data: A Downstream Task Perspective
- Generalization ability of online pairwise support vector machine
- Asymptotic properties of conditional U-statistics using delta sequences
- On the variable bandwidth kernel estimation of conditional \(U\)-statistics at optimal rates in sup-norm
- Convergence analysis of distributed multi-penalty regularized pairwise learning
- Structural, Syntactic, and Statistical Pattern Recognition
- Online regularized pairwise learning with least squares loss
- Convergence of online pairwise regression learning with quadratic loss
- Online Pairwise Learning Algorithms
- Error analysis of kernel regularized pairwise learning with a strongly convex loss
- A fast diagonal distance metric learning approach for large-scale datasets
- Pairwise learning problems with regularization networks and Nyström subsampling approach
- Weak convergence of the conditional U-statistics for locally stationary functional time series
- Online regularized pairwise learning with non-i.i.d. observations
- A formalization of the natural gradient method for general similarity measures
- Guaranteed Classification via Regularized Similarity Learning
- A metric entropy bound is not sufficient for learnability
- Fast generalization rates for distance metric learning. Improved theoretical analysis for smooth strongly convex distance metric learning
- Limit theorems for a class of processes generalizing the U-empirical process