Extending the relevant component analysis algorithm for metric learning using both positive and negative equivalence constraints
Publication: 2369641
DOI: 10.1016/J.PATCOG.2005.12.004
zbMATH Open: 1158.68483
OpenAlex: W2142632277
MaRDI QID: Q2369641
FDO: Q2369641
Authors: Dit-Yan Yeung, Hong Chang
Publication date: 22 May 2006
Published in: Pattern Recognition
Full work available at URL: http://repository.ust.hk/ir/bitstream/1783.1-2482/1/yeung.pr2006a.pdf
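The publication extends relevant component analysis (RCA) so that negative as well as positive equivalence constraints inform the learned metric. The sketch below shows only the standard RCA step that the paper builds on: estimating a Mahalanobis metric from positive constraints grouped into chunklets. It does not reproduce the paper's extension to negative constraints, and the function and variable names are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of standard RCA from positive equivalence constraints (chunklets).
# The paper's extension additionally incorporates negative constraints; that step
# is not reproduced here. Names are illustrative only.
import numpy as np

def rca_transform(chunklets):
    """Compute the RCA whitening transform from a list of chunklets.

    Each chunklet is an (n_i, d) array of points known to share a label
    (a positive equivalence constraint). Returns W such that the learned
    Mahalanobis distance equals the Euclidean distance between W @ x vectors.
    """
    d = chunklets[0].shape[1]
    n_total = sum(len(c) for c in chunklets)
    # Average within-chunklet covariance: scatter of each point about its chunklet mean.
    C = np.zeros((d, d))
    for c in chunklets:
        centered = c - c.mean(axis=0)
        C += centered.T @ centered
    C /= n_total
    # Whitening transform W = C^{-1/2} via eigendecomposition (C is symmetric PSD).
    eigvals, eigvecs = np.linalg.eigh(C)
    eigvals = np.clip(eigvals, 1e-12, None)  # guard against a singular covariance
    return eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T

# Usage: two small chunklets in 2-D, then project points with the learned transform.
rng = np.random.default_rng(0)
chunklets = [rng.normal(size=(5, 2)), rng.normal(loc=3.0, size=(4, 2))]
W = rca_transform(chunklets)
projected = chunklets[0] @ W.T
```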
Recommendations
- scientific article; zbMATH DE number 7559245
- Positive semidefinite metric learning using boosting-like algorithms
- scientific article; zbMATH DE number 5957283
- Semi-supervised distance metric learning in high-dimensional spaces by using equivalence constraints
- Constrained Metric Learning by Permutation Inducing Isometries
- Non-linear metric learning using pairwise similarity and dissimilarity constraints and the geometrical structure of data
Cites Work
Cited In (9)
- Title not available
- A Novel Semi-supervised Clustering Algorithm for Finding Clusters of Arbitrary Shapes
- Ordinal margin metric learning and its extension for cross-distribution image data
- Supervised principal component analysis: visualization, classification and regression on subspaces and submanifolds
- Semi-supervised clustering with metric learning: an adaptive kernel method
- Non-linear metric learning using pairwise similarity and dissimilarity constraints and the geometrical structure of data
- Distance metric learning guided adaptive subspace semi-supervised clustering
- Learning a Mahalanobis distance metric for data clustering and classification
- Supervised distance metric learning through maximization of the Jeffrey divergence