Generalization bounds for metric and similarity learning (Q255367)

From MaRDI portal
scientific article

Language: English
Label: Generalization bounds for metric and similarity learning
Description: scientific article

    Statements

    Generalization bounds for metric and similarity learning (English)
    9 March 2016
    Metric and similarity learning aims to learn a distance metric or similarity function suited to the problem at hand; such learned metrics and similarities underpin many machine learning algorithms. This paper presents a thorough and comprehensive generalization analysis for metric and similarity learning, showing how a distance metric or similarity function that minimizes the regularized empirical error behaves when used for prediction. The authors introduce a novel Rademacher complexity for metric learning and show how to estimate it under different matrix-norm regularization schemes. The resulting generalization error bounds indicate that sparse \(L^1\)-norm regularization is superior to Frobenius-norm regularization for high-dimensional data, consistent with a phenomenon often observed in practice.
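    To make the setting concrete, the following is a minimal sketch of the regularized empirical risk formulation and the shape of bound described above; the hinge loss, the metric parametrization \(d_M\), the bias \(b\), the pair label \(\tau\), and the constants \(c_1, c_2\) are standard choices assumed here rather than details taken from this item.

    Given a sample \(z = \{(x_i, y_i)\}_{i=1}^n\), a metric
    \(d_M(x, x') = (x - x')^\top M (x - x')\) with \(M \succeq 0\), and a matrix norm
    \(\|\cdot\|\) (e.g. Frobenius or \(L^1\)), the regularized empirical error is
    \[
      \mathcal{E}_z(M, b) \;=\; \frac{1}{n(n-1)} \sum_{i \neq j}
        \bigl( 1 + \tau(y_i, y_j)\,(d_M(x_i, x_j) - b) \bigr)_+ \;+\; \lambda \|M\|,
    \]
    where \(\tau(y_i, y_j) = 1\) if \(y_i = y_j\) and \(-1\) otherwise. Writing
    \(\mathcal{E}(M, b)\) for the corresponding expected error, a generalization bound
    of this type controls, with probability at least \(1 - \delta\),
    \[
      \mathcal{E}(M_z, b_z) - \mathcal{E}_z(M_z, b_z)
        \;\le\; c_1\, \widehat{\mathfrak{R}}_n \;+\; c_2 \sqrt{\frac{\ln(1/\delta)}{n}},
    \]
    where \((M_z, b_z)\) minimizes \(\mathcal{E}_z\) and \(\widehat{\mathfrak{R}}_n\) is a
    Rademacher-complexity term whose size depends on the chosen matrix norm and on the
    data dimension.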
    metric learning
    similarity learning
    generalization bound
    Rademacher complexity

    Identifiers