The Role of Principal Angles in Subspace Classification

From MaRDI portal
Publication:4622015

DOI: 10.1109/TSP.2015.2500889
zbMATH Open: 1414.94250
arXiv: 1507.04230
MaRDI QID: Q4622015
FDO: Q4622015


Authors: Jiaji Huang, Qiang Qiu, Robert Calderbank


Publication date: 8 February 2019

Published in: IEEE Transactions on Signal Processing

Abstract: Subspace models play an important role in a wide range of signal processing tasks, and this paper explores how the pairwise geometry of subspaces influences the probability of misclassification. When the mismatch between the signal and the model is vanishingly small, the probability of misclassification is determined by the product of the sines of the principal angles between subspaces. When the mismatch is more significant, the probability of misclassification is determined by the sum of the squares of the sines of the principal angles. Reliability of classification is derived in terms of the distribution of signal energy across principal vectors. Larger principal angles lead to smaller classification error, motivating a linear transform that optimizes principal angles. The transform presented here (TRAIT) preserves some specific characteristic of each individual class, and this approach is shown to be complementary to a previously developed transform (LRT) that enlarges inter-class distance while suppressing intra-class dispersion. Theoretical results are supported by demonstration of superior classification accuracy on synthetic and measured data even in the presence of significant model mismatch.
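The principal angles discussed in the abstract can be computed from orthonormal bases of the two subspaces: the singular values of the product of the bases are the cosines of the angles. The sketch below is an illustration of that standard computation, not the paper's code; the two summary quantities at the end (product of sines, sum of squared sines) are the geometric proxies the abstract associates with small and large model mismatch, respectively.

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles between the column spans of A and B (radians)."""
    # Orthonormalize each basis via QR.
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    cosines = np.clip(cosines, -1.0, 1.0)  # guard against round-off
    return np.arccos(cosines)

# Illustrative random subspaces (not data from the paper).
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
B = rng.standard_normal((10, 3))
theta = principal_angles(A, B)

# Quantities the abstract links to misclassification probability:
prod_sines = np.prod(np.sin(theta))       # governs error for vanishing mismatch
sum_sq_sines = np.sum(np.sin(theta) ** 2) # governs error for larger mismatch
```

Larger angles push both quantities up, which is consistent with the abstract's observation that larger principal angles yield smaller classification error.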


Full work available at URL: https://arxiv.org/abs/1507.04230








Cited In (2)





