Local dimensionality reduction (Q1979103)
scientific article; zbMATH DE number 1452449
Statements

Local dimensionality reduction (English)
24 May 2000
Different methods of dimensionality reduction, such as principal components and Fisher's linear discriminant (FLD), are considered. The authors are interested in local versions of these methods, based on normal mixtures and a nearest-neighbors approach. The Iterated Nearest Neighbor FLD (INN) is one such method. Suppose that a training sample of two classes is given. To classify an observation \(x_0\) by INN, the following algorithm is proposed:
1. The nearest (in Mahalanobis distance) neighbor of \(x_0\) is selected within each class (\(p_1\) and \(p_2\)).
2. The \(k\) observations from the \(i\)-th class nearest to \(p_i\) are selected, for \(i=1,2\).
3. The obtained subsample is used to estimate the local correlation matrix \(S\).
4. The Mahalanobis distances are recalculated using \(S\).
5. Steps 1-4 are iterated.
6. The resulting \(S\) and subsamples are used to calculate the FLD.
The authors discuss the performance of such algorithms and present simulation results.
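The iteration described above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the function name, the choice of a pooled local covariance estimate for \(S\), and the defaults for \(k\) and the number of iterations are all assumptions made for the example.

```python
import numpy as np

def inn_fld(X1, X2, x0, k=10, n_iter=3):
    """Sketch of the Iterated Nearest Neighbor FLD (INN) for two classes.

    X1, X2 : arrays of shape (n_i, d), training samples of the two classes.
    x0     : array of shape (d,), the observation to classify.
    Returns the local FLD direction w and the final local matrix S.
    Requires n_iter >= 1. All details here are illustrative assumptions.
    """
    # Start from the global pooled covariance (an assumption).
    S = np.cov(np.vstack([X1, X2]).T)
    for _ in range(n_iter):
        S_inv = np.linalg.pinv(S)
        sub = []
        for X in (X1, X2):
            # Step 1: nearest neighbor of x0 in this class, in Mahalanobis distance.
            diff = X - x0
            d0 = np.einsum('ij,jk,ik->i', diff, S_inv, diff)
            p = X[np.argmin(d0)]
            # Step 2: k observations of this class nearest to p.
            dp = X - p
            d_p = np.einsum('ij,jk,ik->i', dp, S_inv, dp)
            sub.append(X[np.argsort(d_p)[:k]])
        # Step 3: re-estimate S locally from the subsamples
        # (pooled covariance used here in place of the review's "correlation matrix").
        n_total = sum(len(Xi) for Xi in sub)
        S = sum((len(Xi) - 1) * np.cov(Xi.T) for Xi in sub) / (n_total - 2)
        # Step 4 happens implicitly: the next pass recomputes distances with the new S.
    # Step 6: FLD direction from the final S and the subsample means.
    m1, m2 = sub[0].mean(axis=0), sub[1].mean(axis=0)
    w = np.linalg.pinv(S) @ (m1 - m2)
    return w, S
```

A test point could then be assigned by comparing the projection \(w^\top x_0\) against the projected subsample means, as in an ordinary FLD.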
Keywords: dimensionality reduction; normal finite mixture; nearest neighbors; local linear discriminant analysis