Optimal expansion of subspaces for eigenvector approximations (Q2469518)
Software: JDQZ
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Optimal expansion of subspaces for eigenvector approximations | scientific article | |

Statements
Optimal expansion of subspaces for eigenvector approximations (English)
6 February 2008
The author considers the problem of optimally expanding a subspace for approximating an eigenvector. More precisely, the problem is: given a subspace \(\mathcal{S}\), which vector \(v \in \mathcal{S}\) should be chosen so that the expanded subspace \(\mathcal{S} + \mathrm{span}\{ Av \}\) is optimal for approximating a given unit eigenvector \(x\) of \(A\)? The quality of the subspace \(\mathcal{S}\) is measured by the cosine of the angle \(\theta(x,\mathcal{S})\) between \(x\) and \(\mathcal{S}\), defined by \[ \cos(\theta(x,\mathcal{S})) := \max_{0 \neq z \in \mathcal{S}} \cos(\theta(x,z)) = \max_{0 \neq z \in \mathcal{S}} \frac{| x^T z| }{\| x\| \, \| z\| }. \] The optimal subspace expansion problem is then to find the vector \(v \in \mathcal{S}\) that maximizes \(\cos(\theta(x,\mathcal{S} + \mathrm{span}\{ Av \}))\). The author presents a theoretical solution to this problem. The solution is not computable in practice, since it depends on the unknown eigenvector \(x\). Nevertheless, the author gives arguments suggesting that the Ritz vector is a good choice of expansion vector.
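A minimal numerical sketch of the quantities involved (not from the paper; the random symmetric test matrix, the subspace, and the helper names `cos_angle` and `expand` are illustrative assumptions). For an orthonormal basis \(Q\) of \(\mathcal{S}\) and a unit vector \(x\), \(\cos(\theta(x,\mathcal{S})) = \|Q^T x\|\); the sketch evaluates this before and after expanding \(\mathcal{S}\) with \(Av\), comparing the Ritz vector against a random \(v \in \mathcal{S}\).

```python
import numpy as np

def cos_angle(x, Q):
    """cos(theta(x, S)) for the subspace S spanned by the orthonormal
    columns of Q: max over 0 != z in S of |x^T z| / (||x|| ||z||) = ||Q^T x|| / ||x||."""
    return np.linalg.norm(Q.T @ x) / np.linalg.norm(x)

def expand(Q, A, v):
    """Orthonormal basis of S + span{A v} (hypothetical helper, one Gram-Schmidt sweep)."""
    w = A @ v
    w = w - Q @ (Q.T @ w)            # orthogonalize A v against S
    nrm = np.linalg.norm(w)
    if nrm < 1e-14:                  # A v already (numerically) lies in S
        return Q
    return np.hstack([Q, (w / nrm)[:, None]])

# Small illustration with random data (not from the paper)
rng = np.random.default_rng(0)
n, k = 50, 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2    # symmetric test matrix
evals, evecs = np.linalg.eigh(A)
x = evecs[:, -1]                                       # target unit eigenvector
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))       # orthonormal basis of S

# Ritz vector from S associated with the largest Ritz value
H = Q.T @ A @ Q
theta, Y = np.linalg.eigh(H)
ritz = Q @ Y[:, -1]

print("before expansion:        ", cos_angle(x, Q))
print("expand with Ritz vector: ", cos_angle(x, expand(Q, A, ritz)))
print("expand with random v in S:", cos_angle(x, expand(Q, A, Q @ rng.standard_normal(k))))
```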
Keywords: eigenvector approximations; projection methods; subspace expansion; Ritz vector