Trace-penalty minimization for large-scale eigenspace computation (Q2398478)
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Trace-penalty minimization for large-scale eigenspace computation | scientific article |
Statements
Trace-penalty minimization for large-scale eigenspace computation (English)
16 August 2017
The eigenspace corresponding to the \(k\) smallest eigenvalues of a large-scale matrix pencil \((A,B)\), with \(A,B\in\mathbb{R}^{n\times n}\) symmetric and \(B\) positive definite, is the solution of the constrained problem \(\min_X \mathrm{tr}(X^TAX)\) over \(X\in\mathbb{R}^{n\times k}\) subject to \(X^TBX=I\). It is proposed to solve this problem by minimizing the unconstrained penalty function \(f_\mu(X)=\frac{1}{2}\mathrm{tr}(X^TAX)+\frac{\mu}{4}\|X^TBX-I\|_F^2\) with standard methods. A key result is that \(\mu\) need not be very large: it suffices to choose \(\max(0,\lambda_k)<\mu<\lambda_n\), where \(\lambda_1\leq\lambda_2\leq\cdots\leq\lambda_n\) are the sorted eigenvalues of the pencil \((A,B)\), for \(f_\mu\) to have essentially only one minimum, which is global. Algorithmic aspects of a classical steepest descent minimization procedure are discussed, including restarts, the choice of the parameter \(\mu\), and stopping criteria. The complexity of the algorithm is also analyzed, and it is tested extensively on numerical examples.
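The penalty approach described above can be illustrated with a minimal sketch (not the authors' implementation): plain gradient descent on \(f_\mu\) for a small symmetric matrix with \(B=I\). The step size, iteration count, and the helper name `trace_penalty_descent` are illustrative assumptions, not choices analyzed in the paper.

```python
import numpy as np

def trace_penalty_descent(A, k, mu, step=1e-2, iters=20000, seed=0):
    """Gradient descent on f_mu(X) = 1/2 tr(X^T A X) + mu/4 ||X^T X - I||_F^2
    (the B = I case of the trace-penalty objective)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, k))
    for _ in range(iters):
        # grad f_mu(X) = A X + mu * X (X^T X - I)
        G = A @ X + mu * X @ (X.T @ X - np.eye(k))
        X -= step * G
    return X

# Example: 2-dimensional eigenspace of the smallest eigenvalues of a
# diagonal test matrix with eigenvalues 1, 2, 5, 10.
A = np.diag([1.0, 2.0, 5.0, 10.0])
k, mu = 2, 3.0  # max(0, lambda_k) < mu < lambda_n, i.e. 2 < mu < 10
X = trace_penalty_descent(A, k, mu)

# The unconstrained minimizer is not orthonormal, so orthonormalize the
# computed basis and recover Ritz values to check the eigenspace.
Q, _ = np.linalg.qr(X)
ritz = np.sort(np.linalg.eigvalsh(Q.T @ A @ Q))
```

Note the hedged choice of \(\mu\): any value strictly between \(\lambda_k=2\) and \(\lambda_n=10\) is admissible by the result quoted above, and the Ritz values recovered from the QR-orthonormalized iterate approximate \(\lambda_1,\lambda_2\).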
eigenvalue computation
exact quadratic penalty approach
gradient methods
large-scale matrix pencil
steepest descent minimization
complexity
numerical example