A conjugate gradient like method for \(p\)-norm minimization in functional spaces (Q1681792)
Language | Label | Description | Also known as |
---|---|---|---|
English | A conjugate gradient like method for \(p\)-norm minimization in functional spaces | scientific article |
Statements
A conjugate gradient like method for \(p\)-norm minimization in functional spaces (English)
24 November 2017
The authors develop an iterative algorithm to recover the minimum \(p\)-norm solution of a linear equation \(Ax=b\) between two Banach spaces, with \(X=L^p\), \(1<p<2\), and \(Y=L^r\), \(r>1\). At each step the search direction is a linear combination of the current steepest ``descent''-type direction and the previous descent direction, so the method generalizes the classical conjugate gradient method applied to the normal equations in Hilbert spaces. The authors show that the iterates converge to the minimum \(p\)-norm solution and that, when stopped according to the discrepancy principle, the method is a regularization method. Some numerical results are given.
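To illustrate the structure described above (a steepest-descent-type direction coupled with the previous direction, stopped by the discrepancy principle), the following is a minimal finite-dimensional sketch, not the authors' exact scheme: the spaces are discretized as sequence spaces \(\ell^p\) and \(\ell^r\), the duality mappings are applied entrywise, and the coupling coefficient `beta`, the step size `mu`, and the noise level `delta` are illustrative choices.

```python
import numpy as np

def duality_map(x, q):
    """Duality mapping J_q on the sequence space l^q, applied entrywise:
    J_q(x) = |x|^(q-1) * sign(x), mapping l^q into its dual l^{q*}."""
    return np.sign(x) * np.abs(x) ** (q - 1)

def cg_like_pnorm(A, b, p=1.5, r=2.0, mu=1e-2, tau=1.02, delta=1e-6,
                  max_iter=5000):
    """Sketch of a conjugate-gradient-like iteration for the minimum-p-norm
    solution of A x = b on a discretized problem (X ~ l^p, Y ~ l^r).

    Each step combines the steepest-descent-type direction A^T J_r(A x - b)
    with the previous direction, takes the step in the dual space, and maps
    back to the primal space with J_{p*}.  The loop stops once the residual
    satisfies the discrepancy principle ||A x - b|| <= tau * delta.
    The rules for beta and mu below are simple illustrative choices,
    not the update rules analysed in the paper.
    """
    p_star = p / (p - 1.0)              # conjugate exponent (1 < p < 2)
    x = np.zeros(A.shape[1])            # primal iterate in l^p
    x_dual = np.zeros_like(x)           # dual iterate J_p(x) in l^{p*}
    d_prev = np.zeros_like(x)
    g_prev_norm = None

    for _ in range(max_iter):
        residual = A @ x - b
        if np.linalg.norm(residual) <= tau * delta:   # discrepancy principle
            break
        g = A.T @ duality_map(residual, r)   # steepest-descent-type direction
        g_norm = np.linalg.norm(g)
        # Fletcher-Reeves-style coupling with the previous direction
        # (illustrative choice; guards against a vanishing previous gradient)
        beta = 0.0 if not g_prev_norm else (g_norm / g_prev_norm) ** 2
        d = g + beta * d_prev
        x_dual = x_dual - mu * d             # step in the dual space
        x = duality_map(x_dual, p_star)      # back to the primal space via J_{p*}
        d_prev, g_prev_norm = d, g_norm
    return x
```

Iterating in the dual space and returning to the primal space through the duality mapping \(J_{p^*}\) is the standard device in Banach-space regularization for coping with the absence of a Hilbert-space inner product; the sketch follows that pattern.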
minimum \(p\)-norm solution
Banach space
conjugate gradient method
iterative algorithm
regularization method
discrepancy principle
numerical result