An efficient approach for computing optimal low-rank regularized inverse matrices
From MaRDI portal
Publication: 2936503
DOI: 10.1088/0266-5611/30/11/114009
zbMATH Open: 1305.65130
arXiv: 1404.1610
OpenAlex: W2139257007
MaRDI QID: Q2936503
FDO: Q2936503
Authors: Julianne Chung, Matthias Chung
Publication date: 17 December 2014
Published in: Inverse Problems
Abstract: Standard regularization methods that are used to compute solutions to ill-posed inverse problems require knowledge of the forward model. In many real-life applications, the forward model is not known, but training data is readily available. In this paper, we develop a new framework that uses training data, as a substitute for knowledge of the forward model, to compute an optimal low-rank regularized inverse matrix directly, allowing for very fast computation of a regularized solution. We consider a statistical framework based on Bayes and empirical Bayes risk minimization to analyze theoretical properties of the problem. We propose an efficient rank update approach for computing an optimal low-rank regularized inverse matrix for various error measures. Numerical experiments demonstrate the benefits and potential applications of our approach to problems in signal and image processing.
Full work available at URL: https://arxiv.org/abs/1404.1610
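As a rough illustration of the idea in the abstract (not the authors' implementation, and all names below are hypothetical): with training signals stacked as columns of X and the corresponding observations as columns of B, the empirical risk minimization for the Frobenius-norm error measure reads min ||Z B - X||_F over matrices Z of rank at most r. This rank-constrained least-squares problem can be solved with two SVDs, restricting to the row space of B and applying Eckart-Young there; a minimal NumPy sketch:

```python
import numpy as np

def empirical_low_rank_inverse(X, B, r):
    """Rank-<= r matrix Z minimizing ||Z B - X||_F over training data.

    X : (n, m) array, columns are training signals x_i
    B : (p, m) array, columns are the matching observations b_i
    r : target rank of the regularized inverse Z (shape (n, p))
    """
    # Thin SVD of the observation matrix and its numerical rank
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    tol = s[0] * max(B.shape) * np.finfo(float).eps
    k = int(np.sum(s > tol))
    U, s, Vt = U[:, :k], s[:k], Vt[:k]
    # Project the targets onto the row space of B; components outside
    # that row space cannot be reduced by any choice of Z
    Y = X @ Vt.T                              # (n, k)
    # Eckart-Young: best rank-r approximation of the projected targets
    Uy, sy, Vyt = np.linalg.svd(Y, full_matrices=False)
    rr = min(r, k)
    Yr = (Uy[:, :rr] * sy[:rr]) @ Vyt[:rr]
    # Back-substitute through the SVD of B: Z = Y_r diag(1/s) U^T
    return (Yr / s) @ U.T
```

Once Z is computed offline from training data, a regularized solution for a new observation b is just the matrix-vector product Z @ b, which is the fast-application point made in the abstract. When r is taken as large as the rank of B, this sketch reduces to the unconstrained least-squares solution X @ pinv(B).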
Recommendations
- Optimal regularized inverse matrices for inverse problems
- Optimal regularized low rank inverse approximation
- Optimal estimation of \(\ell_1\)-regularization prior from a regularized empirical Bayesian risk standpoint
- Compression approaches for the regularized solutions of linear systems from large-scale inverse problems
- Krylov methods for low-rank regularization
Keywords: machine learning; regularization; low-rank approximation; Bayes risk; numerical experiment; truncated singular value decomposition; ill-posed inverse problems; empirical Bayes risk
Cited In (12)
- Optimal regularized inverse matrices for inverse problems
- Fast random vector transforms in terms of pseudo-inverse within the Wiener filtering paradigm
- Optimal estimation of \(\ell_1\)-regularization prior from a regularized empirical Bayesian risk standpoint
- Regularized computation of approximate pseudoinverse of large matrices using low-rank tensor train decompositions
- Low-Rank Eigenvector Compression of Posterior Covariance Matrices for Linear Gaussian Inverse Problems
- Learning regularization parameters for general-form Tikhonov
- Optimal regularized low rank inverse approximation
- Goal-oriented optimal approximations of Bayesian linear inverse problems
- Optimal low-rank approximations of Bayesian linear inverse problems
- A general class of arbitrary order iterative methods for computing generalized inverses
- Efficient Marginalization-Based MCMC Methods for Hierarchical Bayesian Inverse Problems
- Effective implementation to reduce execution time of a low-rank matrix approximation problem