An efficient approach for computing optimal low-rank regularized inverse matrices
Publication: 2936503
DOI: 10.1088/0266-5611/30/11/114009
zbMath: 1305.65130
arXiv: 1404.1610
OpenAlex: W2139257007
MaRDI QID: Q2936503
Matthias Chung, Julianne Chung
Publication date: 17 December 2014
Published in: Inverse Problems
Full work available at URL: https://arxiv.org/abs/1404.1610
Keywords: regularization, numerical experiment, truncated singular value decomposition, machine learning, Bayes risk, low-rank approximation, ill-posed inverse problems, empirical Bayes risk
Related Items (10)
Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
Low-Rank Eigenvector Compression of Posterior Covariance Matrices for Linear Gaussian Inverse Problems
Learning regularization parameters for general-form Tikhonov
Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems
Effective implementation to reduce execution time of a low-rank matrix approximation problem
A general class of arbitrary order iterative methods for computing generalized inverses
Optimal regularized low rank inverse approximation
Regularized Computation of Approximate Pseudoinverse of Large Matrices Using Low-Rank Tensor Train Decompositions
Efficient Marginalization-Based MCMC Methods for Hierarchical Bayesian Inverse Problems
Optimal Regularized Inverse Matrices for Inverse Problems