Should Penalized Least Squares Regression be Interpreted as Maximum A Posteriori Estimation?
DOI: 10.1109/TSP.2011.2107908
zbMath: 1392.94228
OpenAlex: W2141006018
MaRDI QID: Q4572955
Publication date: 18 July 2018
Published in: IEEE Transactions on Signal Processing
Full work available at URL: https://doi.org/10.1109/tsp.2011.2107908
MSC classes:
Ridge regression; shrinkage estimators (Lasso) (62J07)
Bayesian inference (62F15)
Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
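For orientation, the correspondence the title asks about can be sketched as follows; this is a minimal sketch, and the notation (data \(y\), operator \(H\), penalty \(\phi\), noise variance \(\sigma^2\)) is illustrative rather than taken from the paper. The penalized least squares estimate
\[
\hat{x}_{\mathrm{PLS}} = \arg\min_{x} \tfrac{1}{2}\|y - Hx\|_2^2 + \phi(x)
\]
formally coincides with the maximum a posteriori estimate under the model \(y = Hx + w\), \(w \sim \mathcal{N}(0, \sigma^2 I)\), with prior \(p(x) \propto \exp(-\phi(x)/\sigma^2)\), since
\[
\arg\max_{x} p(x \mid y) = \arg\min_{x} \left( \tfrac{1}{2\sigma^2}\|y - Hx\|_2^2 + \tfrac{1}{\sigma^2}\phi(x) \right) = \hat{x}_{\mathrm{PLS}}.
\]
The question posed by the title is whether this formal identification is the appropriate Bayesian interpretation of such estimators.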
Related Items
On Bayesian estimation and proximity operators
Compressibility analysis of asymptotically mean stationary processes
Self-Supervised Deep Learning for Image Reconstruction: A Langevin Monte Carlo Approach
On maximum a posteriori estimation with Plug & Play priors and stochastic gradient descent
PnP-ReG: Learned Regularizing Gradient for Plug-and-Play Gradient Descent
A characterization of proximity operators
Sparse estimation: an MMSE approach
Low Complexity Regularization of Linear Inverse Problems
On Bayesian posterior mean estimators in imaging sciences and Hamilton-Jacobi partial differential equations
Scalable Bayesian Uncertainty Quantification in Imaging Inverse Problems via Convex Optimization
Nonlinear Power Method for Computing Eigenvectors of Proximal Operators and Neural Networks
Regularization by Denoising via Fixed-Point Projection (RED-PRO)
Least squares formulation for ill-posed inverse problems and applications