A sparse optimization approach to infinite infimal convolution regularization
Publication: 6433429
arXiv: 2304.08628 · MaRDI QID: Q6433429 · FDO: Q6433429
Marcello Carioni, Martin Holler, Carola-Bibiane Schönlieb, Kristian Bredies, Yury Korolev
Publication date: 17 April 2023
Abstract: In this paper we introduce the class of infinite infimal convolution functionals and apply these functionals to the regularization of ill-posed inverse problems. The proposed regularization involves an infimal convolution of a continuously parametrized family of convex, positively one-homogeneous functionals defined on a common Banach space. We show that, under mild assumptions, this functional admits an equivalent convex lifting in the space of measures with values in this Banach space. This reformulation allows us to prove well-posedness of a Tikhonov regularized inverse problem and opens the door to a sparse analysis of the solutions. In the case of finite-dimensional measurements we prove a representer theorem, showing that there exists a solution of the inverse problem that is sparse, in the sense that it can be represented as a linear combination of the extremal points of the ball of the lifted infinite infimal convolution functional. Then, we design a generalized conditional gradient method for computing solutions of the inverse problem without relying on an a priori discretization of the parameter space and of the Banach space. The iterates are constructed as linear combinations of the extremal points of the lifted infinite infimal convolution functional. We prove a sublinear rate of convergence for our algorithm and apply it to denoising of signals and images using, as regularizer, infinite infimal convolutions of fractional-Laplacian-type operators with adaptive orders of smoothness and anisotropies.
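For orientation, the following is a minimal sketch of the infimal convolution operation the abstract builds on; the notation (two functionals R_1, R_2 and an element u of the common Banach space) is illustrative and not taken from this page. For two convex, positively one-homogeneous functionals, their infimal convolution is

\[
  (R_1 \,\square\, R_2)(u) \;=\; \inf_{u_1 + u_2 = u} \bigl( R_1(u_1) + R_2(u_2) \bigr),
\]

and the "infinite" variant studied in the paper replaces this two-term decomposition of u by a decomposition over a continuously parametrized family of such functionals, which is what the convex lifting into a space of Banach-space-valued measures makes tractable.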