Off-the-grid learning of sparse mixtures from a continuous dictionary

Publication: 6403703

arXiv: 2207.00171
MaRDI QID: Q6403703
FDO: Q6403703


Authors: Cristina Butucea, Jean-François Delmas, Anne Dutfoy, Clément Hardy


Publication date: 29 June 2022

Abstract: We consider a general non-linear model in which the signal is a finite mixture of an unknown, possibly growing, number of features drawn from a continuous dictionary parameterized by a real non-linear parameter. The signal is observed with Gaussian (possibly correlated) noise in either a continuous or a discrete setup. We propose an off-the-grid optimization method, that is, a method which does not use any discretization scheme on the parameter space, to estimate both the non-linear parameters of the features and the linear parameters of the mixture. We use recent results on the geometry of off-the-grid methods to derive the minimal separation between the true underlying non-linear parameters under which interpolating certificate functions can be constructed. Using tail bounds for suprema of Gaussian processes, we bound the prediction error with high probability. Assuming that the certificate functions can be constructed, our prediction error bound is, up to log-factors, similar to the rates attained by the Lasso predictor in the linear regression model. We also establish convergence rates that quantify, with high probability, the quality of estimation of both the linear and the non-linear parameters.
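
To make the setup concrete, here is a minimal numerical sketch of the observation model and of a grid-free penalized least-squares fit. The Gaussian-bump dictionary phi(t; theta), the sample grid, the noise level, and the use of scipy.optimize.minimize are illustrative assumptions for a toy discrete setup with two features; this is not the authors' estimator, only an instance of the kind of model the abstract describes.

# Hypothetical sketch of the model y(t) = sum_k beta_k * phi(t; theta_k) + noise,
# where phi is a feature from a continuous dictionary indexed by a real
# non-linear parameter theta. The dictionary and fitting procedure below
# are illustrative assumptions, not the paper's method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def phi(t, theta):
    """One dictionary feature; here an assumed Gaussian bump centered at theta."""
    return np.exp(-0.5 * (t - theta) ** 2 / 0.05)

# Discrete setup: observations on a grid, two true features, i.i.d. Gaussian noise.
t = np.linspace(0.0, 1.0, 200)
beta_true, theta_true = np.array([1.0, -0.7]), np.array([0.3, 0.8])
y = sum(b * phi(t, th) for b, th in zip(beta_true, theta_true))
y += 0.05 * rng.standard_normal(t.size)

def objective(params, lam=0.01, K=2):
    """Least squares plus an l1 penalty on the linear coefficients,
    minimized jointly over (beta, theta), i.e. with no grid on theta."""
    beta, theta = params[:K], params[K:]
    resid = y - sum(b * phi(t, th) for b, th in zip(beta, theta))
    return 0.5 * np.mean(resid ** 2) + lam * np.sum(np.abs(beta))

x0 = np.concatenate([np.zeros(2), np.array([0.25, 0.75])])  # rough initialization
res = minimize(objective, x0, method="Nelder-Mead")
print("estimated beta:", res.x[:2], "estimated theta:", res.x[2:])

With sufficiently separated true parameters theta and a reasonable initialization, such a joint non-convex fit typically recovers both the linear and the non-linear parameters; the paper's theory instead controls estimators of this kind through interpolating certificate functions and tail bounds for suprema of Gaussian processes.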
