Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

From MaRDI portal
Publication: 6408745

arXiv: 2208.11846 · MaRDI QID: Q6408745 · FDO: Q6408745

Christian Kümmerle, René Vidal, Liangzu Peng

Publication date: 24 August 2022

Abstract: We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in (0,1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, $p = 1$, we prove that this IRLS variant converges globally at a linear rate under a mild, deterministic condition on the feature matrix called the \textit{stable range space property}. In the non-convex case, $p \in (0,1)$, we prove that under a similar condition, IRLS converges locally to the global minimizer at a superlinear rate of order $2-p$; the rate becomes quadratic as $p \to 0$. We showcase the proposed methods in three applications: real phase retrieval, regression without correspondences, and robust face restoration. The results show that (1) IRLS can handle a larger number of outliers than other methods, (2) it is faster than competing methods at the same level of accuracy, and (3) it restores a sparsely corrupted face image with satisfactory visual quality. https://github.com/liangzu/IRLS-NeurIPS2022
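To illustrate the idea behind the abstract, here is a minimal sketch of IRLS for the convex case $p = 1$ (robust $\ell_1$ regression). The weight formula and the geometric smoothing schedule below are simplifying assumptions for illustration, not the authors' exact algorithm; see the linked repository for the paper's implementation.

```python
# Minimal IRLS sketch for l1 robust regression: min_x ||A x - b||_1.
# The smoothed weights and the epsilon schedule are illustrative assumptions,
# not the exact variant analyzed in the paper.
import numpy as np

def irls_l1(A, b, num_iters=100, eps=1.0):
    """Approximately minimize ||A x - b||_1 via iteratively reweighted least squares."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary least-squares initialization
    for _ in range(num_iters):
        r = A @ x - b
        # Smoothed l1 weights: w_i = 1 / max(|r_i|, eps), so large residuals
        # (likely outliers) are down-weighted in the next least-squares solve.
        w = 1.0 / np.maximum(np.abs(r), eps)
        # Solve the weighted normal equations (A^T W A) x = A^T W b.
        AW = A * w[:, None]
        x = np.linalg.solve(A.T @ AW, AW.T @ b)
        # Shrink the smoothing parameter (simple geometric schedule).
        eps = max(eps * 0.9, 1e-10)
    return x

# Usage: recover x from measurements with sparse outliers.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
b[:10] += 10.0 * rng.standard_normal(10)  # corrupt 10% of the measurements
x_hat = irls_l1(A, b)
```

With only a sparse fraction of entries corrupted, the $\ell_1$ objective lets the clean equations determine the solution, and the recovered `x_hat` is close to `x_true` even though the corrupted residuals remain large.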




Has companion code repository: https://github.com/liangzu/IRLS-NeurIPS2022








