Nonlinear residual minimization by iteratively reweighted least squares
Publication: Q301685
DOI: 10.1007/s10589-016-9829-x
zbMATH Open: 1373.90153
arXiv: 1504.06815
OpenAlex: W2110071746
MaRDI QID: Q301685
Authors: Juliane Sigl
Publication date: 1 July 2016
Published in: Computational Optimization and Applications
Abstract: We address the numerical solution of minimal norm residuals of \textit{nonlinear} equations in finite dimensions. We take inspiration from the problem of finding a sparse vector solution by using greedy algorithms based on iterative residual minimizations in the \(\ell_p\)-norm, for \(1 \le p \le 2\). Due to the mild smoothness of the problem, especially for \(p \to 1\), we develop and analyze a generalized version of Iteratively Reweighted Least Squares (IRLS). This simple and efficient algorithm solves optimization problems involving non-quadratic, possibly non-convex and non-smooth cost functions by transforming them into a sequence of ordinary least squares problems, which can be tackled more efficiently. While its analysis has been developed in many contexts when the model equation is \textit{linear}, no results were available for the \textit{nonlinear} case. We address the convergence and the rate of error decay of IRLS for nonlinear problems. The convergence analysis is based on its reformulation as an alternating minimization of an energy functional, whose variables are the competitors to solutions of the intermediate reweighted least squares problems. Under specific conditions of coercivity and local convexity, we are able to show convergence of IRLS to minimizers of the nonlinear residual problem. For the case where local convexity is lacking, we propose an appropriate convexification. To illustrate the theoretical results we conclude the paper with several numerical experiments. We compare IRLS with standard Matlab functions on an easily presentable example and numerically validate our theoretical results in the more complicated framework of phase retrieval problems. Finally, we examine the recovery capability of the algorithm in the context of data corrupted by impulsive noise, where sparsification of the residual is desired.
Full work available at URL: https://arxiv.org/abs/1504.06815
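The abstract describes IRLS as turning a non-quadratic \(\ell_p\) residual cost into a sequence of weighted least squares problems. The Python sketch below illustrates the general idea for a nonlinear residual, assuming a Gauss-Newton-type linearization inside each reweighted step; it is not the paper's exact scheme, and the smoothing parameter \(\varepsilon\), its update rule, the damping term, and all function names are illustrative choices.

```python
import numpy as np

def irls_nonlinear(F, J, x0, p=1.0, eps0=1.0, n_iter=100, tol=1e-10):
    """Illustrative IRLS sketch for min_x ||F(x)||_p^p with a nonlinear
    residual F: each iteration solves a reweighted, linearized
    (Gauss-Newton-type) least squares problem."""
    x = np.asarray(x0, dtype=float)
    eps = eps0
    for _ in range(n_iter):
        r = F(x)                                   # residual, vector in R^m
        Jx = J(x)                                  # Jacobian of F at x, shape (m, n)
        # smoothed ell_p weights: w_i = (r_i^2 + eps^2)^((p - 2)/2)
        w = (r**2 + eps**2) ** ((p - 2.0) / 2.0)
        # weighted Gauss-Newton step: solve (J^T W J) dx = -J^T W r
        H = Jx.T @ (w[:, None] * Jx)
        g = Jx.T @ (w * r)
        dx = np.linalg.solve(H + 1e-12 * np.eye(x.size), -g)
        x = x + dx
        # let the smoothing parameter follow the residual downward
        eps = min(eps, np.linalg.norm(F(x), np.inf) / 10.0 + 1e-15)
        if np.linalg.norm(dx) <= tol * (1.0 + np.linalg.norm(x)):
            break
    return x

# Tiny overdetermined example: F(x) = (x0^2 + x1 - 2, x0 + x1^2 - 2, x0*x1 - 1),
# whose residual vanishes at x = (1, 1).
F = lambda x: np.array([x[0]**2 + x[1] - 2, x[0] + x[1]**2 - 2, x[0]*x[1] - 1])
J = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]], [x[1], x[0]]])
print(irls_nonlinear(F, J, x0=[2.0, 0.5], p=1.0))   # converges to ~[1. 1.]
```

Note how for \(p < 2\) large residual components receive small weights, so gross outliers (e.g. impulsive noise) influence the fit far less than under plain least squares; this is what drives the residual sparsification mentioned at the end of the abstract.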
Recommendations
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
- Iteratively reweighted least squares minimization for sparse recovery
- An iteratively reweighted least squares algorithm for sparse regularization
- Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming
- Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
Cites Work
- Nonlinear total variation based noise removal algorithms
- Title not available
- Atomic Decomposition by Basis Pursuit
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- A method for the solution of certain non-linear problems in least squares
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- Image recovery via total variation minimization and related problems
- Bregman distances, totally convex functions, and a method for solving operator equations in Banach spaces
- Title not available
- Generalized convexity, nonsmooth variational inequalities, and nonsmooth optimization
- Restricted isometry properties and nonconvex compressive sensing
- Iteratively reweighted least squares minimization for sparse recovery
- Fast, robust total variation-based reconstruction of noisy, blurred images
- A predictor-corrector algorithm for the coupling of stiff ODEs to a particle population balance
- On the convergence of the modified Levenberg-Marquardt method with a nonmonotone second order Armijo type line search
- Linearly constrained nonsmooth and nonconvex minimization
- Low-rank matrix recovery via iteratively reweighted least squares minimization
- Title not available
- Accelerated Gauss-Newton algorithms for nonlinear least squares problems
- An Adaptive Nonlinear Least-Squares Algorithm
- Algorithms for the Solution of the Nonlinear Least-Squares Problem
- Semismooth Newton Methods for Operator Equations in Function Spaces
- Rate of Convergence of Lawson's Algorithm
- Quasi-linear compressed sensing
- On iteratively reweighted algorithms for nonsmooth nonconvex optimization in computer vision
- Convergence of an adaptive Kačanov FEM for quasi-linear problems
Cited In (8)
- The method IRLS for some best \(\ell_p\) norm solutions of under- or overdetermined linear systems
- Regularization of geophysical ill-posed problems by iteratively re-weighted and refined least squares
- An algorithm for real and complex rational minimax approximation
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
- Efficient iterative solutions to complex-valued nonlinear least-squares problems with mixed linear and antilinear operators
- A novel dictionary learning method based on total least squares approach with application in high dimensional biological data
- Robustness by reweighting for kernel estimators: an overview
- An efficient code for the minimization of highly nonlinear and large residual least squares functions