Nonlinear residual minimization by iteratively reweighted least squares
From MaRDI portal
Abstract: We address the numerical solution of minimal norm residuals of \textit{nonlinear} equations in finite dimensions. We take inspiration from the problem of finding a sparse vector solution by using greedy algorithms based on iterative residual minimizations in the \(\ell_p\)-norm, for \(1 \le p < 2\). Due to the mild smoothness of the problem, especially for \(p \to 1\), we develop and analyze a generalized version of Iteratively Reweighted Least Squares (IRLS). This simple and efficient algorithm reduces the solution of optimization problems involving non-quadratic, possibly non-convex and non-smooth cost functions to a sequence of ordinary least squares problems, which can be tackled far more efficiently. While its analysis has been developed in many contexts when the model equation is \textit{linear}, no results were available in the \textit{nonlinear} case. We address the convergence and the rate of error decay of IRLS for nonlinear problems. The convergence analysis is based on its reformulation as an alternating minimization of an energy functional, whose variables are the competitors to solutions of the intermediate reweighted least squares problems. Under specific conditions of coercivity and local convexity, we are able to show convergence of IRLS to minimizers of the nonlinear residual problem. For the case where local convexity is lacking, we propose an appropriate convexification. To illustrate the theoretical results we conclude the paper with several numerical experiments. We compare IRLS with standard Matlab functions for an easily presentable example and numerically validate our theoretical results in the more complicated framework of phase retrieval problems. Finally, we examine the recovery capability of the algorithm in the context of data corrupted by impulsive noise, where sparsification of the residual is desired.
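The reduction described in the abstract can be illustrated for the simplest (linear-model) instance of IRLS: minimizing \(\|Ax - b\|_p\) by repeatedly solving weighted least squares problems whose weights are derived from the current residual. This is a minimal sketch of the generic IRLS idea only, not the paper's generalized nonlinear algorithm; the function name, the fixed smoothing parameter `eps`, and the iteration count are illustrative choices, not taken from the publication.

```python
import numpy as np

def irls_lp_residual(A, b, p=1.0, eps=1e-6, iters=50):
    """Approximately minimize ||Ax - b||_p via iteratively reweighted
    least squares (linear-model sketch).

    Each iteration solves a weighted least squares problem; the weights
    w_i = (r_i^2 + eps^2)^((p-2)/2) come from the previous residual r,
    with eps smoothing the non-differentiability at zero residuals.
    """
    # Start from the plain (p = 2) least squares solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = A @ x - b
        w = (r**2 + eps**2) ** ((p - 2) / 2)   # reweighting rule
        Aw = A * w[:, None]                     # rows of A scaled by w
        # Solve the weighted normal equations  A^T W A x = A^T W b.
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x
```

For \(p = 1\) this behaves like a robust (least absolute deviations) fit: a single grossly corrupted entry of \(b\) is sparsified in the residual rather than spread over the solution, in line with the impulsive-noise experiments mentioned above.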
Recommendations
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
- Iteratively reweighted least squares minimization for sparse recovery
- An iteratively reweighted least squares algorithm for sparse regularization
- Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming
- Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
Cites work
- scientific article; zbMATH DE number 3915531
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 936298
- A method for the solution of certain non-linear problems in least squares
- A predictor-corrector algorithm for the coupling of stiff ODEs to a particle population balance
- Accelerated Gauss-Newton algorithms for nonlinear least squares problems
- Algorithms for the Solution of the Nonlinear Least-Squares Problem
- An Adaptive Nonlinear Least-Squares Algorithm
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Atomic Decomposition by Basis Pursuit
- Bregman distances, totally convex functions, and a method for solving operator equations in Banach spaces
- Compressed sensing
- Convergence of an adaptive Kačanov FEM for quasi-linear problems
- Fast, robust total variation-based reconstruction of noisy, blurred images
- Generalized convexity, nonsmooth variational inequalities, and nonsmooth optimization
- Image recovery via total variation minimization and related problems
- Iteratively reweighted least squares minimization for sparse recovery
- Linearly constrained nonsmooth and nonconvex minimization
- Low-rank matrix recovery via iteratively reweighted least squares minimization
- Nonlinear total variation based noise removal algorithms
- On iteratively reweighted algorithms for nonsmooth nonconvex optimization in computer vision
- On the convergence of the modified Levenberg-Marquardt method with a nonmonotone second order Armijo type line search
- Quasi-linear compressed sensing
- Rate of Convergence of Lawson's Algorithm
- Restricted isometry properties and nonconvex compressive sensing
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Semismooth Newton Methods for Operator Equations in Function Spaces
Cited in (8)
- A novel dictionary learning method based on total least squares approach with application in high dimensional biological data
- Regularization of geophysical ill-posed problems by iteratively re-weighted and refined least squares
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
- An algorithm for real and complex rational minimax approximation
- Robustness by reweighting for kernel estimators: an overview
- An efficient code for the minimization of highly nonlinear and large residual least squares functions
- The method IRLs for some best \(\ell_p\) norm solutions of under- or overdetermined linear systems
- Efficient iterative solutions to complex-valued nonlinear least-squares problems with mixed linear and antilinear operators
This page was built for publication: Nonlinear residual minimization by iteratively reweighted least squares (MaRDI item Q301685)