On convergence of the Gauss-Newton method for convex composite optimization.
From MaRDI portal
Publication:5957569
DOI: 10.1007/s101070100249
zbMath: 1049.90132
OpenAlex: W2082725151
MaRDI QID: Q5957569
Publication date: 2002
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s101070100249
Related Items
On iterative computation of fixed points and optimization
Sharp minima for multiobjective optimization in Banach spaces
Composite proximal bundle method
Convergence analysis of the Gauss-Newton-type method for Lipschitz-like mappings
The equivalence of three types of error bounds for weakly and approximately convex functions
Convergence analysis of a proximal Gauss-Newton method
Generalized weak sharp minima in cone-constrained convex optimization with applications
The multiproximal linearization method for convex composite problems
Strong KKT conditions and weak sharp solutions in convex-composite optimization
Riemannian linearized proximal algorithms for nonnegative inverse eigenvalue problem
Convergence of the Gauss-Newton method for convex composite optimization problems under majorant condition on Riemannian manifolds
Linearized proximal algorithms with adaptive stepsizes for convex composite optimization with applications
The value function approach to convergence analysis in composite optimization
Gauss-Newton method for convex composite optimizations on Riemannian manifolds
Relaxed Gauss-Newton Methods with Applications to Electrical Impedance Tomography
Convergence analysis of the Gauss-Newton method for convex inclusion and convex-composite optimization problems
Expanding the applicability of the Gauss-Newton method for convex optimization under a majorant condition
Modified inexact Levenberg-Marquardt methods for solving nonlinear least squares problems
Generalized weak sharp minima in cone-constrained convex optimization on Hadamard manifolds
On Convergence Rates of Linearized Proximal Algorithms for Convex Composite Optimization with Applications
Strong Metric (Sub)regularity of Karush-Kuhn-Tucker Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization and the Quadratic Convergence of Newton's Method