A structured L-BFGS method and its application to inverse problems
Publication: 6557640
DOI: 10.1088/1361-6420/AD2C31
zbMATH Open: 1548.90401
MaRDI QID: Q6557640
Authors: Florian Mannel, Hari Om Aggrawal, Jan Modersitzki
Publication date: 18 June 2024
Published in: Inverse Problems
Keywords: inverse problems; non-convex optimization; medical imaging; image registration; seed matrix; Kurdyka-Łojasiewicz condition; structured L-BFGS
Cites Work
- FAIR. Flexible algorithms for image registration.
- Title not available
- Benchmarking optimization software with performance profiles.
- On the limited memory BFGS method for large scale optimization
- Updating Quasi-Newton Matrices with Limited Storage
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Two-Point Step Size Gradient Methods
- Line search algorithms with guaranteed sufficient decrease
- Some numerical experiments with variable-storage quasi-Newton algorithms
- A coordinate gradient descent method for nonsmooth separable minimization
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Representations of quasi-Newton matrices and their use in limited memory methods
- Least Change Secant Updates for Quasi-Newton Methods
- Computational Methods for Inverse Problems
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Solution of Sparse Indefinite Systems of Linear Equations
- A modified BFGS method and its global convergence in nonconvex minimization
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- Discrete inverse problems. Insight and algorithms.
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Global convergence of a new hybrid Gauss-Newton structured BFGS method for nonlinear least squares problems
- An Adaptive Nonlinear Least-Squares Algorithm
- A unified approach to fast image registration and a new curvature based registration technique.
- A hyperelastic regularization energy for image registration
- A Proximal Minimization Algorithm for Structured Nonconvex and Nonsmooth Problems
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- On the Use of Product Structure in Secant Methods for Nonlinear Least Squares Problems
- Faster Independent Component Analysis by Preconditioning With Hessian Approximations
- Quasi-Newton methods for large-scale electromagnetic inverse problems
- Convergence Theorems for Least-Change Secant Update Methods
- Local and superlinear convergence of structured quasi-Newton methods for nonlinear optimization
- On efficiently combining limited-memory and trust-region techniques
- Global convergence of Gauss-Newton-MBFGS method for solving the nonlinear least squares problem
- Title not available
- Title not available
- Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization
- Perspectives on self-scaling variable metric algorithms
- The global convergence of a modified BFGS method for nonconvex functions
- A matrix-free approach to parallel and memory-efficient deformable image registration
- Hessian initialization strategies for \(\ell \)-BFGS solving non-linear inverse problems
- Convergence of Broyden’s Method in Banach Spaces
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- A regularized limited memory BFGS method for nonconvex unconstrained minimization
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- Least-Change Sparse Secant Update Methods with Inaccurate Secant Conditions
- Numerical approximation of partial differential equations
- A family of variable metric methods in function space, without exact line searches
- An Infinite-Dimensional Convergence Theory for Reduced SQP Methods in Hilbert Space
- A Kantorovich theorem for the structured PSB update in Hilbert space.
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- Structured two-point stepsize gradient methods for nonlinear least squares
- Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
- A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems
- Compact representations of structured BFGS matrices
- Solving nonlinear systems of equations via spectral residual methods: stepsize selection and applications
- Adaptive scaling damped BFGS method without gradient Lipschitz continuity
- A hybrid semismooth quasi-Newton method for nonsmooth optimal control with PDEs
- Stabilized Barzilai-Borwein method
- The projection technique for two open problems of unconstrained optimization problems
- Fast spherical centroidal Voronoi mesh generation: a Lloyd-preconditioned LBFGS method in parallel
- Convergence of Newton-MR under inexact Hessian information
- Hybrid limited memory gradient projection methods for box-constrained optimization problems
- On the Derivation of Quasi-Newton Formulas for Optimization in Function Spaces
- Title not available
- A noise-tolerant quasi-Newton algorithm for unconstrained optimization
- Regularization of limited memory quasi-Newton methods for large-scale nonconvex minimization
- Quasi-Newton methods for machine learning: forget the past, just sample
- On the convergence and mesh-independent property of the Barzilai-Borwein method for PDE-constrained optimization
- Newton-MR: inexact Newton method with minimum residual sub-problem solver
- Extrapolated Proximal Subgradient Algorithms for Nonconvex and Nonsmooth Fractional Programs
- Convergence of Random Reshuffling under the Kurdyka–Łojasiewicz Inequality
Cited In (1)