New line search methods for unconstrained optimization
Publication: 2510603
DOI: 10.1016/j.jkss.2008.05.004
zbMath: 1293.65098
OpenAlex: W2043806396
MaRDI QID: Q2510603
Publication date: 1 August 2014
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1016/j.jkss.2008.05.004
Keywords: unconstrained optimization; global convergence; probability; \(\mathbb R\)-linear convergence; line search method
MSC classifications: Nonconvex programming, global optimization (90C26); Numerical optimization and variational techniques (65K10)
Related Items
- Non Monotone Backtracking Inexact BFGS Method for Regression Analysis
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- Global convergence of a modified Broyden family method for nonconvex functions
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- An active set limited memory BFGS algorithm for bound constrained optimization
- A mixed spectral CD-DY conjugate gradient method
- Global convergence of a spectral conjugate gradient method for unconstrained optimization
- A Parallel Dual Matrix Method for Blind Signal Separation
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A quasi-Newton algorithm for large-scale nonlinear equations
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- A conjugate gradient method for unconstrained optimization problems
- Convergence analysis of a modified BFGS method on convex minimizations
- A New Method with Descent Property for Symmetric Nonlinear Equations
- Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization
- A conjugate gradient method with descent direction for unconstrained optimization
- An improved trust region method for unconstrained optimization
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
Uses Software
Cites Work
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- Differential optimization techniques
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- Semiparametric log-linear regression for longitudinal measurements subject to outcome-dependent follow-up
- New quasi-Newton equation and related methods for unconstrained optimization
- A note on minimization problems and multistep methods
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Nonmonotone adaptive trust-region method for unconstrained optimization problems
- Local convergence analysis for partitioned quasi-Newton updates
- Convergence of line search methods for unconstrained optimization
- Introduction to global optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- New quasi-Newton methods for unconstrained optimization problems
- A New Matrix-Free Algorithm for the Large-Scale Trust-Region Subproblem
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Family of Trust-Region-Based Algorithms for Unconstrained Minimization with Strong Global Convergence Properties
- A Trust Region Algorithm for Equality Constrained Minimization: Convergence Properties and Implementation
- A Trust Region Algorithm for Nonlinearly Constrained Optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Quasi-Newton Methods, Motivation and Theory
- Numerical Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Semiparametric and Nonparametric Regression Analysis of Longitudinal Data
- One-step and multistep procedures for constrained minimization problems
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- CUTEr and SifDec
- A New Algorithm for Unconstrained Optimization
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method