Publication: 3539529

From MaRDI portal


zbMath: 1161.90486
MaRDI QID: Q3539529

Neculai Andrei

Publication date: 18 November 2008

Full work available at URL: http://www.ici.ro/camo/journal/v10n1.htm


90C30: Nonlinear programming


Related Items

On three-term conjugate gradient algorithms for unconstrained optimization
An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
New hybrid conjugate gradient method for unconstrained optimization
Spectral method and its application to the conjugate gradient method
Scaling damped limited-memory updates for unconstrained optimization
Dynamic scaling on the limited memory BFGS method
The inexact-Newton via GMRES subspace method without line search technique for solving symmetric nonlinear equations
On the performance of a new symmetric rank-one method with restart for solving unconstrained optimization problems
An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
Scaling on the spectral gradient method
Scaled diagonal gradient-type method with extra update for large-scale unconstrained optimization
A nonmonotone line search slackness technique for unconstrained optimization
Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
A new class of nonlinear conjugate gradient coefficients with global convergence properties
The convergence rate of a restart MFR conjugate gradient method with inexact line search
On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
An efficient nonmonotone trust-region method for unconstrained optimization
Two modifications of the method of the multiplicative parameters in descent gradient methods
An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
A nonmonotone filter line search technique for the MBFGS method in unconstrained optimization
A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems
Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
Conjugate gradient path method without line search technique for derivative-free unconstrained optimization
A class of diagonal quasi-Newton methods for large-scale convex minimization
A new trust-region method for solving systems of equalities and inequalities
Global optimization test problems based on random field composition
A new two-step gradient-type method for large-scale unconstrained optimization
New quasi-Newton methods via higher order tensor models
Improved Hessian approximation with modified secant equations for symmetric rank-one method
\(n\)-step quadratic convergence of the MPRP method with a restart strategy
An improved multi-step gradient-type method for large scale optimization
A symmetric rank-one method based on extra updating techniques for unconstrained optimization
Scalar correction method for solving large scale unconstrained minimization problems
Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
Cross-Hill: a heuristic method for global optimization
An accelerated nonmonotone trust region method with adaptive trust region for unconstrained optimization
An adaptive scaled BFGS method for unconstrained optimization
A hybrid ODE-based method for unconstrained optimization problems
A filter-line-search method for unconstrained optimization
Some three-term conjugate gradient methods with the inexact line search condition
A nonmonotone supermemory gradient algorithm for unconstrained optimization
A conjugate gradient method with sufficient descent property
A modified bat algorithm with conjugate gradient method for global optimization
An adaptive trust region method based on simple conic models
A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
An accelerated double step size model in unconstrained optimization
Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
Two new conjugate gradient methods based on modified secant equations
New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
A new gradient method via quasi-Cauchy relation which guarantees descent
Hybrid conjugate gradient algorithm for unconstrained optimization
Acceleration of conjugate gradient algorithms for unconstrained optimization
A new adaptive trust region algorithm for optimization problems
Best practices for comparing optimization algorithms
Projected affine-scaling interior-point Newton's method with line search filter for box constrained optimization
A new adaptive Barzilai and Borwein method for unconstrained optimization
An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
A transformation of accelerated double step size method for unconstrained optimization
A spectral Dai-Yuan-type conjugate gradient method for unconstrained optimization
Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
A double parameter scaled BFGS method for unconstrained optimization
An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions
Accelerated double direction method for solving unconstrained optimization problems
Hybrid modification of accelerated double direction method
A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
A nonmonotone weighting self-adaptive trust region algorithm for unconstrained nonconvex optimization
The hybrid BFGS-CG method in solving unconstrained optimization problems
A new method with sufficient descent property for unconstrained optimization
Descent line search scheme using Geršgorin circle theorem
An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
A note on a multiplicative parameters gradient method
A partitioned PSB method for partially separable unconstrained optimization problems
An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables
A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
Global convergence property of scaled two-step BFGS method
Accumulative approach in multistep diagonal gradient-type method for large-scale unconstrained optimization
A new variant of the memory gradient method for unconstrained optimization
Monotone and nonmonotone trust-region-based algorithms for large scale unconstrained optimization problems
Hybridization of accelerated gradient descent method
A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
Some nonlinear conjugate gradient methods based on spectral scaling secant equations
A nonmonotone PRP conjugate gradient method for solving square and under-determined systems of equations
A spectral three-term Hestenes-Stiefel conjugate gradient method
A method for global minimization of functions using the Krawczyk operator
Two new conjugate gradient methods for unconstrained optimization
The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique
A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
The projection technique for two open problems of unconstrained optimization problems
A note on hybridization process applied on transformed double step size model
A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
Globally convergence of nonlinear conjugate gradient method for unconstrained optimization
A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization
A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
Block BFGS Methods
INITIAL IMPROVEMENT OF THE HYBRID ACCELERATED GRADIENT DESCENT PROCESS
A Derivative-Free Method for Structured Optimization Problems
A new steepest descent method with global convergence properties
A modification of classical conjugate gradient method using strong Wolfe line search
A modified form of conjugate gradient method for unconstrained optimization problems
A new classical conjugate gradient coefficient with exact line search
A new type of descent conjugate gradient method with exact line search
Global convergence properties of the BBB conjugate gradient method
A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
Accelerated gradient descent methods with line search
On the final steps of Newton and higher order methods
A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
A spectral conjugate gradient method for solving large-scale unconstrained optimization
A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
A survey of gradient methods for solving nonlinear optimization
Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
A novel value for the parameter in the Dai-Liao-type conjugate gradient method
Accelerated diagonal gradient-type method for large-scale unconstrained optimization
A regularized limited memory BFGS method for nonconvex unconstrained minimization
Nonmonotone adaptive trust region method based on simple conic model for unconstrained optimization
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search
Two modified DY conjugate gradient methods for unconstrained optimization problems
Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
New investigation for the Liu-Story scaled conjugate gradient method for nonlinear optimization
Some three-term conjugate gradient methods with the new direction structure
A class of globally convergent three-term Dai-Liao conjugate gradient methods
An improved nonmonotone adaptive trust region method
The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
A modified spectral conjugate gradient method with global convergence
Two-phase quasi-Newton method for unconstrained optimization problem
A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
An improved hybrid-ORBIT algorithm based on point sorting and MLE technique
A novel self-adaptive trust region algorithm for unconstrained optimization
A new hybrid algorithm for convex nonlinear unconstrained optimization
Limited memory BFGS method based on a high-order tensor model
A nonmonotone hybrid conjugate gradient method for unconstrained optimization
A relaxed nonmonotone adaptive trust region method for solving unconstrained optimization problems
Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
A scaled three-term conjugate gradient method for unconstrained optimization
Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
A frame-based conjugate gradients direct search method with radial basis function interpolation model
The global convergence of a modified BFGS method for nonconvex functions
Hybrid nonmonotone spectral gradient method for the unconstrained minimization problem
Global convergence of a modified conjugate gradient method
A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
A novel hybrid trust region algorithm based on nonmonotone and LOOCV techniques
A diagonal quasi-Newton updating method for unconstrained optimization
An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
Gradient method with multiple damping for large-scale unconstrained optimization
A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
A nonmonotone trust-region line search method for large-scale unconstrained optimization
Nonlinear optimisation using directional step lengths based on RPROP
A conjugate directions approach to improve the limited-memory BFGS method
An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
Another hybrid conjugate gradient algorithm for unconstrained optimization
A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
A memory gradient method based on the nonmonotone technique
Two globally convergent nonmonotone trust-region methods for unconstrained optimization
New conjugate gradient method for unconstrained optimization
A New Hybrid Optimization Algorithm for the Estimation of Archie Parameters
Computer Algebra and Line Search
NEW ADAPTIVE BARZILAI–BORWEIN STEP SIZE AND ITS APPLICATION IN SOLVING LARGE-SCALE OPTIMIZATION PROBLEMS
A Modified Coordinate Search Method Based on Axes Rotation


Uses Software