scientific article

From MaRDI portal
Publication:3539529

zbMath 1161.90486 · MaRDI QID: Q3539529

Neculai Andrei

Publication date: 18 November 2008

Full work available at URL: http://www.ici.ro/camo/journal/v10n1.htm

Title: not shown (zbMATH Open Web Interface contents unavailable due to conflicting licenses).



Related Items

An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property, Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems, Efficient tridiagonal preconditioner for the matrix-free truncated Newton method, Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization, New hybrid conjugate gradient method for unconstrained optimization, Spectral method and its application to the conjugate gradient method, Scaling damped limited-memory updates for unconstrained optimization, A new adaptive trust region algorithm for optimization problems, Dynamic scaling on the limited memory BFGS method, Best practices for comparing optimization algorithms, The inexact-Newton via GMRES subspace method without line search technique for solving symmetric nonlinear equations, Projected affine-scaling interior-point Newton's method with line search filter for box constrained optimization, A new adaptive Barzilai and Borwein method for unconstrained optimization, On the performance of a new symmetric rank-one method with restart for solving unconstrained optimization problems, An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search, A transformation of accelerated double step size method for unconstrained optimization, A spectral dai-yuan-type conjugate gradient method for unconstrained optimization, An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems, Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search, Scaling on the spectral gradient method, Scaled diagonal gradient-type method with extra update for large-scale unconstrained optimization, A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, A new two-step 
gradient-type method for large-scale unconstrained optimization, A nonmonotone line search slackness technique for unconstrained optimization, Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization, A double parameter scaled BFGS method for unconstrained optimization, An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix, A new variant of the memory gradient method for unconstrained optimization, A new class of nonlinear conjugate gradient coefficients with global convergence properties, The convergence rate of a restart MFR conjugate gradient method with inexact line search, Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems, On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians, An adaptive trust region method based on simple conic models, An efficient nonmonotone trust-region method for unconstrained optimization, A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems, Monotone and nonmonotone trust-region-based algorithms for large scale unconstrained optimization problems, New quasi-Newton methods via higher order tensor models, Improved Hessian approximation with modified secant equations for symmetric rank-one method, An accelerated double step size model in unconstrained optimization, \(n\)-step quadratic convergence of the MPRP method with a restart strategy, Two modifications of the method of the multiplicative parameters in descent gradient methods, An improved multi-step gradient-type method for large scale optimization, Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization, A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions, An accelerated subspace minimization 
three-term conjugate gradient algorithm for unconstrained optimization, Accelerated double direction method for solving unconstrained optimization problems, A symmetric rank-one method based on extra updating techniques for unconstrained optimization, Hybrid modification of accelerated double direction method, A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models, A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence, A nonmonotone weighting self-adaptive trust region algorithm for unconstrained nonconvex optimization, The hybrid BFGS-CG method in solving unconstrained optimization problems, A new method with sufficient descent property for unconstrained optimization, Scalar correction method for solving large scale unconstrained minimization problems, Descent line search scheme using Geršgorin circle theorem, An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization, New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, A note on a multiplicative parameters gradient method, A partitioned PSB method for partially separable unconstrained optimization problems, A nonmonotone filter line search technique for the MBFGS method in unconstrained optimization, A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems, A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches, Cross-Hill: a heuristic method for global optimization, Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence, Conjugate gradient path method without line search technique for derivative-free unconstrained optimization, A class of diagonal quasi-Newton methods for 
large-scale convex minimization, An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables, A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues, Global convergence property of scaled two-step BFGS method, Hybridization of accelerated gradient descent method, An accelerated nonmonotone trust region method with adaptive trust region for unconstrained optimization, An adaptive scaled BFGS method for unconstrained optimization, A new trust-region method for solving systems of equalities and inequalities, Global optimization test problems based on random field composition, A hybrid ODE-based method for unconstrained optimization problems, Accumulative approach in multistep diagonal gradient-type method for large-scale unconstrained optimization, Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization, A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem, Two new conjugate gradient methods based on modified secant equations, Some nonlinear conjugate gradient methods based on spectral scaling secant equations, A nonmonotone PRP conjugate gradient method for solving square and under-determined systems of equations, A filter-line-search method for unconstrained optimization, New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization, Some three-term conjugate gradient methods with the inexact line search condition, On three-term conjugate gradient algorithms for unconstrained optimization, A spectral three-term Hestenes-Stiefel conjugate gradient method, A nonmonotone supermemory gradient algorithm for unconstrained optimization, A conjugate gradient method with sufficient descent property, Two nonmonotone trust region algorithms based on an improved Newton method, A limited memory \(q\)-BFGS algorithm for 
unconstrained optimization problems, A two-step improved Newton method to solve convex unconstrained optimization problems, Two limited-memory optimization methods with minimum violation of the previous secant conditions, Adaptive scaling damped BFGS method without gradient Lipschitz continuity, On the bang-bang control approach via a component-wise line search strategy for unconstrained optimization, A new gradient method via quasi-Cauchy relation which guarantees descent, Hybrid conjugate gradient algorithm for unconstrained optimization, Acceleration of conjugate gradient algorithms for unconstrained optimization, Least-squares-based three-term conjugate gradient methods, A modified bat algorithm with conjugate gradient method for global optimization, An improvement of adaptive cubic regularization method for unconstrained optimization problems, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems, A memory gradient method based on the nonmonotone technique, A scaled three-term conjugate gradient method for unconstrained optimization, A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization, Two globally convergent nonmonotone trust-region methods for unconstrained optimization, New hybrid conjugate gradient method as a convex combination of LS and FR methods, Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization, A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization, A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, A new family of hybrid three-term conjugate gradient methods with applications in image restoration, A frame-based 
conjugate gradients direct search method with radial basis function interpolation model, The global convergence of a modified BFGS method for nonconvex functions, Hybrid nonmonotone spectral gradient method for the unconstrained minimization problem, Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization, Global convergence of a modified conjugate gradient method, A method for global minimization of functions using the Krawczyk operator, Globally convergence of nonlinear conjugate gradient method for unconstrained optimization, Two new conjugate gradient methods for unconstrained optimization, The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique, A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization, A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization, A novel hybrid trust region algorithm based on nonmonotone and LOOCV techniques, A diagonal quasi-Newton updating method for unconstrained optimization, Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions, An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization, New conjugate gradient method for unconstrained optimization, Gradient method with multiple damping for large-scale unconstrained optimization, A new modified three-term conjugate gradient method with sufficient descent property and its global convergence, Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions, A New Hybrid Optimization Algorithm for the Estimation of Archie Parameters, The projection technique for two open problems of unconstrained optimization problems, A nonmonotone trust-region line search method for large-scale unconstrained 
optimization, A note on hybridization process applied on transformed double step size model, A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property, A spectral conjugate gradient method for solving large-scale unconstrained optimization, A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization, Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems, A harmonic framework for stepsize selection in gradient methods, Nonlinear optimisation using directional step lengths based on RPROP, A conjugate directions approach to improve the limited-memory BFGS method, A survey of gradient methods for solving nonlinear optimization, An inexact line search approach using modified nonmonotone strategy for unconstrained optimization, Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm, A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization, A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice, A novel value for the parameter in the Dai-Liao-type conjugate gradient method, Accelerated diagonal gradient-type method for large-scale unconstrained optimization, A Modified Coordinate Search Method Based on Axes Rotation, A class of accelerated subspace minimization conjugate gradient methods, Block BFGS Methods, A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization, An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction, A regularized limited memory BFGS method for nonconvex unconstrained minimization, Another hybrid conjugate gradient algorithm for unconstrained optimization, Nonmonotone adaptive trust region method based on simple conic model for unconstrained optimization, Accelerated 
scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems, INITIAL IMPROVEMENT OF THE HYBRID ACCELERATED GRADIENT DESCENT PROCESS, The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search, New hybrid conjugate gradient method as a convex combination of LS and CD methods, A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition, Two modified DY conjugate gradient methods for unconstrained optimization problems, Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization, New investigation for the Liu-Story scaled conjugate gradient method for nonlinear optimization, Some three-term conjugate gradient methods with the new direction structure, A class of globally convergent three-term Dai-Liao conjugate gradient methods, Computer Algebra and Line Search, An improved nonmonotone adaptive trust region method., The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions, A modified spectral conjugate gradient method with global convergence, A Derivative-Free Method for Structured Optimization Problems, A new steepest descent method with global convergence properties, A modification of classical conjugate gradient method using strong Wolfe line search, A modified form of conjugate gradient method for unconstrained optimization problems, A new classical conjugate gradient coefficient with exact line search, A new type of descent conjugate gradient method with exact line search, Global convergence of a descent PRP type conjugate gradient method for nonconvex optimization, Two-phase quasi-Newton method for unconstrained optimization problem, A subspace minimization conjugate gradient method based on conic 
model for unconstrained optimization, Two improved nonlinear conjugate gradient methods with the strong Wolfe line search, An improved hybrid-ORBIT algorithm based on point sorting and MLE technique, NEW ADAPTIVE BARZILAI–BORWEIN STEP SIZE AND ITS APPLICATION IN SOLVING LARGE-SCALE OPTIMIZATION PROBLEMS, A three-term conjugate gradient method with accelerated subspace quadratic optimization, A novel self-adaptive trust region algorithm for unconstrained optimization, A new hybrid algorithm for convex nonlinear unconstrained optimization, MULTIPLE USE OF BACKTRACKING LINE SEARCH IN UNCONSTRAINED OPTIMIZATION, Limited memory BFGS method based on a high-order tensor model, A nonmonotone hybrid conjugate gradient method for unconstrained optimization, An Efficient Mixed Conjugate Gradient Method for Solving Unconstrained Optimisation Problems, Two classes of spectral conjugate gradient methods for unconstrained optimizations, Two spectral conjugate gradient methods for unconstrained optimization problems, An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems, A relaxed nonmonotone adaptive trust region method for solving unconstrained optimization problems, An interior point parameterized central path following algorithm for linearly constrained convex programming, An affine scaling interior trust-region method combining with nonmonotone line search filter technique for linear inequality constrained minimization, Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization, Correction of trust region method with a new modified Newton method, A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization, Research on three-step accelerated gradient algorithm in deep learning, Global convergence of a
new sufficient descent spectral three-term conjugate gradient class for large-scale optimization, Hybridization rule applied on accelerated double step size optimization scheme, Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising, A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations, A homogeneous Rayleigh quotient with applications in gradient methods, An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems, A nonlinear conjugate gradient method using inexact first-order information, Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search, Two diagonal conjugate gradient like methods for unconstrained optimization, Gradient-based descent linesearch to solve interval-valued optimization problems under gH-differentiability with application to finance, Two families of hybrid conjugate gradient methods with restart procedures and their applications, A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization, A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems, A modified PRP-type conjugate gradient algorithm with complexity analysis and its application to image restoration problems, Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization, A hybrid BB-type method for solving large scale unconstrained optimization, A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems, Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in
image processing, A fuzzy particle swarm optimization method with application to shape design problem, A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems, Collinear gradients method for minimizing smooth functions. Optimality conditions and algorithms, A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems, Global convergence properties of the BBB conjugate gradient method, Worst-case evaluation complexity of a derivative-free quadratic regularization method, An efficient new hybrid CG-method as convex combination of DY and CD and HS algorithms, The regularization continuation method for optimization problems with nonlinear equality constraints, An adaptive projection BFGS method for nonconvex unconstrained optimization problems, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, Tuning Algorithms for Stochastic Black-Box Optimization: State of the Art and Future Perspectives, Accelerated gradient descent methods with line search, On the final steps of Newton and higher order methods, An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimisation, Accelerated multiple step-size methods for solving unconstrained optimization problems


Uses Software