Optimization theory and methods. Nonlinear programming

From MaRDI portal
Publication:2500511


DOI: 10.1007/b106451
zbMath: 1129.90002
MaRDI QID: Q2500511

Wen-Yu Sun, Ya-Xiang Yuan

Publication date: 17 August 2006

Published in: Springer Optimization and Its Applications

Full work available at URL: https://doi.org/10.1007/b106451


Mathematics Subject Classification

90C30: Nonlinear programming

90C20: Quadratic programming

90C53: Methods of quasi-Newton type

90-01: Introductory exposition (textbooks, tutorial papers, etc.) pertaining to operations research and mathematical programming

90C52: Methods of reduced gradient type

90C55: Methods of successive quadratic programming type
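Several of the classification codes above (notably 90C53, quasi-Newton methods) refer to the algorithmic core of the book. As a minimal illustration only, and not code from the book itself, the following sketch applies a textbook BFGS inverse-Hessian update with Armijo backtracking to the Rosenbrock test function; all function names and parameter values here are the editor's assumptions.

```python
# Minimal BFGS sketch (pure Python, illustrative only).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def mat_vec(M, v):
    return [dot(row, v) for row in M]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Textbook BFGS with Armijo backtracking line search."""
    n = len(x0)
    x = [float(v) for v in x0]
    # H approximates the inverse Hessian; start from the identity.
    H = [[float(i == j) for j in range(n)] for i in range(n)]
    g = grad(x)
    for _ in range(max_iter):
        if max(abs(gi) for gi in g) < tol:
            break
        p = [-pi for pi in mat_vec(H, g)]        # quasi-Newton direction
        t, gTp = 1.0, dot(g, p)
        # Armijo backtracking: halve the step until sufficient decrease.
        while f([xi + t * pi for xi, pi in zip(x, p)]) > f(x) + 1e-4 * t * gTp:
            t *= 0.5
        s = [t * pi for pi in p]
        x_new = [xi + si for xi, si in zip(x, s)]
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        sy = dot(s, y)
        if sy > 1e-12:                           # curvature condition guard
            rho = 1.0 / sy
            A = [[(1.0 if i == j else 0.0) - rho * s[i] * y[j]
                  for j in range(n)] for i in range(n)]
            B = [[(1.0 if i == j else 0.0) - rho * y[i] * s[j]
                  for j in range(n)] for i in range(n)]
            # H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T
            H = mat_mul(mat_mul(A, H), B)
            H = [[H[i][j] + rho * s[i] * s[j] for j in range(n)] for i in range(n)]
        x, g = x_new, g_new
    return x

# Rosenbrock test problem: minimum at (1, 1).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: [-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                        200 * (x[1] - x[0]**2)]
x_star = bfgs(rosen, rosen_grad, [-1.2, 1.0])
```

The curvature guard skips the update when \(s^\top y\) is not safely positive, which keeps H positive definite and the search direction a descent direction.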


Related Items

A feasible method for optimization with orthogonality constraints, A trust-region method with improved adaptive radius for systems of nonlinear equations, An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property, A hybrid optimization method for multiplicative noise and blur removal, Robust ridge estimator in restricted semiparametric regression models, Two new decomposition algorithms for training bound-constrained support vector machines, A nonmonotone trust region method based on simple conic models for unconstrained optimization, Trust region algorithm with two subproblems for bound constrained problems, A modified scaling parameter for the memoryless BFGS updating formula, A smoothing SQP framework for a class of composite \(L_q\) minimization over polyhedron, On Newton's method for the Fermat-Weber location problem, A Barzilai-Borwein conjugate gradient method, A cone constrained convex program: structure and algorithms, Two modified scaled nonlinear conjugate gradient methods, Two modifications of the method of the multiplicative parameters in descent gradient methods, Decentralized swarm coordination: a combined coverage/connectivity approach, A filter algorithm for nonlinear systems of equalities and inequalities, A note on ``A new iteration method for the matrix equation \(AX=B\)'', A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei, An alternating structured trust region algorithm for separable optimization problems with nonconvex constraints, A new nonlinear filter constructed from the Newton method and EPR in image restoration, On convergence analysis of a derivative-free trust region algorithm for constrained optimization with separable structure, Two modified three-term conjugate gradient methods with sufficient descent property, A new approximation of the matrix rank function and its application to matrix rank minimization, On the sufficient descent 
condition of the Hager-Zhang conjugate gradient methods, A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems, On the convergence and worst-case complexity of trust-region and regularization methods for unconstrained optimization, An adaptive conjugate gradient algorithm for large-scale unconstrained optimization, A combined SQP-IPM algorithm for solving large-scale nonlinear optimization problems, On solving L-SR1 trust-region subproblems, On optimality of two adaptive choices for the parameter of Dai-Liao method, A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update, Reconstructing local volatility using total variation, A modified three-term conjugate gradient method with sufficient descent property, A hybrid trust region algorithm for unconstrained optimization, A reduced Hessian algorithm with line search filter method for nonlinear programming, A nonmonotone globalization algorithm with preconditioned gradient path for unconstrained optimization, Nonmonotone second-order Wolfe's line search method for unconstrained optimization problems, An equivalency condition of nonsingularity in nonlinear semidefinite programming, An improved trust region algorithm for nonlinear equations, New quasi-Newton methods via higher order tensor models, An improved multi-step gradient-type method for large scale optimization, Two effective hybrid conjugate gradient algorithms based on modified BFGS updates, On Nesterov's nonsmooth Chebyshev-Rosenbrock functions, A feasible direction method for the semidefinite program with box constraints, Parameters estimation for a new anomalous thermal diffusion model in layered media, Cross-Hill: a heuristic method for global optimization, An accelerated nonmonotone trust region method with adaptive trust region for unconstrained optimization, A quadratic penalty method for hypergraph matching, An adaptive scaled BFGS method for unconstrained 
optimization, A new subspace correction method for nonlinear unconstrained convex optimization problems, A filter-line-search method for unconstrained optimization, A modified nonmonotone trust region line search method, A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique, A nonmonotonic trust region algorithm for a class of semi-infinite minimax programming, A nonmonotone supermemory gradient algorithm for unconstrained optimization, A subspace version of the Powell-Yuan trust-region algorithm for equality constrained optimization, Visual MISER: an efficient user-friendly visual program for solving optimal control problems, A framework of constraint preserving update schemes for optimization on Stiefel manifold, A modified BFGS algorithm based on a hybrid secant equation, The research on the properties of Fourier matrix and bent function, A type of modified BFGS algorithm with any rank defects and the local \(Q\)-superlinear convergence properties, On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae, Optimal radiation fractionation for low-grade gliomas: insights from a mathematical model, An accelerated double step size model in unconstrained optimization, A dwindling filter line search algorithm for nonlinear equality constrained optimization, Application of variable-fidelity models to aerodynamic optimization, Optimization design of an explicitly defined rack for the generation of rotors for twin-screw compressors, A short note on the Q-linear convergence of the steepest descent method, A dimension-reduced method of sensitivity analysis for stochastic user equilibrium assignment model, Two new conjugate gradient methods based on modified secant equations, A second-order pseudo-transient method for steady-state problems, A modified Newton's method for best rank-one approximation to tensors, A new method for parameter estimation of edge-preserving regularization in image restoration, 
Convergence of gradient method for Elman networks, A filter trust region method for solving semi-infinite programming problems, Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization, A descent algorithm without line search for unconstrained optimization, On filter-successive linearization methods for nonlinear semidefinite programming, A seminorm regularized alternating least squares algorithm for canonical tensor decomposition, On the global convergence of a projective trust region algorithm for nonlinear equality constrained optimization, Error estimates for the simplified iteratively regularized Gauss-Newton method in Banach spaces under a Morozov-type stopping rule, Numerical treatment of nonlinear MHD Jeffery-Hamel problems using stochastic algorithms, Two accelerated nonmonotone adaptive trust region line search methods, Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme, A stochastic level-value estimation method for global optimization, A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems, A double parameter scaled BFGS method for unconstrained optimization, An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix, A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods, A regularized Newton method for computing ground states of Bose-Einstein condensates, Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems, On the worst-case evaluation complexity of non-monotone line search algorithms, Cubic interpolation: a line search technique for fuzzy optimization problems, Ensemble preconditioning for Markov chain Monte Carlo simulation, POD/DEIM reduced-order modeling of time-fractional partial differential equations with applications in parameter identification, A structured diagonal Hessian 
approximation method with evaluation complexity analysis for nonlinear least squares, A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization, Improved optimization methods for image registration problems, Accelerated double direction method for solving unconstrained optimization problems, Total variation image restoration method based on subspace optimization, A nonmonotone weighting self-adaptive trust region algorithm for unconstrained nonconvex optimization, Quasi-Newton methods for multiobjective optimization problems, A note on a multiplicative parameters gradient method, A high-order modified Levenberg-Marquardt method for systems of nonlinear equations with fourth-order convergence, A new derivative-free SCG-type projection method for nonlinear monotone equations with convex constraints, A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues, A mixture of nuclear norm and matrix factorization for tensor completion, A biobjective approach to recoverable robustness based on location planning, A new supermemory gradient method for unconstrained optimization problems, A new family of conjugate gradient methods for unconstrained optimization, Two adaptive Dai-Liao nonlinear conjugate gradient methods, Quadratic interpolation technique to minimize univariable fuzzy functions, A novel diffeomorphic model for image registration and its algorithm, A family of iterative methods for computing Moore-Penrose inverse of a matrix, A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods, On the sufficient descent property of the Shanno's conjugate gradient method, A perfect example for the BFGS method, Long range search for maximum likelihood in exponential families, A globally convergent filter-type trust region method for semidefinite programming, Asymptotic surrogate constraint method and its convergence for a class of semi-infinite programming, Simplified 
iteratively regularized Gauss-Newton method in Banach spaces under a general source condition, Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations, Hybridization of accelerated gradient descent method, Uniqueness and numerical scheme for the Robin coefficient identification of the time-fractional diffusion equation, Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization, Modified inexact Levenberg-Marquardt methods for solving nonlinear least squares problems, Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length, A derivative-free trust-region algorithm for composite nonsmooth optimization, A nonmonotone PRP conjugate gradient method for solving square and under-determined systems of equations, Riemannian conjugate gradient methods with inverse retraction, A generalized worst-case complexity analysis for non-monotone line searches, Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions, Gauss-Newton-type methods for bilevel optimization, Explicit pseudo-transient continuation and the trust-region updating strategy for unconstrained optimization, An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing, Image restoration from noisy incomplete frequency data by alternative iteration scheme, A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations, A class of accelerated conjugate-gradient-like methods based on a modified secant equation, A double parameter self-scaling memoryless BFGS method for unconstrained optimization, A modified scaled memoryless symmetric rank-one method, Complex Golay pairs up to length 28: a search via computer algebra and programmatic SAT, Descent Perry conjugate gradient methods for systems of monotone nonlinear equations, A brief 
introduction to manifold optimization, A survey of gradient methods for solving nonlinear optimization, A QSC method for fractional subdiffusion equations with fractional boundary conditions and its application in parameters identification, A modified ODE-based algorithm for unconstrained optimization problems, A nonmonotone trust region method based on simple quadratic models, A new spectral method for \(l_1\)-regularized minimization, Nonmonotone adaptive trust region method based on simple conic model for unconstrained optimization, Essential issues on solving optimal power flow problems using soft-computing, A brief survey of methods for solving nonlinear least-squares problems, A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations, A class of derivative-free CG projection methods for nonsmooth equations with an application to the LASSO problem, A comparison of general-purpose optimization algorithms for finding optimal approximate experimental designs, Robust Schatten-\(p\) norm based approach for tensor completion, Convergence analyses on sparse feedforward neural networks via group lasso regularization, A heuristic approach to combat multicollinearity in least trimmed squares regression analysis, Numerical inversion of the fractional derivative index and surface thermal flux for an anomalous heat conduction model in a multi-layer medium, An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation, A stochastic trust region method for unconstrained optimization problems, Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization, Numerical construction of spherical \(t\)-designs by Barzilai-Borwein method, Profit-based churn prediction based on minimax probability machines, A derivative-free algorithm for spherically constrained optimization, An improved nonmonotone adaptive trust region method, 
Stable Lévy diffusion and related model fitting, A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function, Applying Gröbner basis method to multiparametric polynomial nonlinear programming, A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems, An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix, A novel self-adaptive trust region algorithm for unconstrained optimization, A new hybrid algorithm for convex nonlinear unconstrained optimization, Limited memory BFGS method based on a high-order tensor model, A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition, A modified two steps Levenberg-Marquardt method for nonlinear equations, A wedge trust region method with self-correcting geometry for derivative-free optimization, Structure learning of Bayesian networks using global optimization with applications in data classification, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, Computing minimum norm solution of linear systems of equations by the generalized Newton method, A new simple model trust-region method with generalized Barzilai-Borwein parameter for large-scale optimization, A new robust line search technique based on Chebyshev polynomials, Numerical research on the sensitivity of nonmonotone trust region algorithms to their parameters, On the iteratively regularized Gauss-Newton method in Banach spaces with applications to parameter identification problems, Global convergence and the Powell singular function, A preconditioned descent algorithm for variational inequalities of the second kind involving the \(p\)-Laplacian operator, Trace-penalty minimization for large-scale eigenspace computation, A descent extension of the 
Polak-Ribière-Polyak conjugate gradient method, Robust time-domain output error method for identifying continuous-time systems with time delay, Some iterative methods for the largest positive definite solution to a class of nonlinear matrix equation, Maximum penalized likelihood estimation of additive hazards models with partly interval censoring, Convergence of a stabilized SQP method for equality constrained optimization, A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions, A modified trust region method with Beale's PCG technique for optimization, Analysis on a superlinearly convergent augmented Lagrangian method, A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization, Local convergence of quasi-Newton methods under metric regularity, An improved inversion-free method for solving the matrix equation \(X + A^\ast X^{-{\alpha}}A = Q\), Gradient methods for computing the Drazin-inverse solution, An unconstrained optimization method using nonmonotone second order Goldstein's line search, Minimizing a Symmetric Quasiconvex Function on a Two-Dimensional Lattice, The Fiedler Vector of a Laplacian Tensor for Hypergraph Partitioning, A New First-Order Algorithmic Framework for Optimization Problems with Orthogonality Constraints, Optimal Control, Dichotomy, and Closed Range, Feasible robust estimator in restricted semiparametric regression models based on the LTS approach, An adaptive nonmonotone trust region algorithm, A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice, Low rank updates in preconditioning the saddle point systems arising from data assimilation problems, Convergence analysis of simplified iteratively regularized Gauss–Newton method in a Banach space setting, Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization, MATRIX ANALYSES ON THE DAI–LIAO CONJUGATE GRADIENT METHOD, On 
solving a class of linear semi-infinite programming by SDP method, MIN-MAX SOLUTIONS FOR PARAMETRIC CONTINUOUS STATIC GAME UNDER ROUGHNESS (PARAMETERS IN THE COST FUNCTION AND FEASIBLE REGION IS A ROUGH SET), Least-trimmed squares: asymptotic normality of robust estimator in semiparametric regression models, Orthogonal canonical correlation analysis and applications, Quantum circuit design for accurate simulation of qudit channels, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, Extended least trimmed squares estimator in semiparametric regression models with correlated errors, Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints, Nonmonotone conic trust region method with line search technique for bound constrained optimization, An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems, Comments on “A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter”, Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination, A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function, Accelerating the modified Levenberg-Marquardt method for nonlinear equations, On the Local and Superlinear Convergence of a Parameterized DFP Method, Inhomogeneous polynomial optimization over a convex set: An approximation approach, A dwindling filter line search method for unconstrained optimization, A Filter Active-Set Algorithm for Ball/Sphere Constrained Optimization Problem, A descent family of Dai–Liao conjugate gradient methods, Accelerated gradient descent methods with line search, Calculating the normalising constant of the Bingham distribution on the sphere using the holonomic gradient method, Complexity and performance 
of an Augmented Lagrangian algorithm, Gravity-magnetic cross-gradient joint inversion by the cyclic gradient method, New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters, An infeasible interior-point algorithm with full-Newton step for linear optimization, Numerical multilinear algebra and its applications, Successive unconstrained dual optimization method for rank-one approximation to tensors, An ODE-based nonmonotone method for unconstrained optimization problems, The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices, A parallel line search subspace correction method for composite convex optimization, A new method of moving asymptotes for large-scale unconstrained optimization, A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update, A memory gradient method based on the nonmonotone technique, An efficient inexact Newton-CG algorithm for the smallest enclosing ball problem of large dimensions, Mollifier smoothing of \(C^0\)-Finsler structures, A subspace version of the Wang-Yuan augmented Lagrangian-trust region method for equality constrained optimization, A quasi fractional order gradient descent method with adaptive stepsize and its application in system identification, A new fractional Chebyshev FDM: an application for solving the fractional differential equations generated by optimisation problem, Fast Finite Difference Approximation for Identifying Parameters in a Two-dimensional Space-fractional Nonlocal Model with Variable Diffusivity Coefficients, On Convergence Rates of Linearized Proximal Algorithms for Convex Composite Optimization with Applications, A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method, A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization, AN ADAPTIVE CONJUGACY CONDITION AND RELATED NONLINEAR CONJUGATE GRADIENT METHODS, Computer Algebra and Line Search, 
The modified Levenberg-Marquardt method for nonlinear equations with cubic convergence, NEW ADAPTIVE BARZILAI–BORWEIN STEP SIZE AND ITS APPLICATION IN SOLVING LARGE-SCALE OPTIMIZATION PROBLEMS, The d-Level Nested Logit Model: Assortment and Price Optimization Problems, Two optimal Dai–Liao conjugate gradient methods, The Action Gambler and Equal-Sized Wagering


Uses Software