minpack
From MaRDI portal
Software: 17450
swMATH: 5310 · MaRDI QID: Q17450 · FDO: Q17450
Author name not available
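MINPACK is a Fortran library of solvers for systems of nonlinear equations and nonlinear least-squares problems. As an illustration only (not part of this record), its `hybrd` solver is commonly reached through SciPy's `fsolve` wrapper; the example below assumes SciPy is installed:

```python
# Minimal sketch: scipy.optimize.fsolve wraps MINPACK's hybrd/hybrj
# routines for solving a square system of nonlinear equations.
from scipy.optimize import fsolve

def equations(v):
    # Residuals of the system: x + 2y = 2, x^2 + y^2 = 1.
    x, y = v
    return [x + 2 * y - 2, x ** 2 + y ** 2 - 1]

# Start the iteration from the (arbitrarily chosen) point (1, 1);
# fsolve returns a root where both residuals vanish.
root = fsolve(equations, [1.0, 1.0])
```

The starting point and the example system here are illustrative assumptions; MINPACK itself is plain Fortran 77 and can also be called directly.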
Cited In (showing first 100 items)
- \textsc{Gibbs2}: A new version of the quasi-harmonic model code. I. Robust treatment of the static data
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A trust-region method with improved adaptive radius for systems of nonlinear equations
- An efficient nonmonotone trust-region method for unconstrained optimization
- A BFGS trust-region method for nonlinear equations
- Limited memory BFGS method with backtracking for symmetric nonlinear equations
- A Subspace Study on Conjugate Gradient Algorithms
- Modifications of Newton's method to extend the convergence domain
- A conjugate gradient method with descent direction for unconstrained optimization
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A nonmonotone trust region method based on simple conic models for unconstrained optimization
- On restart procedures for the conjugate gradient method
- The convergence of conjugate gradient method with nonmonotone line search
- A nonmonotone trust region method with adaptive radius for unconstrained optimization problems
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- An improved trust region algorithm for nonlinear equations
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- A shooting algorithm for optimal control problems with singular arcs
- A truncated nonmonotone Gauss-Newton method for large-scale nonlinear least-squares problems
- A trust-region approach with novel filter adaptive radius for system of nonlinear equations
- A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems
- A modified PRP conjugate gradient method
- A new backtracking inexact BFGS method for symmetric nonlinear equations
- A hybrid trust region algorithm for unconstrained optimization
- Convergence rate of the trust region method for nonlinear equations under local error bound condition
- A nonmonotone adaptive trust region method for unconstrained optimization based on conic model
- Nonmonotone trust region algorithm for unconstrained optimization problems
- Direct analytical solution of a modified form of the meshing equations in two dimensions for non-conjugate gear contact
- Recognizing underlying sparsity in optimization
- CARTopt: a random search method for nonsmooth unconstrained optimization
- Scaled memoryless symmetric rank one method for large-scale optimization
- Optimal positioning of anodes and virtual sources in the design of cathodic protection systems using the method of fundamental solutions
- Tensor methods for large sparse systems of nonlinear equations
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- A class of nonmonotone stabilization trust region methods
- A nonmonotone trust region method for unconstrained optimization
- A new direct search method based on separable fractional interpolation model
- A class of parameter-free filled functions for box-constrained system of nonlinear equations
- A self-adaptive trust region method with line search based on a simple subproblem model
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A smoothing Newton method with Fischer-Burmeister function for second-order cone complementarity problems
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Combining trust-region techniques and Rosenbrock methods to compute stationary points
- A conic trust-region method and its convergence properties
- Optimization theory and methods. Nonlinear programming
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
- A new adaptive trust-region method for system of nonlinear equations
- Global convergence of the nonmonotone MBFGS method for nonconvex unconstrained minimization
- A conjugate gradient method for unconstrained optimization problems
- Algorithm 768: TENSOLVE
- A quasi-Gauss-Newton method for solving nonlinear algebraic equations
- On the worst-case evaluation complexity of non-monotone line search algorithms
- A nonmonotone second-order steplength method for unconstrained minimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Algorithm 829
- The convergence of quasi-Gauss-Newton methods for nonlinear problems
- Inexact Newton methods for solving nonsmooth equations
- A new hybrid stochastic approximation algorithm
- A modified hybrid conjugate gradient method for unconstrained optimization
- An unconstrained optimization method using nonmonotone second order Goldstein's line search
- QRAP: a numerical code for projected (Q)uasiparticle (RA)ndom (P)hase approximation
- A nonmonotone trust region method based on nonincreasing technique of weighted average of the successive function values
- New quasi-Newton methods for unconstrained optimization problems
- An effective trust-region-based approach for symmetric nonlinear systems
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A variant of trust-region methods for unconstrained optimization
- An adaptive approach of conic trust-region method for unconstrained optimization problems
- COCO: a platform for comparing continuous optimizers in a black-box setting
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- One side cut accelerated random search
- SDELab: A package for solving stochastic differential equations in MATLAB
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Bayesian inference for nonlinear structural time series models
- A modified projected conjugate gradient algorithm for unconstrained optimization problems
- MERLIN-3.0: A multidimensional optimization environment
- A cover partitioning method for bound constrained global optimization
- A survey of applications of the MFS to inverse problems
- A gentle introduction to Numerica
- On the limited memory BFGS method for large scale optimization
- Design and Analysis of Optimization Algorithms Using Computational Statistics
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- The TOMLAB NLPLIB toolbox for nonlinear programming
- Higher Order Predictors and Adaptive Steplength Control in Path Following Algorithms
- Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
- EFCOSS: an interactive environment facilitating optimal experimental design
- Numerical experiments with the Lancelot package (Release \(A\)) for large-scale nonlinear optimization
- OPTAC: A portable software package for analyzing and comparing optimization methods by visualization
- Constrained dogleg methods for nonlinear systems with simple bounds
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A coordinate gradient descent method for nonsmooth separable minimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Minimum time control of the rocket attitude reorientation associated with orbit dynamics
- An evaluation of back-propagation neural networks for the optimal design of structural systems. I: Training procedures
- An evaluation of back-propagation neural networks for the optimal design of structural systems. II: Numerical evaluation
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A hybrid conjugate gradient method with descent property for unconstrained optimization
This page was built for software: minpack