MINPACK-2
From MaRDI portal
Software: 16919
swMATH: 4751 · MaRDI QID: Q16919 · FDO: Q16919
Author name not available
Cited In (59)
- TIME-PARALLEL COMPUTATION OF PSEUDO-ADJOINTS FOR A LEAPFROG SCHEME
- Title not available
- On sequential and parallel non-monotone derivative-free algorithms for box constrained optimization
- Adaptive sequencing of primal, dual, and design steps in simulation based optimization
- Title not available
- A family of stochastic programming test problems based on a model for tactical manpower planning
- Benchmarking nonlinear optimization software in technical computing environments
- A reduced proximal-point homotopy method for large-scale non-convex BQP
- The Efficient Computation of Sparse Jacobian Matrices Using Automatic Differentiation
- Triangular decomposition of CP factors of a third-order tensor with application to solving nonlinear systems of equations
- Tensor Methods for Large, Sparse Nonlinear Least Squares Problems
- Implementation of Partial Separability in a Source-to-Source Transformation AD Tool
- Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
- Tensor methods for large sparse systems of nonlinear equations
- ADiJaC – Automatic Differentiation of Java Classfiles
- PAL-Hom method for QP and an application to LP
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
- Efficient Derivative Codes through Automatic Differentiation and Interface Contraction: An Application in Biostatistics
- Source Transformation for MATLAB Automatic Differentiation
- Recent Advances in Bound Constrained Optimization
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Inexact Block Jacobi–Broyden Methods for Solving Nonlinear Systems of Equations
- Optimality-preserving elimination of linearities in Jacobian accumulation
- Best practices for comparing optimization algorithms
- Algorithm 984
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- How difficult is nonlinear optimization? A practical solver tuning approach, with illustrative results
- Adapting derivative free optimization methods to engineering models with discrete variables
- Flexible complementarity solvers for large-scale applications
- Sensitivity Analysis and Computations of the Time Relaxation Model
- Numerical methods for optimization problems in water flow and reactive solute transport processes of xenobiotics in soils
- A computational study of global optimization solvers on two trust region subproblems
- Implementation of sparse forward mode automatic differentiation with application to electromagnetic shape optimization
- Title not available
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Computing Large Sparse Jacobian Matrices Using Automatic Differentiation
- An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization
- Stopping rules and backward error analysis for bound-constrained optimization
- Nonlinear optimization applications using the GAMS technology
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization
- Title not available
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- On three-term conjugate gradient algorithms for unconstrained optimization
- A New Active Set Algorithm for Box Constrained Optimization
- Compact sparse symbolic Jacobian computation in large systems of ODEs
- Hierarchical approaches to automatic differentiation
- Efficient computation of gradients and Jacobians by dynamic exploitation of sparsity in automatic differentiation
- Computing Gradients in Large-Scale Optimization Using Automatic Differentiation
- Evaluation of Large-scale Optimization Problems on Vector and Parallel Architectures
- Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
- Numerical experiments with the Lancelot package (Release A) for large-scale nonlinear optimization
- A globally convergent inexact Newton method with a new choice for the forcing term
- Comparative assessment of algorithms and software for global optimization
- Jacobian code generated by source transformation and vertex elimination can be as efficient as hand-coding
- Automatic differentiation and spectral projected gradient methods for optimal control problems
- A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization
This page was built for software: MINPACK-2