Derivative-free optimization algorithms. These algorithms do not require gradient information and, more importantly, can be applied to non-smooth optimization problems.
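To illustrate why such methods work on non-smooth objectives, here is a minimal pattern-search sketch in Python (a simplified, hypothetical illustration in the spirit of Hooke-Jeeves, not dfoptim's R implementation): it only compares function values at probe points, so it never needs a gradient and is untroubled by kinks in the objective.

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, shrink=0.5):
    """Minimal coordinate pattern search (illustrative sketch).

    Probes +/- step along each axis and accepts any improvement;
    when no probe improves, the step size is shrunk. Uses only
    function evaluations -- no derivatives required.
    """
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step *= shrink
    return x, fx


# A non-smooth objective (kink at the minimizer (1, -2)):
objective = lambda v: abs(v[0] - 1.0) + abs(v[1] + 2.0)
xmin, fmin = pattern_search(objective, [5.0, 5.0])
```

Because the method ranks candidate points by value alone, the non-differentiability at the solution causes no difficulty, whereas a gradient-based solver would stall or require subgradient machinery.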
Cited in (34)
- atRisk
- Optimal design for adaptive smoothing splines
- Practical initialization of the Nelder-Mead method for computationally expensive optimization problems
- Proper initialization is crucial for the Nelder-Mead simplex search
- Empirical likelihood test for diagonal symmetry
- A jackknife empirical likelihood approach for testing the homogeneity of \(K\) variances
- A penalized simulated maximum likelihood approach in parameter estimation for stochastic differential equations
- Sklar's omega: a Gaussian copula-based framework for assessing agreement
- nloptr
- nlmrt
- adagio
- neldermead
- calibrar
- MORET
- DynTxRegime
- sklarsomega
- dfoalgos
- ODsplines
- mvord
- Noisy kriging-based optimization methods: a unified implementation within the DiceOptim package
- krippendorffsalpha
- reReg
- cops
- stops
- foreSIGHT
- hyperbrick
- garma
- CSTE
- npcs
- ConsReg
- stepPenal
- diffusion
- matrisk
- GeoModels
This page was built for software: dfoptim