Optimization algorithm based on densification and dynamic canonical descent (Q2488893)

From MaRDI portal
Property / DOI: 10.1016/j.cam.2005.07.023
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1016/j.cam.2005.07.023
Property / OpenAlex ID: W1990537004
Property / Wikidata QID: Q57702424
Property / cites work: Q4209222
Property / cites work: Q5312537
Property / cites work: Q4257215
Property / cites work: Q4243098
Property / cites work: \(\alpha\)-dense curves and global optimization
Property / cites work: Q4692508
Property / cites work: Optimization by Simulated Annealing
Property / cites work: Characterization and generation of \(\alpha\)-dense curves


scientific article

Language: English
Label: Optimization algorithm based on densification and dynamic canonical descent
Description: scientific article

    Statements

    Optimization algorithm based on densification and dynamic canonical descent (English)
    16 May 2006
    The authors propose a fast and efficient derivative-free global optimization algorithm. The search domain \(X\) is assumed to be compact and convex, and the objective function \(f(x)\) to be continuous almost everywhere on \(X\). The algorithm approximates a minimizer of \(f\) by cycling through the coordinate directions and, at each iteration, locating a point along the current coordinate direction that improves on the previous iterate. In addition, the number of variables of the original problem is reduced by means of densification curves, which allow the search space to be explored to the same precision with fewer function evaluations. A new approach to variable reduction without any differentiability assumption is proposed; it is based on a multidimensional rather than a one-dimensional reduction, i.e., the authors reduce a multivariable function to a new multivariable function with far fewer variables than the original one. Finally, the performance of the method is tested on several well-known optimization problems and compared with competing methods such as simulated annealing and genetic algorithms. The numerical experiments show faster convergence for the proposed algorithm, owing to a smaller number of function evaluations.
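    The review describes two ingredients that lend themselves to a short illustration: a cyclic, derivative-free search along the coordinate directions, and a densification (\(\alpha\)-dense) curve that trades dimensionality for curve length. The Python sketch below only illustrates these generic ideas; the function names, the cosine-based curve, and all parameter values are choices made for the example and are not taken from the paper, whose dynamic canonical descent rule and multidimensional reduction differ in their details.

    import numpy as np

    def cyclic_coordinate_search(f, lower, upper, x0, n_points=50, n_cycles=20):
        # Derivative-free search that cycles through the coordinate directions,
        # keeping any sampled point that improves the objective. Generic sketch,
        # not the authors' exact dynamic canonical descent rule.
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(n_cycles):
            improved = False
            for i in range(len(x)):
                # Sample candidate values along the i-th coordinate only.
                for t in np.linspace(lower[i], upper[i], n_points):
                    trial = x.copy()
                    trial[i] = t
                    ft = f(trial)
                    if ft < fx:              # accept strict improvements only
                        x, fx = trial, ft
                        improved = True
            if not improved:                 # no coordinate gave progress: stop
                break
        return x, fx

    def alpha_dense_curve(t, n_dim, omega=20.0):
        # Cosine-based curve mapping t in [0, 1] into [0, 1]^n_dim; a larger
        # omega makes the curve denser (smaller alpha) but more oscillatory.
        return np.array([0.5 * (1.0 - np.cos(omega**k * np.pi * t))
                         for k in range(n_dim)])

    # Usage: compose a 2-variable test function with the densification curve so
    # that only one variable remains, then run the coordinate search on it.
    if __name__ == "__main__":
        rosenbrock = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
        g = lambda t: rosenbrock(alpha_dense_curve(t[0], n_dim=2))
        t_best, g_best = cyclic_coordinate_search(
            g, lower=[0.0], upper=[1.0], x0=[0.5], n_points=2000)
        print("point on the curve:", alpha_dense_curve(t_best[0], 2),
              "value:", g_best)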
    global optimization
    densification curves
    derivative-free methods
    variable reduction
    comparison of methods
    algorithm
    performance
    simulated annealing
    genetic algorithms
    numerical experiments
    convergence

    Identifiers

    DOI: 10.1016/j.cam.2005.07.023
    OpenAlex ID: W1990537004
    Wikidata QID: Q57702424