On the limited memory BFGS method for large scale optimization (Q911463)

From MaRDI portal
Cited works:
- A combined conjugate-gradient quasi-Newton minimization algorithm
- QN-like variable storage conjugate gradients
- Algorithm 630
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Q3882253
- Some numerical experiments with variable-storage quasi-Newton algorithms
- Partitioned variable metric updates for large structured optimization problems
- Local convergence analysis for partitioned quasi-Newton updates
- Q3313207
- Testing Unconstrained Optimization Software
- Preconditioning of Truncated-Newton Methods
- A Relationship between the BFGS and Conjugate Gradient Algorithms and Its Implications for New Algorithms
- Updating Quasi-Newton Matrices with Limited Storage
- A discrete Newton algorithm for minimizing a function of many variables
- Variable metric methods of minimisation
- Q4107408
- Restart procedures for the conjugate gradient method
- On the Convergence of a New Conjugate Gradient Algorithm
- Conjugate Gradient Methods with Inexact Searches
- Matrix conditioning and nonlinear optimization
- The Conjugate Gradient Method and Trust Regions in Large Scale Optimization
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Q3914337
- Q3762082


scientific article

Language: English
Label: On the limited memory BFGS method for large scale optimization

    Statements

    On the limited memory BFGS method for large scale optimization (English)
    1989
    The authors investigate the numerical performance of several optimization algorithms for solving smooth, unconstrained and, in particular, large-scale problems. They compare their own code, based on limited-memory BFGS updates, with the method of \textit{A. Buckley} and \textit{A. LeNir} [ACM Trans. Math. Software 11, 103-119 (1985; Zbl 0562.65043)], which requires additional conjugate gradient steps, with two conjugate gradient methods, and with the partitioned quasi-Newton method of \textit{A. Griewank} and \textit{Ph. L. Toint} [in: Nonlinear optimization, NATO Conf. Ser., Ser. II, 301-312 (1982; Zbl 0563.90085); Math. Program. 28, 25-49 (1984; Zbl 0561.65045); and Numer. Math. 39, 429-448 (1982; Zbl 0505.65018)]. The results are based on 16 test problems whose dimension varies between 50 and 1000. The numerical tests lead to the conclusion that the authors' method is faster than the method of Buckley and LeNir with respect to both the number of function evaluations and the execution time. The method also outperforms the conjugate gradient methods and is competitive with the partitioned quasi-Newton method on dense problems, but inferior on partitioned problems. Moreover, scaling effects are evaluated and a convergence analysis for convex problems is presented.
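The limited-memory BFGS updates discussed above are commonly implemented via the two-loop recursion of the cited work "Updating Quasi-Newton Matrices with Limited Storage": the search direction -H_k g_k is computed from the m most recent correction pairs (s_i, y_i) without ever forming H_k. A minimal NumPy sketch, assuming each stored pair satisfies the curvature condition y_i^T s_i > 0 (the function name and list-based storage are illustrative, not the authors' code):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: return -H_k @ grad from the stored
    correction pairs, with s_list/y_list ordered oldest to newest."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk the pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)
    # Initial Hessian approximation H0 = gamma * I, with the usual
    # scaling gamma = s^T y / y^T y from the most recent pair.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = np.dot(s, y) / np.dot(y, y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: walk the pairs from oldest to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r
```

With no stored pairs this reduces to steepest descent, which is why the cost per iteration stays at O(mn) — the property that makes the method attractive for the large-scale problems tested here.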
    Keywords: large scale optimization; computational study; comparison of algorithms; smooth unconstrained problems; numerical performance; conjugate gradient methods; partitioned quasi-Newton method; scaling effects; convergence analysis