Backtracking gradient descent method and some applications in large scale optimisation. II: Algorithms and experiments (Q2234294)

From MaRDI portal
Property / describes a project that uses: MobileNetV2
Property / describes a project that uses: MNIST
Property / describes a project that uses: GitHub
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/s00245-020-09718-8
Property / OpenAlex ID: W3083378728
Property / cites work: Convergence of the Iterates of Descent Methods for Analytic Cost Functions
Property / cites work: Minimization of functions having Lipschitz continuous first partial derivatives
Property / cites work: Limit Points of Sequences in Metric Spaces
Property / cites work: Analysis of the gradient method with an Armijo–Wolfe line search on a class of non-smooth convex functions
Property / cites work: Q3151174
Property / cites work: Gradient Convergence in Gradient Methods with Errors
Property / cites work: Optimization Methods for Large-Scale Machine Learning
Property / cites work: Q4821526
Property / cites work: Gradient methods of maximization
Property / cites work: The method of steepest descent for non-linear minimization problems
Property / cites work: Cauchy's method of minimization
Property / cites work: Optimization and dynamical systems
Property / cites work: Optimization
Property / cites work: Probabilistic Line Searches for Stochastic Optimization
Property / cites work: Introductory lectures on convex optimization. A basic course.
Property / cites work: Numerical Optimization
Property / cites work: Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
Property / cites work: A Stochastic Approximation Method
Property / cites work: Q2934059
Property / cites work: Convergence Conditions for Ascent Methods


scientific article

Language: English
Label: Backtracking gradient descent method and some applications in large scale optimisation. II: Algorithms and experiments
Description: scientific article

    Statements

    Backtracking gradient descent method and some applications in large scale optimisation. II: Algorithms and experiments (English)
    19 October 2021
    Keywords:
    automation of learning rates
    backtracking
    deep neural networks
    random dynamical systems
    global convergence
    gradient descent
    image classification
    iterative optimisation
    large scale optimisation
    local minimum
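The keywords above point at the paper's central device: gradient descent in which the learning rate is chosen automatically by an Armijo-type backtracking rule rather than fixed in advance. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the function name backtracking_gd, the default parameters delta0, alpha, beta, and the stopping rule are hypothetical choices made only for this example.

import numpy as np

def backtracking_gd(f, grad_f, x0, delta0=1.0, alpha=1e-4, beta=0.5,
                    max_iter=1000, tol=1e-8):
    """Gradient descent with Armijo backtracking line search (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        delta = delta0
        fx = f(x)
        # Shrink the step until the Armijo sufficient-decrease condition
        #   f(x - delta*g) <= f(x) - alpha * delta * ||g||^2
        # holds; no Lipschitz constant needs to be known in advance.
        while f(x - delta * g) > fx - alpha * delta * np.dot(g, g) and delta > 1e-16:
            delta *= beta
        x = x - delta * g
    return x

# Example: minimise the quadratic f(x) = ||x||^2 / 2, whose minimiser is 0.
x_min = backtracking_gd(lambda x: 0.5 * float(x @ x), lambda x: x,
                        x0=np.array([3.0, -4.0]))
print(x_min)

In this toy example the iterates approach the minimiser regardless of the initial learning rate delta0, which is the kind of robustness to learning-rate choice that the keyword "automation of learning rates" refers to.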