Stopping rules for gradient methods for non-convex problems with additive noise in gradient (Q6051170)

From MaRDI portal
 
Cites work:

- Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation
- Smooth Optimization with Approximate Gradient
- First-order methods of smooth convex optimization with inexact oracle
- A stopping rule in iteration procedures for solving ill-posed problems
- Universal gradient methods for convex optimization problems
- Cubic regularization of Newton method and its global performance
- Q3968054
- New versions of Newton method: step-size choice, convergence domain and under-determined equations

Latest revision as of 22:46, 2 August 2024

scientific article; zbMATH DE number 7740098
Language: English
