Finite-sum smooth optimization with SARAH (Q2149950)

From MaRDI portal
Cites work

    Optimization Methods for Large-Scale Machine Learning
    Q5396673
    Introductory lectures on convex optimization. A basic course.
    New Convergence Aspects of Stochastic Gradient Algorithms
    Inexact SARAH algorithm for stochastic optimization
    Q4969178
    A Stochastic Approximation Method
    Minimizing finite sums with the stochastic average gradient
    Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization


Language: English
Label: Finite-sum smooth optimization with SARAH
Description: scientific article

    Statements

    Finite-sum smooth optimization with SARAH (English)
    Publication date: 27 June 2022
    Keywords: finite-sum; smooth; non-convex; convex; stochastic algorithm; variance reduction
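
    For orientation on the record's subject: SARAH (StochAstic Recursive grAdient algoritHm) is a variance-reduced stochastic method for finite-sum smooth objectives f(w) = (1/n) Σᵢ fᵢ(w), built around the recursive estimator v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1}, with each outer loop started from a full gradient. The sketch below is a minimal illustration of that estimator, not the algorithm exactly as analyzed in the paper; the function names (sarah, full_grad, grad_i), the step size, and the least-squares test problem are assumptions made for the example.

```python
# Minimal SARAH sketch, assuming a finite-sum objective
# f(w) = (1/n) * sum_i f_i(w) given via a full-gradient oracle and a
# per-component gradient oracle. Illustrative only, not the paper's
# exact variant.
import numpy as np

def sarah(w0, full_grad, grad_i, n, eta=0.05, inner_steps=100,
          outer_loops=10, seed=0):
    """SARAH: v_t = grad_{i_t}(w_t) - grad_{i_t}(w_{t-1}) + v_{t-1}."""
    rng = np.random.default_rng(seed)
    w_prev = w0.copy()
    for _ in range(outer_loops):
        v = full_grad(w_prev)          # v_0: exact gradient at the snapshot
        w = w_prev - eta * v           # first inner step uses the full gradient
        for _ in range(inner_steps - 1):
            i = rng.integers(n)        # sample one component uniformly
            v = grad_i(i, w) - grad_i(i, w_prev) + v  # recursive estimator
            w_prev, w = w, w - eta * v
        w_prev = w                     # restart the next outer loop from w
    return w

# Hypothetical usage on a least-squares finite sum:
# f_i(w) = 0.5 * (A[i] @ w - b[i])**2.
n, d = 200, 10
rng = np.random.default_rng(1)
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
w_hat = sarah(np.zeros(d),
              lambda w: A.T @ (A @ w - b) / n,       # full gradient
              lambda i, w: A[i] * (A[i] @ w - b[i]),  # component gradient
              n)
```

    Unlike SVRG, the inner iterates feed back into the estimator itself (v_t depends on v_{t-1} rather than on a fixed snapshot gradient), which is the recursion the "variance reduction" keyword refers to here.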

    Identifiers