Convergence Rate of Incremental Gradient and Incremental Newton Methods (Q5237308)
Property / arXiv ID: 1510.08562
Property / cites work: Adaptivity of averaged stochastic gradient descent to local strong convexity for logistic regression
Property / cites work: Incremental Least Squares Methods and the Extended Kalman Filter
Property / cites work: A New Class of Incremental Gradient Methods for Least Squares Problems
Property / cites work: Q3151174
Property / cites work: Q3452586
Property / cites work: Gradient Convergence in Gradient Methods with Errors
Property / cites work: A Convergent Incremental Gradient Method with a Constant Step Size
Property / cites work: On-line learning for very large data sets
Property / cites work: Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
Property / cites work: On a Stochastic Approximation Method
Property / cites work: A globally convergent incremental Newton method
Property / cites work: Why random reshuffling beats stochastic gradient descent
Property / cites work: On the Convergence Rate of Incremental Aggregated Gradient Algorithms
Property / cites work: Logarithmic regret algorithms for online convex optimization
Property / cites work: An Adaptive Associative Memory Principle
Property / cites work: The incremental Gauss-Newton algorithm with adaptive stepsize rule
Property / cites work: Q2752037
Property / cites work: Incremental Subgradient Methods for Nondifferentiable Optimization
Property / cites work: Distributed Subgradient Methods for Multi-Agent Optimization
Property / cites work: Robust Stochastic Approximation Approach to Stochastic Programming
Property / cites work: Introductory lectures on convex optimization. A basic course.
Property / cites work: Acceleration of Stochastic Approximation by Averaging
Property / cites work: A Collaborative Training Algorithm for Distributed Learning
Property / cites work: Parallel stochastic gradient algorithms for large-scale matrix completion
Property / cites work: EXTRA: An Exact First-Order Algorithm for Decentralized Consensus Optimization
Property / cites work: Incremental gradient algorithms with stepsizes bounded away from zero
Property / cites work: An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule
Property / cites work: Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
Property / OpenAlex ID: W2980820424

Language: English
Label: Convergence Rate of Incremental Gradient and Incremental Newton Methods
Description: scientific article; zbMATH DE number 7118630

    Statements

    Convergence Rate of Incremental Gradient and Incremental Newton Methods (English)
    17 October 2019
    convex optimization
    incremental algorithms
    first-order methods
    convergence rate