Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization (Q5962715)

From MaRDI portal
Property / describes a project that uses: Pegasos
Property / MaRDI profile type: MaRDI publication profile
Property / cites work: A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Property / cites work: Q3096171
Property / cites work: Q2880998
Property / cites work: 10.1162/15324430260185628
Property / cites work: Smooth Optimization with Approximate Gradient
Property / cites work: First-order methods of smooth convex optimization with inexact oracle
Property / cites work: Accelerated, Parallel, and Proximal Coordinate Descent
Property / cites work: Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
Property / cites work: Q2880897
Property / cites work: Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
Property / cites work: Smooth minimization of non-smooth functions
Property / cites work: Gradient methods for minimizing composite functions
Property / cites work: Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Property / cites work: Q5396661
Property / cites work: Understanding Machine Learning
Property / cites work: Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
Property / cites work: Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Property / cites work: Pegasos: primal estimated sub-gradient solver for SVM
Property / cites work: Trading Accuracy for Sparsity in Optimization Problems with Sparsity Constraints
Property / cites work: Q2896156
Property / cites work: On the dual formulation of regularized linear systems with convex risks

Language: English
Label: Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
Description: scientific article; zbMATH DE number 6544654

    Statements

    Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization (English)
    23 February 2016
    A minimization problem arising in machine learning is considered. The objective is a regularized loss function, obtained by adding a regularizer to a convex loss. A new accelerated variant of the stochastic dual coordinate ascent (SDCA) method is proposed, and a fast convergence rate is proven. This result yields improved runtime guarantees for key machine learning problems, including support vector machines (SVM), ridge regression, Lasso, and multiclass SVM. Experimental results corroborating the theoretical findings are included.
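
    As an illustration of the setting (a reviewer-added sketch, not the authors' code), the paper studies objectives of the form P(w) = (1/n) * sum_i phi_i(x_i^T w) + lambda * g(w), with each phi_i a convex loss and g a regularizer. The snippet below implements the plain, non-accelerated SDCA update for the ridge-regression special case (squared loss, L2 regularizer), where the per-coordinate dual maximization has a closed form; the accelerated method of the paper additionally wraps such inner updates in an outer extrapolation loop. The function name prox_sdca_ridge and all parameter values are illustrative assumptions.

import numpy as np

def prox_sdca_ridge(X, y, lam, epochs=50, seed=0):
    # Plain (non-accelerated) stochastic dual coordinate ascent for
    # ridge regression:
    #   min_w (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2
    # Maintains dual variables alpha and the primal iterate
    # w = X^T alpha / (lam * n).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = np.einsum("ij,ij->i", X, X)  # precomputed ||x_i||^2
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Closed-form maximizer of the dual objective over
            # coordinate i (squared loss gives a scalar quadratic).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]
    return w

# Tiny usage example on synthetic data (hypothetical values throughout).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true + 0.1 * rng.standard_normal(200)
print("recovered weights:", np.round(prox_sdca_ridge(X, y, lam=0.1), 3))

    Each update touches a single training example, so the per-iteration cost is independent of the sample size n; this is the property that the paper's accelerated inner-outer scheme builds on.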
    stochastic optimization
    machine learning

    Identifiers

    OpenAlex ID: W2118545728
    arXiv ID: 1309.2375
    zbMATH DE number: 6544654