Sample size selection in optimization methods for machine learning (Q715253)

    Cites work

    An adaptive Monte Carlo algorithm for computing mixed logit estimators
    A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
    On the Goldstein-Levitin-Polyak gradient projection method
    On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
    A Globally Convergent Augmented Lagrangian Algorithm for Optimization with General Constraints and Simple Bounds
    Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
    Optimal Distributed Online Prediction using Mini-Batches
    Variable-number sample-path optimization
    De-noising by soft-thresholding
    Q2880998
    Q5517349
    A New Active Set Algorithm for Box Constrained Optimization
    The Sample Average Approximation Method for Stochastic Discrete Optimization
    Newton's Method for Large Bound-Constrained Optimization Problems
    Primal-dual subgradient methods for convex problems
    Acceleration of Stochastic Approximation by Averaging
    The conjugate gradient method in extremal problems
    A Stochastic Approximation Method
    A simulation-based approach to two-stage stochastic programming with recourse
    On the Rate of Convergence of Optimal Solutions of Monte Carlo Approximations of Stochastic Programs
    Convergence Analysis of Stochastic Algorithms
    Accelerated Block-coordinate Relaxation for Regularized Optimization
    Sparse Reconstruction by Separable Approximation
    Q2896156


Language: English
Label: Sample size selection in optimization methods for machine learning
Description: scientific article

    Statements

    Title: Sample size selection in optimization methods for machine learning (English)
    Publication date: 2 November 2012
    Keywords: sample size selection; batch-type optimization methods; machine learning; Newton-like methods
