On data preconditioning for regularized loss minimization
From MaRDI portal
Publication: 285940
DOI: 10.1007/s10994-015-5536-6 · zbMath: 1357.68190 · arXiv: 1408.3115 · MaRDI QID: Q285940
Rong Jin, Tianbao Yang, Shenghuo Zhu, Qihang Lin
Publication date: 19 May 2016
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1408.3115
68T05: Learning and adaptive systems in artificial intelligence
Cites Work
- Erratum to: "Minimizing finite sums with the stochastic average gradient"
- Pegasos: primal estimated sub-gradient solver for SVM
- "Preconditioning" for feature selection and regression in high-dimensional problems
- Introductory lectures on convex optimization. A basic course.
- Improved analysis of the subsampled randomized Hadamard transform
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Iterative Solution Methods
- Weighted SGD for ℓp Regression with Randomized Preconditioning
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Sparsity and incoherence in compressive sampling
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- The elements of statistical learning. Data mining, inference, and prediction
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm