Unregularized online learning algorithms with general loss functions

DOI: 10.1016/j.acha.2015.08.007
zbMath: 1382.68204
arXiv: 1503.00623
OpenAlex: W2964224116
Wikidata: Q58758915 (Scholia: Q58758915)
MaRDI QID: Q504379

Authors: Ding-Xuan Zhou, Yiming Ying

Publication date: 16 January 2017

Published in: Applied and Computational Harmonic Analysis

Full work available at URL: https://arxiv.org/abs/1503.00623



Related Items

Deep distributed convolutional neural networks: Universality
Distributed kernel gradient descent algorithm for minimum error entropy principle
Differentially private SGD with non-smooth losses
Theory of deep convolutional neural networks: downsampling
Convergence of online pairwise regression learning with quadratic loss
Convergence analysis for kernel-regularized online regression associated with an RRKHS
Federated learning for minimizing nonsmooth convex loss functions
High-probability generalization bounds for pointwise uniformly stable algorithms
Online Pairwise Learning Algorithms
Analysis of Online Composite Mirror Descent Algorithm
Online minimum error entropy algorithm with unbounded sampling
Kernel gradient descent algorithm for information theoretic learning
Generalization ability of online pairwise support vector machine
Convergence of online mirror descent
Online pairwise learning algorithms with convex loss functions
Online regularized pairwise learning with least squares loss
Fast and strong convergence of online learning algorithms
Deep neural networks for rotation-invariance approximation and learning
Analysis of regularized Nyström subsampling for regression functions of low smoothness
Analysis of singular value thresholding algorithm for matrix completion
Iterative gradient descent for outlier detection
Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
Error analysis of the moving least-squares regression learning algorithm with β-mixing and non-identical sampling



Cites Work