scientific article; zbMATH DE number 7415083
From MaRDI portal
Publication:5159408
Author: not listed
Publication date: 27 October 2021
Full work available at URL: https://arxiv.org/abs/1809.09910
Title: not available (zbMATH Open Web Interface contents unavailable due to conflicting licenses)
Cites Work
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- Approximation analysis of learning algorithms for support vector regression and quantile regression
- Quantile regression with \(\ell_1\)-regularization and Gaussian kernels
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Estimating conditional quantiles with the help of the pinball loss
- Learning with varying insensitive loss
- Theory of reproducing kernels and applications
- Regularization in kernel learning
- Concentration estimates for learning with unbounded sampling
- Deep metric learning using triplet network
- Linearized two-layers neural networks in high dimension
- Efficient regularized least-squares algorithms for conditional ranking on relational data
- Optimal rates for the regularized least-squares algorithm
- Learning rates of least-square regularized regression
- Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates
- Learning Theory
- Support Vector Machines
- Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
- A universal sampling method for reconstructing signals with simple Fourier transforms
- Learning Theory and Kernel Machines
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Indefinite Proximity Learning: A Review
- Theory of Reproducing Kernels
- Wide neural networks of any depth evolve as linear models under gradient descent