Optimal rates for coefficient-based regularized regression
DOI: 10.1016/j.acha.2017.11.005
zbMath: 1467.62062
OpenAlex: W2773342374
MaRDI QID: Q2330932
Publication date: 23 October 2019
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://doi.org/10.1016/j.acha.2017.11.005
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
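The paper concerns coefficient-based regularized regression, where the estimator f(x) = sum_i a_i K(x, x_i) is fit by penalizing the coefficient vector a directly rather than an RKHS norm, so the kernel K need not be positive semidefinite. Below is a minimal sketch of this scheme, assuming a least-squares loss with an l^2 coefficient penalty and a Gaussian kernel; the function names and data are illustrative, not taken from the paper.

import numpy as np

def gauss_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    d2 = np.square(A[:, None, :] - B[None, :, :]).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_coefficients(K, y, lam):
    """Minimize (1/n) * ||K a - y||^2 + lam * ||a||^2 over the coefficient
    vector a. The normal equations give (K^T K / n + lam I) a = K^T y / n.
    Because the penalty acts on a itself, K may be indefinite."""
    n = K.shape[0]
    A = K.T @ K / n + lam * np.eye(n)
    return np.linalg.solve(A, K.T @ y / n)

# Hypothetical usage: noisy sine data, estimator f(x) = sum_i a_i K(x, x_i).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(50)
a = fit_coefficients(gauss_kernel(X, X), y, lam=1e-2)
y_hat = gauss_kernel(X, X) @ a  # fitted values at the training points

The closed-form solve is only one choice; an l^1 penalty, as in several of the cited works, would instead require an iterative solver.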
Related Items
- Capacity dependent analysis for functional online learning algorithms
- Deep learning theory of distribution regression with CNNs
- Coefficient-based regularized distribution regression
- Online regularized learning algorithm for functional data
- Nyström subsampling method for coefficient-based regularized regression
- Analysis of regularized least-squares in reproducing kernel Kreĭn spaces
- Distributed learning with indefinite kernels
Cites Work
- Learning with coefficient-based regularization and \(\ell^1\)-penalty
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- \(L_{2}\) boosting in kernel regression
- Least square regression with indefinite kernels and coefficient regularization
- Families of alpha-, beta- and gamma-divergences: flexible and robust measures of similarities
- On regularization algorithms in learning theory
- Elastic-net regularization in learning theory
- Reproducing kernel Banach spaces with the \(\ell^1\) norm
- Weak convergence and empirical processes. With applications to statistics
- Regularization networks with indefinite kernels
- Learning theory estimates for coefficient-based regularized regression
- Concentration estimates for learning with unbounded sampling
- Optimal rates for the regularized least-squares algorithm
- Learning with sample dependent hypothesis spaces
- Shannon sampling. II: Connections to learning theory
- Sous-espaces d'espaces vectoriels topologiques et noyaux associés. (Noyaux reproduisants.) [Subspaces of topological vector spaces and associated kernels (reproducing kernels)]
- Learning theory estimates via integral operators and their approximations
- Learning Theory
- Support Vector Machines
- Remarks on Inequalities for Large Deviation Probabilities
- Estimating the approximation error in learning theory
- Learning Rates of \(l^q\) Coefficient Regularization Learning with Gaussian Kernel
- Thresholded spectral algorithms for sparse approximations
- Learning theory of distributed spectral algorithms
- Indefinite Proximity Learning: A Review
- Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
- An Introduction to Matrix Concentration Inequalities
- Theory of Reproducing Kernels
- Scattered Data Approximation