On empirical eigenfunction-based ranking with \(\ell^1\) norm regularization
From MaRDI portal
Publication: 2256621
DOI: 10.1016/j.jat.2014.12.011
zbMath: 1320.68152
OpenAlex: W2015851546
MaRDI QID: Q2256621
Min Xu, Qin Fang, Shaofan Wang, Junbin Li
Publication date: 20 February 2015
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2014.12.011
General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Unnamed Item (6 unresolved entries)
- An empirical feature-based learning algorithm producing sparse approximations
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- The convergence rate of a regularized ranking algorithm
- Model selection for regularized least-squares algorithm in learning theory
- On regularization algorithms in learning theory
- Variation of discrete spectra
- The Hoffman-Wielandt inequality in infinite dimensions
- An approximation theory approach to learning with \(\ell^1\) regularization
- Computing the singular value decomposition with high relative accuracy
- Random matrix approximation of spectra of integral operators
- Learning with sample dependent hypothesis spaces
- Ranking and empirical minimization of \(U\)-statistics
- Learning theory estimates via integral operators and their approximations
- The variation of the spectrum of a normal matrix
- Indefinite kernel network with dependent sampling
- Learning Theory
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Statistical Analysis of Bayes Optimal Subset Ranking
- An Orthogonal High Relative Accuracy Algorithm for the Symmetric Eigenproblem
- Theory of Reproducing Kernels