Kernel conjugate gradient methods with random projections
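The record below indexes a paper on kernel conjugate gradient (CG) regression accelerated by random projections. As a rough illustration of the general idea only (not the paper's exact algorithm), the sketch below projects the n-dimensional kernel system onto a small random subspace via a Gaussian sketch matrix and solves the projected system with plain CG; all names (`rbf_kernel`, `sketched_kernel_cg`) and parameter choices (`m`, `gamma`) are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    # Standard CG for a symmetric positive (semi-)definite system A x = b.
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def sketched_kernel_cg(X, y, m=20, gamma=1.0, rng=None):
    # Illustrative sketch (not the paper's method): draw a Gaussian
    # sketch S in R^{n x m}, project the kernel system K alpha = y to
    # (S^T K S) beta = S^T y, solve by CG, and predict via k(., X) S beta.
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    S = rng.standard_normal((n, m)) / np.sqrt(m)
    K = rbf_kernel(X, X, gamma)
    A = S.T @ K @ S
    beta = conjugate_gradient(A, S.T @ y)
    return lambda Z: rbf_kernel(Z, X, gamma) @ (S @ beta)
```

The projected system is only m x m, so each CG iteration is cheap once `S.T @ K @ S` is formed; the sketch dimension m trades statistical accuracy for computation.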
Recommendations
- Convergences of regularized algorithms and stochastic gradient methods with random projections
- Convergence rates of kernel conjugate gradient for random design regression
- Conjugate gradients for kernel machines
- Randomized sketches for kernels: fast and optimal nonparametric regression
- Compact convex projections
Cites work
- scientific article; zbMATH DE number 936298 (no title available)
- 10.1162/15324430260185556
- A simple proof of the restricted isometry property for random matrices
- An introduction to matrix concentration inequalities
- An operator inequality
- Analysis of regularized Nyström subsampling for regression functions of low smoothness
- Convergence rates of kernel conjugate gradient for random design regression
- Convergences of regularized algorithms and stochastic gradient methods with random projections
- Cross-validation based adaptation for regularization operators in learning theory
- Decoding by Linear Programming
- Fast approximation of matrix coherence and statistical leverage
- Faster kernel ridge regression using sketching and preconditioning
- Kernel partial least squares for stationary data
- Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Learning Theory
- Learning theory estimates via integral operators and their approximations
- Methods of conjugate gradients for solving linear systems
- New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
- Norm Inequalities Equivalent to Heinz Inequality
- Nyström type subsampling analyzed as a regularized projection
- Optimal learning rates for kernel partial least squares
- Optimal rates for multi-pass stochastic gradient methods
- Optimal rates for regularization of statistical inverse learning problems
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- Optimal rates for the regularized least-squares algorithm
- Randomized sketches for kernels: fast and optimal nonparametric regression
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Remarks on Inequalities for Large Deviation Probabilities
- Revisiting the Nyström method for improved large-scale machine learning
- Support Vector Machines
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
Cited in (6)
- Randomized sketches for kernel CCA
- Sampling Gaussian distributions in Krylov spaces with conjugate gradients
- Using Kronecker products to construct mimetic gradients
- Data-independent random projections from the feature-map of the homogeneous polynomial kernel of degree two
- Kernel Conjugate Gradient Methods with Random Projections
- The Kernel Conjugate Gradient Algorithms
This page was built for publication: Kernel conjugate gradient methods with random projections
MaRDI item Q1979923