Variable Selection for Nonparametric Learning with Power Series Kernels
Publication:5214363
DOI: 10.1162/neco_a_01212
zbMath: 1435.62136
arXiv: 1806.00569
OpenAlex: W2805257271
Wikidata: Q91508578 (Scholia: Q91508578)
MaRDI QID: Q5214363
Kenta Kanamori, Mitsuaki Nishikimi, Takafumi Kanamori, Wataru Kumagai, Kota Matsui
Publication date: 7 February 2020
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1806.00569
Density estimation (62G07)
Ridge regression; shrinkage estimators (Lasso) (62J07)
Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Power series kernels
- Derivative reproducing properties for kernel methods in learning theory
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- Optimal rates for the regularized least-squares algorithm
- Density Ratio Estimation in Machine Learning
- Nonparametric sparsity and regularization
- Better Subset Regression Using the Nonnegative Garrote
- Support Vector Machines
- Asymptotic Statistics
- DOI: 10.1162/153244303322753616
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Flexible variable selection for recovering sparsity in nonadditive nonparametric models
- Density Estimation in Infinite Dimensional Exponential Families
- Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
- On the Non-Negative Garrotte Estimator