Sparsity in multiple kernel learning
DOI: 10.1214/10-AOS825 · zbMATH Open: 1204.62086 · arXiv: 1211.2998 · OpenAlex: W2058007550 · MaRDI QID: Q620564 · FDO: Q620564
Authors: Ming Yuan, Vladimir Koltchinskii
Publication date: 19 January 2011
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1211.2998
Keywords: reproducing kernel Hilbert spaces; sparsity; high dimensionality; oracle inequality; multiple kernel learning; restricted isometry
MSC classifications: Asymptotic properties of parametric estimators (62F12); Nonparametric regression and quantile regression (62G08); Ridge regression; shrinkage estimators (Lasso) (62J07); Applications of functional analysis in probability theory and statistics (46N30); Inequalities; stochastic orderings (60E15); Nonparametric inference (62G99)
Cites Work
- High-dimensional additive modeling
- Weak convergence and empirical processes. With applications to statistics
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Theory of Reproducing Kernels
- Title not available
- Introduction to nonparametric estimation
- New concentration inequalities in product spaces
- Component selection and smoothing in multivariate nonparametric regression
- A Bennett concentration inequality and its application to suprema of empirical processes
- Consistency of the group Lasso and multiple kernel learning
- The Dantzig selector and sparsity oracle inequalities
- Title not available
- Learning the kernel matrix with semidefinite programming
- Learning the kernel function via regularization
- Statistical performance of support vector machines
- Learning Bounds for Support Vector Machines with Learned Kernels
- Sparsity in penalized empirical risk minimization
- Sparse recovery in convex hulls via entropy penalization
Cited In (70)
- Locally adaptive sparse additive quantile regression model with TV penalty
- Estimates on learning rates for multi-penalty distribution regression
- Block-wise primal-dual algorithms for large-scale doubly penalized ANOVA modeling
- Guaranteed Functional Tensor Singular Value Decomposition
- Learning rates for partially linear support vector machine in high dimensions
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- DDAC-SpAM: A Distributed Algorithm for Fitting High-dimensional Sparse Additive Models with Feature Division and Decorrelation
- Asymptotically faster estimation of high-dimensional additive models using subspace learning
- Automatic component selection in additive modeling of French national electricity load forecasting
- Sparse additive support vector machines in bounded variation space
- Hierarchical Total Variations and Doubly Penalized ANOVA Modeling for Multivariate Nonparametric Regression
- Decentralized learning over a network with Nyström approximation using SGD
- Backfitting algorithms for total-variation and empirical-norm penalized additive modelling with high-dimensional data
- Sparse multiple kernel learning: minimax rates with random projection
- Randomized sketches for kernel CCA
- Multikernel regression with sparsity constraint
- Grouping strategies and thresholding for high dimensional linear models
- Randomized multi-scale kernels learning with sparsity constraint regularization for regression
- Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression
- A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model
- Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies
- On the convergence rate of \(l_{p}\)-norm multiple kernel learning
- Multiple Kernel Learning for Sparse Representation-Based Classification
- Metamodel construction for sensitivity analysis
- Minimax optimal estimation in partially linear additive models under high dimension
- Learning non-parametric basis independent models from point queries via low-rank methods
- Learning rates for classification with Gaussian kernels
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Optimal prediction for high-dimensional functional quantile regression in reproducing kernel Hilbert spaces
- Minimax-optimal nonparametric regression in high dimensions
- Sparse Approximation of a Kernel Mean
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- Title not available
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Statistical inference in compound functional models
- Extreme eigenvalues of nonlinear correlation matrices with applications to additive models
- PAC-Bayesian estimation and prediction in sparse additive models
- A unified penalized method for sparse additive quantile models: an RKHS approach
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Sparse kernel SVMs via cutting-plane training
- Variable selection in additive quantile regression using nonconcave penalty
- Doubly penalized estimation in additive regression with high-dimensional data
- Learning Theory of Multiple Kernel Learning
- Learning sparse conditional distribution: an efficient kernel-based approach
- Minimax and adaptive prediction for functional linear regression
- Regularizers for structured sparsity
- Sparse RKHS estimation via globally convex optimization and its application in LPV-IO identification
- Greedy Kernel Approximation for Sparse Surrogate Modeling
- Multiple spectral kernel learning and a Gaussian complexity computation
- Title not available
- Significant vector learning to construct sparse kernel regression models
- A semiparametric model for matrix regression
- Learning general sparse additive models from point queries in high dimensions
- Kernel meets sieve: post-regularization confidence bands for sparse additive model
- Variable sparsity kernel learning
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- Inference for high-dimensional varying-coefficient quantile regression
- Kernel Knockoffs Selection for Nonparametric Additive Models
- Regularizing double machine learning in partially linear endogenous models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Approximate nonparametric quantile regression in reproducing kernel Hilbert spaces via random projection
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- The two-sample problem for Poisson processes: adaptive tests with a nonasymptotic wild bootstrap approach
- Nonlinear Variable Selection via Deep Neural Networks
- Information based complexity for high dimensional sparse functions
- Nonparametric variable screening for multivariate additive models
- Optimal learning rates of \(l^p\)-type multiple kernel learning under general conditions
- Additive model selection
- Sparsity in penalized empirical risk minimization
- Kernel Ordinary Differential Equations