Minimax-optimal rates for sparse additive models over kernel classes via convex programming
From MaRDI portal
Abstract: Sparse additive models are families of \(d\)-variate functions that have the additive decomposition \(f^* = \sum_{j \in S} f_j^*\), where \(S\) is an unknown subset of cardinality \(s \le d\). In this paper, we consider the case where each univariate component function \(f_j^*\) lies in a reproducing kernel Hilbert space (RKHS), and analyze a method for estimating the unknown function \(f^*\) based on kernels combined with \(\ell_1\)-type convex regularization. Working within a high-dimensional framework that allows both the dimension \(d\) and sparsity \(s\) to increase with the sample size \(n\), we derive convergence rates (upper bounds) in the \(L^2(\mathbb{P})\) and \(L^2(\mathbb{P}_n)\) norms over the class of sparse additive models with each univariate function \(f_j^*\) in the unit ball of a univariate RKHS with bounded kernel function. We complement our upper bounds by deriving minimax lower bounds on the \(L^2(\mathbb{P})\) error, thereby showing the optimality of our method. Thus, we obtain optimal minimax rates for many interesting classes of sparse additive models, including polynomials, splines, and Sobolev classes. We also show that if, in contrast to our univariate conditions, the multivariate function class is assumed to be globally bounded, then much faster estimation rates are possible for any sparsity \(s = \Omega(\sqrt{n})\), showing that global boundedness is a significant restriction in the high-dimensional setting.
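The estimator described in the abstract combines per-coordinate kernel fits with an \(\ell_1\)-type (group) penalty on the empirical norms of the components, which zeroes out inactive coordinates. The following toy sketch illustrates that structure with kernel-ridge backfitting plus a group soft-threshold on each component's empirical norm; it is a heuristic illustration under assumed bandwidth and penalty values, not the paper's exact convex program.

```python
import numpy as np

def gauss_kernel(x, bw=0.5):
    # Gram matrix of a univariate Gaussian kernel for one coordinate.
    diff = x[:, None] - x[None, :]
    return np.exp(-(diff ** 2) / (2 * bw ** 2))

def sparse_additive_fit(X, y, lam=0.1, mu=1e-2, n_iter=50):
    """Backfitting sketch: kernel-ridge update per block, followed by a
    group soft-threshold on the component's empirical L2 norm (the
    l1-type shrinkage that selects active coordinates)."""
    n, d = X.shape
    K = [gauss_kernel(X[:, j]) for j in range(d)]
    F = np.zeros((n, d))  # fitted component values f_j(x_i)
    for _ in range(n_iter):
        for j in range(d):
            r = y - F.sum(axis=1) + F[:, j]  # partial residual for block j
            alpha = np.linalg.solve(K[j] + n * mu * np.eye(n), r)
            fj = K[j] @ alpha
            norm = np.sqrt(np.mean(fj ** 2))  # empirical norm ||f_j||_n
            shrink = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
            F[:, j] = shrink * fj  # group soft-thresholding
    return F

# Synthetic example: only coordinates 0 and 1 carry signal.
rng = np.random.default_rng(0)
n, d = 120, 6
X = rng.uniform(-1, 1, size=(n, d))
y = (np.sin(np.pi * X[:, 0])
     + X[:, 1] ** 2 - np.mean(X[:, 1] ** 2)
     + 0.1 * rng.standard_normal(n))
F = sparse_additive_fit(X, y)
norms = np.sqrt(np.mean(F ** 2, axis=0))
print(np.round(norms, 3))  # active components should dominate
```

The threshold `lam` plays the role of the \(\ell_1\)-type regularization parameter: components whose fitted empirical norm falls below it are set exactly to zero, which is how the method adapts to unknown sparsity.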
Recommendations
- Minimax optimal rates of estimation in high dimensional additive models
- Sparse additive models
- Minimax optimal estimation in partially linear additive models under high dimension
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Doubly penalized estimation in additive regression with high-dimensional data
Cited in (75)
- Statistical inference in sparse high-dimensional additive models
- Randomized sketches for kernel CCA
- Locally adaptive sparse additive quantile regression model with TV penalty
- A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model
- Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies
- Block-wise primal-dual algorithms for large-scale doubly penalized ANOVA modeling
- Guaranteed Functional Tensor Singular Value Decomposition
- Empirical Bayes oracle uncertainty quantification for regression
- The recovery of ridge functions on the hypercube suffers from the curse of dimensionality
- Minimax optimal estimation in partially linear additive models under high dimension
- Metamodel construction for sensitivity analysis
- Latent Network Structure Learning From High-Dimensional Multivariate Point Processes
- Large Scale Prediction with Decision Trees
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- On nonparametric randomized sketches for kernels with further smoothness
- Optimal prediction for high-dimensional functional quantile regression in reproducing kernel Hilbert spaces
- Algorithms for learning sparse additive models with interactions in high dimensions
- Minimax-optimal nonparametric regression in high dimensions
- Sparse additive regression on a regular lattice
- Distributed Bayesian inference in massive spatial data
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- Adaptive variable selection in nonparametric sparse regression
- Detection of sparse additive functions
- Statistical inference in compound functional models
- Extreme eigenvalues of nonlinear correlation matrices with applications to additive models
- PAC-Bayesian estimation and prediction in sparse additive models
- A unified penalized method for sparse additive quantile models: an RKHS approach
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Approximation properties of certain operator-induced norms on Hilbert spaces
- Doubly penalized estimation in additive regression with high-dimensional data
- Learning theory of multiple kernel learning
- Efficient functional Lasso kernel smoothing for high-dimensional additive regression
- Nonparametric distributed learning under general designs
- DDAC-SpAM: A Distributed Algorithm for Fitting High-dimensional Sparse Additive Models with Feature Division and Decorrelation
- Sensitivity Analysis via the Proportion of Unmeasured Confounding
- Entropy and sampling numbers of classes of ridge functions
- Sparse high-dimensional varying coefficient model: nonasymptotic minimax study
- Variable selection consistency of Gaussian process regression
- Asymptotically faster estimation of high-dimensional additive models using subspace learning
- Penalized kernel quantile regression for varying coefficient models
- The Lasso for high dimensional regression with a possible change point
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Learning general sparse additive models from point queries in high dimensions
- A semiparametric model for matrix regression
- Pairwise learning problems with regularization networks and Nyström subsampling approach
- Optimal and Safe Estimation for High-Dimensional Semi-Supervised Learning
- Optimal policy evaluation using kernel-based temporal difference methods
- Automatic component selection in additive modeling of French national electricity load forecasting
- scientific article; zbMATH DE number 7625155
- Regression in Tensor Product Spaces by the Method of Sieves
- Kernel meets sieve: post-regularization confidence bands for sparse additive model
- Zeroth-order nonconvex stochastic optimization: handling constraints, high dimensionality, and saddle points
- Kernel Knockoffs Selection for Nonparametric Additive Models
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- Error analysis for coefficient-based regularized regression in additive models
- Approximate nonparametric quantile regression in reproducing kernel Hilbert spaces via random projection
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Information based complexity for high dimensional sparse functions
- Discovering model structure for partially linear models
- Nonlinear Variable Selection via Deep Neural Networks
- Sparse additive support vector machines in bounded variation space
- Hierarchical Total Variations and Doubly Penalized ANOVA Modeling for Multivariate Nonparametric Regression
- Minimax optimal rates of estimation in high dimensional additive models
- Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
- Global Rate Optimality of Integral Curve Estimators in High Order Tensor Models
- Lower bounds on the noiseless worst-case complexity of efficient global optimization
- Backfitting algorithms for total-variation and empirical-norm penalized additive modelling with high-dimensional data
- Sparse multiple kernel learning: minimax rates with random projection
- Low-Rank Covariance Function Estimation for Multidimensional Functional Data
- Cross-validation for selecting a model selection procedure
- Stochastic continuum-armed bandits with additive models: minimax regrets and adaptive algorithm
- Bayesian Model Selection in Additive Partial Linear Models Via Locally Adaptive Splines
This page was built for publication: Minimax-optimal rates for sparse additive models over kernel classes via convex programming
MaRDI item Q5405123