Nonparametric distributed learning under general designs
DOI: 10.1214/20-EJS1733 · zbMATH Open: 1466.62364 · MaRDI QID: Q2199703
Meimei Liu, Guang Cheng, Zuofeng Shang
Publication date: 14 September 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1597975224
Recommendations
- An asymptotic analysis of distributed nonparametric methods
- Distributed kernel ridge regression with communications
- Distributed linear regression by averaging
- Distributed nonparametric function estimation: optimal rate of convergence and cost of adaptation
- On the optimality of averaging in distributed statistical learning
Classifications (MSC):
- Nonparametric regression and quantile regression (62G08)
- Nonparametric hypothesis testing (62G10)
- General nonlinear regression (62J02)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Minimax procedures in statistical decision theory (62C20)
- Response surface designs (62K20)
Cites Work
- High-dimensional additive modeling
- A central limit theorem for generalized quadratic forms
- Additive regression and other nonparametric models
- Generalized likelihood ratio statistics and Wilks phenomenon
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- On the mathematical foundations of learning
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Minimax optimal rates of estimation in high dimensional additive models
- Maximum penalized likelihood estimation. Volume II: Regression
- Local Rademacher complexities
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Asymptotically minimax hypothesis testing for nonparametric alternatives. I
- Title not available
- The covering number in learning theory
- Local and global asymptotic inference in smoothing spline models
- Divide and conquer kernel ridge regression: a distributed algorithm with minimax optimal rates
- Understanding Gaussian Process Regression Using the Equivalent Kernel
- Mercer’s Theorem, Feature Maps, and Smoothing
- Randomized sketches for kernels: fast and optimal nonparametric regression
- Title not available
- Bayesian aggregation of average data: an application in drug development
- Computational Limits of A Distributed Algorithm For Smoothing Spline
- Title not available
- Nonparametric Bayesian Aggregation for Massive Data
- Distributed statistical estimation and rates of convergence in normal approximation
- The Local Geometry of Testing in Ellipses: Tight Control via Localized Kolmogorov Widths
- An asymptotic analysis of distributed nonparametric methods
Cited In (3)