Universal sieve-based strategies for efficient estimation using machine learning tools
From MaRDI portal
Publication:1983607
DOI: 10.3150/20-BEJ1309
zbMath: 1476.62067
arXiv: 2003.01856
MaRDI QID: Q1983607
Hongxiang Qiu, Marco Carone, Alex Luedtke
Publication date: 10 September 2021
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/2003.01856
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Nonparametric estimation (62G05)
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of sieve methods (11N36)
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Nearly unbiased variable selection under minimax concave penalty
- Contributions to a general asymptotic statistical theory. With the assistance of W. Wefelmeyer
- Convergence rates and asymptotic normality for series estimators
- On methods of sieves and penalization
- Nonparametric estimators which can be "plugged-in"
- Targeted learning in data science. Causal inference for complex longitudinal studies
- Unified methods for censored longitudinal data and causality
- Inefficient estimators of the bivariate survival function for three models
- Weak convergence and empirical processes. With applications to statistics
- Investigating Smooth Multiple Regression by the Method of Average Derivatives
- Double/debiased machine learning for treatment and structural parameters
- Multidimensional Variation for Quasi-Monte Carlo
- Asymptotic inference of causal effects with observational studies trimmed by the estimated propensity scores
- Twicing Kernels and a Small Bias Property of Semiparametric Estimators
- The bootstrap and Edgeworth expansion
- Stochastic gradient boosting.
- Nonparametric variable importance assessment using machine learning techniques