Double machine learning with gradient boosting and its application to the Big N audit quality effect
DOI: 10.1016/J.JECONOM.2020.01.018
zbMATH Open: 1456.62320
OpenAlex: W2938609004
MaRDI QID: Q2305992
FDO: Q2305992
Authors: Jui-Chung Yang, Hui-Ching Chuang, Chung-Ming Kuan
Publication date: 20 March 2020
Published in: Journal of Econometrics
Full work available at URL: https://doi.org/10.1016/j.jeconom.2020.01.018
Recommendations
- Double machine learning-based programme evaluation under unconfoundedness
- Double-bagging: Combining classifiers by bootstrap aggregation
- Evaluating (weighted) dynamic treatment effects by double machine learning
- Double/debiased machine learning for difference-in-differences models
- Double/debiased machine learning for treatment and structural parameters
- 2-step gradient boosting approach to selectivity bias correction in tax audit: an application to the VAT gap in Italy
- Double/debiased machine learning for logistic partially linear model
- Using boosting to prune double-bagging ensembles
- Support vector machines, decision trees and neural networks for auditor selection
Keywords: double machine learning; gradient boosting; average treatment effect; audit quality; Big \(N\) effect; performance-matched discretionary accruals
MSC classifications:
- Computational methods for problems pertaining to statistics (62-08)
- Applications of statistics to economics (62P20)
- Statistical aspects of big data and data science (62R07)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
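As a minimal illustration of the method the keywords describe (not the authors' implementation), the following sketch estimates the treatment coefficient \(\theta\) in a partially linear model \(Y = \theta D + g(X) + \varepsilon\) via double/debiased machine learning: gradient boosting fits the nuisance regressions \(E[Y\mid X]\) and \(E[D\mid X]\), cross-fitting residualizes on held-out folds, and the orthogonal score reduces to regressing the outcome residuals on the treatment residuals. All data and parameter choices here are simulated and illustrative.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.normal(size=(n, p))
D = X[:, 0] + rng.normal(size=n)                     # treatment confounded by X
Y = 0.5 * D + np.sin(X[:, 0]) + rng.normal(size=n)   # true theta = 0.5

res_Y = np.empty(n)
res_D = np.empty(n)
for train, test in KFold(n_splits=2, shuffle=True, random_state=0).split(X):
    # Fit the nuisance regressions E[Y|X] and E[D|X] on one fold...
    mY = GradientBoostingRegressor().fit(X[train], Y[train])
    mD = GradientBoostingRegressor().fit(X[train], D[train])
    # ...and residualize on the held-out fold (cross-fitting).
    res_Y[test] = Y[test] - mY.predict(X[test])
    res_D[test] = D[test] - mD.predict(X[test])

# Neyman-orthogonal (partialling-out) estimate of theta.
theta_hat = res_D @ res_Y / (res_D @ res_D)
print(round(theta_hat, 2))
```

Cross-fitting is what lets a biased, slowly converging learner such as boosting be plugged in without contaminating the final \(\sqrt{n}\)-rate estimate of \(\theta\).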
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Least angle regression. (With discussion)
- Title not available
- Title not available
- Title not available
- Random forests
- On the conditions used to prove oracle results for the Lasso
- Mostly harmless econometrics. An empiricist's companion.
- Double/debiased machine learning for treatment and structural parameters
- Title not available
- Title not available
- Additive regression and other nonparametric models
- Least squares after model selection in high-dimensional sparse models
- Inference on treatment effects after selection among high-dimensional controls
- Program evaluation and causal inference with high-dimensional data
- Approximate residual balancing: debiased inference of average treatment effects in high dimensions
- Boosting with early stopping: convergence and consistency
- Analysis of a random forests model
- On the Bayes-risk consistency of regularized boosting methods.
- Intervention Analysis with Applications to Economic and Environmental Problems
- Simple least squares estimator for treatment effects using propensity score residuals
- Computer age statistical inference. Algorithms, evidence, and data science
Cited In (2)