On bagging and nonlinear estimation

Publication:866611

DOI: 10.1016/j.jspi.2006.06.002
zbMath: 1104.62047
OpenAlex: W2068852101
MaRDI QID: Q866611

Jerome H. Friedman, Peter Hall

Publication date: 14 February 2007

Published in: Journal of Statistical Planning and Inference

Full work available at URL: http://hdl.handle.net/1885/21426

Related Items

Comparing boosting and bagging for decision trees of rankings
Properties of Bagged Nearest Neighbour Classifiers
Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
Robustify Financial Time Series Forecasting with Bagging
Subsampling based variable selection for generalized linear models
Unnamed Item
Large Scale Prediction with Decision Trees
Unnamed Item
Bootstrap confidence interval for a correlation curve
Augmenting the bootstrap to analyze high dimensional genomic data
Cross-validated bagged learning
Predictive learning via rule ensembles
Aggregating classifiers with ordinal response structure
Applications of hyperellipsoidal prediction regions
Looking for lumps: boosting and bagging for density estimation.
Improving nonparametric regression methods by bagging and boosting.
Out-of-bag estimation of the optimal sample size in bagging
Multi-Resolution Functional ANOVA for Large-Scale, Many-Input Computer Experiments
Exact bootstrap \(k\)-nearest neighbor learners
Estimating a parameter when it is known that the parameter exceeds a given value
Bootstrapping multiple linear regression after variable selection
Inferences from Cross-Sectional, Stochastic Frontier Models
Computational efficiency of bagging bootstrap bandwidth selection for density estimation with big data
Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
Analyzing bagging
Inference for Optimal Dynamic Treatment Regimes Using an Adaptive m-Out-of-n Bootstrap Scheme
Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory

Cites Work