The following pages link to (Q5386442):
Displaying 34 items.
- Nearly unbiased variable selection under minimax concave penalty (Q117379) (← links)
- Derivation of an artificial gene to improve classification accuracy upon gene selection (Q441727) (← links)
- Boosting algorithms: regularization, prediction and model fitting (Q449780) (← links)
- A boosting approach for supervised Mahalanobis distance metric learning (Q645912) (← links)
- Improved customer choice predictions using ensemble methods (Q872292) (← links)
- Deep learning of support vector machines with class probability output networks (Q890735) (← links)
- Multivariate spline analysis for multiplicative models: estimation, testing and application to climate change (Q901277) (← links)
- Selection-fusion approach for classification of datasets with missing values (Q962827) (← links)
- Sharpening Occam's razor (Q1007542) (← links)
- Parallelizing AdaBoost by weights dynamics (Q1019879) (← links)
- A concrete statistical realization of Kleinberg's stochastic discrimination for pattern recognition. I: Two-class classification (Q1431432) (← links)
- Boosting imbalanced data learning with Wiener process oversampling (Q1712569) (← links)
- Effective DNA binding protein prediction by using key features via Chou's general PseAAC (Q1716796) (← links)
- Student and school performance across countries: a machine learning approach (Q1749516) (← links)
- On the Bayes-risk consistency of regularized boosting methods. (Q1884602) (← links)
- On filtering by means of generalized integral images: a review and applications (Q1937808) (← links)
- On the equivalence of weak learnability and linear separability: new relaxations and efficient boosting algorithms (Q1959594) (← links)
- Analysis of a two-layer neural network via displacement convexity (Q1996787) (← links)
- Implicit consensus clustering from multiple graphs (Q2066641) (← links)
- RADE: resource-efficient supervised anomaly detection using decision tree-based ensemble methods (Q2071508) (← links)
- Boosting for quantum weak learners (Q2107988) (← links)
- Highly accurate machine learning model for kinetic energy density functional (Q2235652) (← links)
- A novel multi-view learning developed from single-view patterns (Q2275990) (← links)
- Wombit: a portfolio bit-vector solver using word-level propagation (Q2323450) (← links)
- Boosting as a kernel-based method (Q2331677) (← links)
- Aggregation for Gaussian regression (Q2456016) (← links)
- Analysis of boosting algorithms using the smooth margin function (Q2473080) (← links)
- Using natural class hierarchies in multi-class visual classification (Q2495923) (← links)
- Boosting for high-dimensional linear models (Q2497175) (← links)
- Boosting the partial least square algorithm for regression modelling (Q3181328) (← links)
- An empirical study of boosted neural network for particle classification in high energy collisions (Q3435305) (← links)
- OR Practice–Data Analytics for Optimal Detection of Metastatic Prostate Cancer (Q5003717) (← links)
- Do prior information on performance of individual classifiers for fusion of probabilistic classifier outputs matter? (Q6147502) (← links)
- Data-driven state-of-charge prediction of a storage cell using ABC/GBRT, ABC/MLP and Lasso machine learning techniques (Q6175204) (← links)