Ensembling neural networks: Many could be better than all
Publication: 1605287
DOI: 10.1016/S0004-3702(02)00190-X
zbMath: 0995.68077
MaRDI QID: Q1605287
Zhi-Hua Zhou, Jianxin Wu, Wei Tang
Publication date: 15 July 2002
Published in: Artificial Intelligence
neural networks; genetic algorithm; machine learning; bagging; boosting; bias-variance decomposition; selective ensemble; neural network ensemble
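The keywords above point at the paper's central claim: averaging a well-chosen subset of trained ensemble members can generalize better than averaging all of them. The paper selects that subset with a genetic algorithm (the GASEN approach); the toy sketch below substitutes exhaustive search over a tiny ensemble to illustrate the effect. The member definitions and validation setup are illustrative assumptions, not the authors' experimental protocol.

```python
# Toy illustration of "many could be better than all": the simple average
# of a selected subset of members can have lower validation error than
# the average of the full ensemble. Exhaustive subset search stands in
# for the paper's genetic-algorithm selection (GASEN); all members here
# are synthetic, illustrative predictors.
from itertools import combinations
import random

random.seed(0)

# Synthetic 1-D regression target on a validation grid.
xs = [i / 20 for i in range(41)]
target = [x * x for x in xs]

def make_member(bias, noise):
    # A "trained" member: the true function plus member-specific bias/noise.
    return [x * x + bias + random.gauss(0.0, noise) for x in xs]

# Three accurate members and two poor ones (large bias/variance).
members = [make_member(b, n) for b, n in
           [(0.0, 0.05), (0.02, 0.05), (-0.01, 0.05), (0.4, 0.3), (-0.5, 0.4)]]

def mse_of_average(subset):
    """Validation MSE of the simple average of the chosen members."""
    preds = [sum(members[i][j] for i in subset) / len(subset)
             for j in range(len(xs))]
    return sum((p - t) ** 2 for p, t in zip(preds, target)) / len(xs)

full = tuple(range(len(members)))
best = min((s for r in range(1, len(members) + 1)
            for s in combinations(full, r)),
           key=mse_of_average)

print("full-ensemble MSE:", round(mse_of_average(full), 4))
print("best-subset MSE:  ", round(mse_of_average(best), 4), "members:", best)
```

Because the full ensemble is itself one of the candidate subsets, the selected subset is never worse on the validation data; with the two poor members present, it is typically a strict improvement, which is the phenomenon the paper formalizes via a bias-variance-style analysis.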
Related Items (55)
Federated personalized random forest for human activity recognition ⋮ Kernel matching pursuit classifier ensemble ⋮ A multi-kernel support tensor machine for classification with multitype multiway data and an application to cross-selling recommendations ⋮ Cancer classification using ensemble of neural networks with multiple significant gene subsets ⋮ Particle swarm optimization based selective ensemble of online sequential extreme learning machine ⋮ GNSS/low-cost MEMS-INS integration using variational Bayesian adaptive cubature Kalman smoother and ensemble regularized ELM ⋮ Interpreting deep learning models with marginal attribution by conditioning on quantiles ⋮ Neural network ensembles: Immune-inspired approaches to the diversity of components ⋮ Weighted classifier ensemble based on quadratic form ⋮ Sparse ensembles using weighted combination methods based on linear programming ⋮ Bayesian neural network priors for edge-preserving inversion ⋮ Rough subspace-based clustering ensemble for categorical data ⋮ Parallel orthogonal deep neural network ⋮ Corrigendum to ``Ensembling neural networks: many could be better than all'' ⋮ A Neural Approach to Improve the Lee-Carter Mortality Density Forecasts ⋮ Exploratory machine learning with unknown unknowns ⋮ Greedy optimization classifiers ensemble based on diversity ⋮ An empirical bias–variance analysis of DECORATE ensemble method at different training sample sizes ⋮ RandGA: injecting randomness into parallel genetic algorithm for variable selection ⋮ Learning with mitigating random consistency from the accuracy measure ⋮ Adaptive linear and normalized combination of radial basis function networks for function approximation and regression ⋮ Structural combination of seasonal exponential smoothing forecasts applied to load forecasting ⋮ On a method for constructing ensembles of regression models ⋮ A three-way selective ensemble model for multi-label classification ⋮ Neural network ensembles: evaluation of aggregation algorithms ⋮ A dynamic overproduce-and-choose strategy for the selection of classifier ensembles ⋮ An efficient and robust adaptive sampling method for polynomial chaos expansion in sparse Bayesian learning framework ⋮ Learning similarity with cosine similarity ensemble ⋮ A deep learning semiparametric regression for adjusting complex confounding structures ⋮ Collective-agreement-based pruning of ensembles ⋮ Using boosting to prune double-bagging ensembles ⋮ Use of genetic algorithm to design optimal neural network structure ⋮ Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography ⋮ Network traffic classification based on ensemble learning and co-training ⋮ Algorithm of designing compound recognition system on the basis of combining classifiers with simultaneous splitting feature space into competence areas ⋮ Face recognition from a single image per person: a survey ⋮ A new genetic feature selection with neural network ensemble ⋮ A probabilistic model of classifier competence for dynamic ensemble selection ⋮ A hybrid learning-based model for on-line monitoring and diagnosis of out-of-control signals in multivariate manufacturing processes ⋮ Using ensemble and metaheuristics learning principles with artificial neural networks to improve due date prediction performance ⋮ Improving regression predictions using individual point reliability estimates based on critical error scenarios ⋮ EROS: Ensemble rough subspaces ⋮ A novel margin-based measure for directed hill climbing ensemble pruning ⋮ Semi-supervised learning using ensembles of multiple 1D-embedding-based label boosting ⋮ Selective ensemble of SVDDs with Renyi entropy based diversity measure ⋮ Modeling resonant frequency of microstrip antenna based on neural network ensemble ⋮ Ensemble classification based on generalized additive models ⋮ Ensemble component selection for improving ICA based microarray data prediction models ⋮ Pruning variable selection ensembles ⋮ Adaboost-based ensemble of polynomial chaos expansion with adaptive sampling ⋮ A nonparametric ensemble binary classifier and its statistical properties ⋮ Multiple graph regularized graph transduction via greedy gradient Max-Cut ⋮ Explainable online ensemble of deep neural network pruning for time series forecasting ⋮ Impulse response function identification of linear mechanical systems based on Kautz basis expansion with multiple poles ⋮ A two-stage exact algorithm for optimization of neural network ensemble