Improving importance estimation in pool-based batch active learning for approximate linear regression
Publication: 1942721
DOI: 10.1016/j.neunet.2012.09.002
zbMath: 1258.68117
OpenAlex: W1984169813
Wikidata: Q50781461
Scholia: Q50781461
MaRDI QID: Q1942721
Nozomi Kurihara, Masashi Sugiyama
Publication date: 13 March 2013
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2012.09.002
Keywords: inclusion probability; covariate shift; approximate linear regression; importance-weighted least-squares; P-ALICE; pool-based batch active learning
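The keywords refer to importance-weighted least-squares, the estimator that P-ALICE reweights (via approximate inclusion probabilities) to correct for covariate shift between the pool and the labeled sample. As a rough illustration only, not the paper's algorithm, the following minimal Python sketch fits a ridge-regularized importance-weighted least-squares model; the function name, the toy data, and the way the weights are formed are assumptions made for this example.

```python
import numpy as np

def importance_weighted_least_squares(X, y, weights, ridge=1e-3):
    """Solve argmin_theta sum_i w_i (y_i - x_i^T theta)^2 + ridge * ||theta||^2.

    The weights w_i play the role of importance weights, e.g. density ratios
    or inverse inclusion probabilities under covariate shift (illustrative).
    """
    W = np.diag(weights)
    A = X.T @ W @ X + ridge * np.eye(X.shape[1])
    b = X.T @ W @ y
    return np.linalg.solve(A, b)

# Toy usage: training inputs drawn from N(0, 1), while the (hypothetical)
# test inputs follow N(1, 1); the weights approximate the test/train density ratio.
rng = np.random.default_rng(0)
x_train = rng.normal(loc=0.0, scale=1.0, size=100)
y_train = 2.0 * x_train + rng.normal(scale=0.1, size=100)
X_design = np.column_stack([np.ones_like(x_train), x_train])  # intercept + slope
w = np.exp(-0.5 * ((x_train - 1.0) ** 2 - x_train ** 2))      # N(1,1)/N(0,1) density ratio
theta = importance_weighted_least_squares(X_design, y_train, w)
print(theta)  # roughly [0, 2]
```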
Cites Work
- Active learning algorithm using the maximum weighted log-likelihood estimator
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- Efficient exploration through active learning for value function approximation in reinforcement learning
- A batch ensemble approach to active learning with model selection
- Pool-based active learning in approximate linear regression
- Robust weights and designs for biased regression models: Least squares and generalized \(M\)-estimation
- An outline of the theory of sampling systems
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- On the Theory of Systematic Sampling, II
- A Generalization of Sampling Without Replacement From a Finite Universe
- On the Theory of Sampling from Finite Populations
- The elements of statistical learning. Data mining, inference, and prediction