Bandwidth choice for nonparametric classification
Publication: 1781162
DOI: 10.1214/009053604000000959
zbMath: 1064.62075
arXiv: math/0504511
OpenAlex: W2097671030
MaRDI QID: Q1781162
Publication date: 23 June 2005
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0504511
Mathematics Subject Classification:
- Density estimation (62G07)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Bayesian inference (62F15)
- Empirical decision procedures; empirical Bayes procedures (62C12)
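The record above only indexes the topic of the article. As a rough, hypothetical sketch (not the article's own procedure), the following Python snippet illustrates one common way of choosing a kernel bandwidth for a plug-in classification rule: leave-one-out cross-validation of the misclassification rate. The function names and the toy data are assumptions for illustration only.

```python
import numpy as np

def kde(sample, x, h):
    """Gaussian kernel density estimate at points x from a 1-d sample."""
    u = (x - sample[:, None]) / h
    return np.exp(-0.5 * u**2).sum(axis=0) / (len(sample) * h * np.sqrt(2 * np.pi))

def loo_error(x, y, h):
    """Leave-one-out misclassification rate of the plug-in rule at bandwidth h."""
    n, errors = len(x), 0
    for i in range(n):
        x_tr, y_tr = np.delete(x, i), np.delete(y, i)
        scores = []
        for c in (0, 1):
            xc = x_tr[y_tr == c]
            # empirical prior times estimated class-conditional density
            scores.append(len(xc) / (n - 1) * kde(xc, np.array([x[i]]), h)[0])
        errors += int(np.argmax(scores) != y[i])
    return errors / n

# hypothetical toy data: two univariate populations
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-1.0, 1.0, 100), rng.normal(1.0, 1.0, 100)])
y = np.concatenate([np.zeros(100, dtype=int), np.ones(100, dtype=int)])

# pick the bandwidth minimizing the cross-validated error over a grid
grid = np.linspace(0.05, 2.0, 40)
h_hat = grid[np.argmin([loo_error(x, y, h) for h in grid])]
print("cross-validated bandwidth:", h_hat)
```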
Related Items
- Nonparametric density estimation and bandwidth selection with B-spline bases: a novel Galerkin method
- An empirical classification procedure for nonparametric mixture models
- Optimal properties of centroid-based classifiers for very high-dimensional data
- Asymptotic normality of plug-in level set estimates
- Confidence regions for level sets
- Nonparametric density estimation with nonuniform B-spline bases
- Local nearest neighbour classification with applications to semi-supervised learning
- Bayesian multiscale smoothing in supervised and semi-supervised kernel discriminant analysis
- Unexpected properties of bandwidth choice when smoothing discrete data for constructing a functional data classifier
- Nonparametric estimation of surface integrals on level sets
- Kernel density classification for spherical data
- Classification Using Censored Functional Data
- Choice of neighbor order in nearest-neighbor classification
- Kernel classification with missing data and the choice of smoothing parameters
- On classification with nonignorable missing data
- Asymptotics and optimal bandwidth for nonparametric estimation of density level sets
- Outline analyses of the called strike zone in major league Baseball
- On histogram-based regression and classification with incomplete data
- Optimal weighted nearest neighbour classifiers
- A stable hyperparameter selection for the Gaussian RBF kernel for discrimination
- NONPARAMETRIC DENSITY ESTIMATION BY B-SPLINE DUALITY
- Bias reduction in kernel binary regression
- Adaptive smoothing in kernel discriminant analysis
Cites Work
- Local data-driven bandwidth choice for density estimation
- Analysis and optimization of Rosenblatt-Parzen classifier with the aid of asymptotic expansions
- Optimal rates of convergence to Bayes risk in nonparametric discrimination
- Large sample optimality of least squares cross-validation in density estimation
- An asymptotically optimal window selection rule for kernel density estimates
- Asymptotic estimate of probability of misclassification for discriminant rules based on density estimates
- A local cross-validation algorithm
- Asymptotic bounds for the expected \(L^1\) error of a multivariate kernel density estimator
- An unsupervised and nonparametric classification procedure based on mixtures with known weights
- Smooth discrimination analysis
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Consistency of data-driven histogram methods for density estimation and classification
- A general lower bound on the number of examples needed for learning
- A distribution-free theory of nonparametric regression
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- On weak base hypotheses and their implications for boosting regression and classification
- Discriminatory Analysis. Nonparametric Discrimination: Consistency Properties
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Nonparametric Kernel Regression Estimation-Optimal Choice of Bandwidth
- Bias of Nearest Neighbor Error Estimates
- Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
- Kernel classification rules from missing data
- On the posterior-probability estimate of the error rate of nonparametric classification rules
- Improvements on Cross-Validation: The .632+ Bootstrap Method
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- Minimax nonparametric classification. II. Model selection for adaptation
- Minimax nonparametric classification. I. Rates of convergence
- NONPARAMETRIC CLASSIFICATION ON TWO UNIVARIATE DISTRIBUTIONS
- On the finite sample performance of the nearest neighbor classifier
- Classification Error for a Very Large Number of Classes
- Univariate Two-Population Distribution-free Discrimination
- Random forests