Smooth discrimination analysis
DOI: 10.1214/aos/1017939240 · zbMATH Open: 0961.62058 · OpenAlex: W2049393399 · MaRDI QID: Q1583889 · FDO: Q1583889
Authors: Enno Mammen, Alexandre B. Tsybakov
Publication date: 5 June 2001
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1017939240
Recommendations
- Smoothed analysis. Motivation and discrete models
- Smoothing for Discrete Kernels in Discrimination
- Smoothed analysis of condition numbers
- Smoothed detrended fluctuation analysis
Mathematics Subject Classification:
- Nonparametric estimation (62G05)
- Asymptotic properties of nonparametric inference (62G20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Bayesian problems; characterization of Bayes procedures (62C10)
Cites Work
- Asymptotical minimax recovery of sets with smooth boundaries
- On nonparametric estimation of density level sets
- Risk bounds for model selection via penalization
- Measuring mass concentrations and estimating density contour clusters -- An excess mass approach
- Excess Mass Estimates and Tests for Multimodality
- Estimation of a Convex Density Contour in Two Dimensions
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Minimax theory of image reconstruction
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- Estimation des densités: risque minimax
- Rates of convergence for minimum contrast estimators
- Nonparametric regression under qualitative smoothness assumptions
- Approximation and estimation bounds for artificial neural networks
- Metric entropy of some classes of sets with differentiable boundaries
- The method of sieves and minimum contrast estimators
- Optimal rates of convergence to Bayes risk in nonparametric discrimination
- Monotone Discriminant Functions and Their Applications in Rheumatology
Cited In (only the first 100 items are shown)
- Rates of Convergence for a Bayesian Level Set Estimation
- Lower bounds for the rate of convergence in nonparametric pattern recognition
- 1-bit matrix completion: PAC-Bayesian analysis of a variational approximation
- Fast learning rates in statistical inference through aggregation
- Choice of neighbor order in nearest-neighbor classification
- Methods for estimation of convex sets
- Adaptive transfer learning
- The multi-armed bandit problem with covariates
- Model selection by resampling penalization
- A high-dimensional Wilks phenomenon
- Posterior contraction and credible regions for level sets
- Local nearest neighbour classification with applications to semi-supervised learning
- Oracle inequalities for cross-validation type procedures
- Square root penalty: Adaption to the margin in classification and in edge estimation
- Nonparametric estimation of surface integrals on level sets
- Risk bounds for CART classifiers under a margin condition
- Bandwidth choice for nonparametric classification
- Margin-adaptive model selection in statistical learning
- Convergence rates of generalization errors for margin-based classification
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Nonasymptotic bounds for vector quantization in Hilbert spaces
- Fast learning rates for plug-in classifiers
- Best subset binary prediction
- Estimating conditional quantiles with the help of the pinball loss
- Rates of convergence for robust geometric inference
- PAC-Bayesian high dimensional bipartite ranking
- Local Rademacher complexities
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- PAC-Bayesian bounds for randomized empirical risk minimizers
- Fast rates for support vector machines using Gaussian kernels
- Deviation inequalities for random polytopes in arbitrary convex bodies
- Adaptation to lowest density regions with application to support recovery
- Surrogate losses in passive and active learning
- Optimal aggregation of classifiers in statistical learning.
- Robust estimation of \(U\)-statistics
- Neyman-Pearson classification: parametrics and sample size requirement
- Bandwidth selection for kernel density estimators of multivariate level sets and highest density regions
- Large‐margin classification with multiple decision rules
- Randomized allocation with arm elimination in a bandit problem with covariates
- On the rates of convergence of simulation-based optimization algorithms for optimal stopping problems
- Robust empirical Bayes tests for continuous distributions
- Optimal rates of convergence to Bayes risk in nonparametric discrimination
- Rates of convergence in active learning
- Minimax semi-supervised set-valued approach to multi-class classification
- Inverse statistical learning
- Confidence regions for level sets
- Minimax fast rates for discriminant analysis with errors in variables
- Pricing Bermudan options by nonparametric regression: optimal rates of convergence for lower estimates
- Simultaneous adaptation to the margin and to complexity in classification
- Optimal rates for plug-in estimators of density level sets
- Risk bounds for statistical learning
- An adaptive multiclass nearest neighbor classifier
- Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule
- Optimal rates of aggregation in classification under low noise assumption
- A convex optimization approach to high-dimensional sparse quadratic discriminant analysis
- Woodroofe's one-armed bandit problem revisited
- Performance guarantees for individualized treatment rules
- Classification via local multi-resolution projections
- Variance-based regularization with convex objectives
- Optimal weighted nearest neighbour classifiers
- Generalized density clustering
- Regularization in statistics
- Theory of Classification: a Survey of Some Recent Advances
- Model selection by bootstrap penalization for classification
- Adaptive Algorithm for Multi-Armed Bandit Problem with High-Dimensional Covariates
- Convergence rates of deep ReLU networks for multiclass classification
- Learning the distribution of latent variables in paired comparison models with round-robin scheduling
- Convergence rates for empirical barycenters in metric spaces: curvature, convexity and extendable geodesics
- Optimal linear discriminators for the discrete choice model in growing dimensions
- Hold-out estimates of prediction models for Markov processes
- Optimal rates for nonparametric F-score binary classification via post-processing
- Debiased machine learning of set-identified linear models
- On regression and classification with possibly missing response variables in the data
- Deep neural network classifier for multidimensional functional data
- ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- Optimal classification for functional data
- Optimal third root asymptotic bounds in the statistical estimation of thresholds
- Intrinsic Dimension Adaptive Partitioning for Kernel Methods
- Nonparametric classification with missing data
- Rejoinder: New Objectives for Policy Learning
- Review on functional data classification
- A kernel-type regression estimator for NMAR response variables with applications to classification
- Structured kernel quantile regression
- Noisy discriminant analysis with boundary assumptions
- Transfer learning for contextual multi-armed bandits
- Optimal convergence rates of deep neural networks in a classification setting
- Robust statistical learning with Lipschitz and convex loss functions
- Multiclass classification for multidimensional functional data through deep neural networks
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- A reduced-rank approach to predicting multiple binary responses through machine learning
- Functional data analysis using deep neural networks
- Fast convergence rates of deep neural networks for classification