Rademacher penalties and structural risk minimization
Publication: 4544631
DOI: 10.1109/18.930926
zbMath: 1008.62614
OpenAlex: W2087258353
MaRDI QID: Q4544631
Publication date: 4 August 2002
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.930926
Classification (MSC): Classification and discrimination; cluster analysis (statistical aspects) (62H30) ⋅ Nonparametric estimation (62G05) ⋅ Production models (90B30)
Related Items (65)
Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins
Deep learning: a statistical viewpoint
Complexity regularization via localized random penalties
Tikhonov, Ivanov and Morozov regularization for support vector machine learning
Optimal aggregation of classifiers in statistical learning
Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
Some nonasymptotic results on resampling in high dimension. I: Confidence regions
The Loss Rank Criterion for Variable Selection in Linear Regression Analysis
Model selection by bootstrap penalization for classification
Classifiers of support vector machine type with \(\ell_1\) complexity regularization
The two-sample problem for Poisson processes: adaptive tests with a nonasymptotic wild bootstrap approach
Matrixized learning machine with modified pairwise constraints
Analysis of the generalization ability of a full decision tree
Minimax rates for conditional density estimation via empirical entropy
Unnamed Item
Learning bounds for quantum circuits in the agnostic setting
Adaptive estimation of a distribution function and its density in sup-norm loss by wavelet and spline projections
Robustness and generalization
Global uniform risk bounds for wavelet deconvolution estimators
From Gauss to Kolmogorov: localized measures of complexity for ellipses
Bootstrap model selection for possibly dependent and heterogeneous data
Statistical learning based on Markovian data: maximal deviation inequalities and learning rates
Honest confidence sets in nonparametric IV regression and other ill-posed models
A Hilbert Space Embedding for Distributions
Optimal model selection in heteroscedastic regression using piecewise polynomial functions
Model selection by resampling penalization
Penalized empirical risk minimization over Besov spaces
An improved analysis of the Rademacher data-dependent bound using its self bounding property
Vote counting measures for ensemble classifiers
On learning multicategory classification with sample queries
Combinatorial bounds of overfitting for threshold classifiers
Kernel methods in machine learning
On some distributions arising from a generalized trivariate reduction scheme
Simultaneous adaptation to the margin and to complexity in classification
The geometry of hypothesis testing over convex cones: generalized likelihood ratio tests and minimax radii
A statistician teaches deep learning
Unnamed Item
Reducing mechanism design to algorithm design via machine learning
Double-fold localized multiple matrixized learning machine
Structural multiple empirical kernel learning
Model selection with the loss rank principle
Estimation from nonlinear observations via convex programming with application to bilinear regression
Surrogate losses in passive and active learning
Approximation of frame based missing data recovery
Non-asymptotic quality assessment of generalised FIR models with periodic inputs
A survey of cross-validation procedures for model selection
Empirical minimization
Moment inequalities for functions of independent random variables
Unnamed Item
A novel multi-view learning developed from single-view patterns
Model selection in utility-maximizing binary prediction
Rademacher Chaos Complexities for Learning the Kernel Problem
Boosted ARTMAP: modifications to fuzzy ARTMAP motivated by boosting theory
A local Vapnik-Chervonenkis complexity
MREKLM: a fast multiple empirical kernel learning machine
Theory of Classification: a Survey of Some Recent Advances
A permutation approach to validation
Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas
Inference on covariance operators via concentration inequalities: \(k\)-sample tests, classification, and clustering via Rademacher complexities
Rademacher complexity in Neyman-Pearson classification
Square root penalty: Adaptation to the margin in classification and in edge estimation
Convolutional spectral kernel learning with generalization guarantees
Unnamed Item
Local Rademacher complexities
Compressive sensing and neural networks from a statistical learning perspective