Rademacher penalties and structural risk minimization

From MaRDI portal

Publication:4544631

DOI: 10.1109/18.930926
zbMath: 1008.62614
OpenAlex: W2087258353
MaRDI QID: Q4544631

Vladimir I. Koltchinskii

Publication date: 4 August 2002

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://doi.org/10.1109/18.930926
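Illustrative aside (not part of the bibliographic record): the paper's title refers to data-dependent Rademacher penalties for structural risk minimization. A minimal sketch, under assumed hypothetical data and a toy finite class of threshold classifiers, of Monte Carlo estimation of such a penalty:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample: n points drawn uniformly from (-1, 1).
n = 200
X = rng.uniform(-1, 1, n)

# A small finite class of threshold classifiers h_t(x) = sign(x - t).
thresholds = np.linspace(-1, 1, 21)
H = np.sign(X[None, :] - thresholds[:, None])  # shape (|class|, n)


def empirical_rademacher(H, n_draws=2000, rng=rng):
    """Monte Carlo estimate of E_sigma[ sup_h (1/n) sum_i sigma_i h(x_i) ]."""
    n = H.shape[1]
    sup_vals = np.empty(n_draws)
    for k in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        sup_vals[k] = np.max(H @ sigma) / n      # sup over the finite class
    return sup_vals.mean()


penalty = empirical_rademacher(H)
print(f"estimated empirical Rademacher complexity: {penalty:.3f}")
```

The estimate is computed from the sample alone, which is the point of a data-dependent penalty: it can be added to the empirical risk of each model class when selecting among nested classes.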




Related Items (65)

Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
Deep learning: a statistical viewpoint
Complexity regularization via localized random penalties
Tikhonov, Ivanov and Morozov regularization for support vector machine learning
Optimal aggregation of classifiers in statistical learning.
Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
Some nonasymptotic results on resampling in high dimension. I: Confidence regions
The Loss Rank Criterion for Variable Selection in Linear Regression Analysis
Model selection by bootstrap penalization for classification
Classifiers of support vector machine type with \(\ell_1\) complexity regularization
The two-sample problem for Poisson processes: adaptive tests with a nonasymptotic wild bootstrap approach
Matrixized learning machine with modified pairwise constraints
Analysis of the generalization ability of a full decision tree
Minimax rates for conditional density estimation via empirical entropy
Unnamed Item
Learning bounds for quantum circuits in the agnostic setting
Adaptive estimation of a distribution function and its density in sup-norm loss by wavelet and spline projections
Robustness and generalization
Global uniform risk bounds for wavelet deconvolution estimators
From Gauss to Kolmogorov: localized measures of complexity for ellipses
Bootstrap model selection for possibly dependent and heterogeneous data
Statistical learning based on Markovian data maximal deviation inequalities and learning rates
HONEST CONFIDENCE SETS IN NONPARAMETRIC IV REGRESSION AND OTHER ILL-POSED MODELS
A Hilbert Space Embedding for Distributions
Optimal model selection in heteroscedastic regression using piecewise polynomial functions
Model selection by resampling penalization
Penalized empirical risk minimization over Besov spaces
An improved analysis of the Rademacher data-dependent bound using its self bounding property
Vote counting measures for ensemble classifiers.
On learning multicategory classification with sample queries.
Combinatorial bounds of overfitting for threshold classifiers
Kernel methods in machine learning
On some distributions arising from a generalized trivariate reduction scheme
Simultaneous adaptation to the margin and to complexity in classification
The geometry of hypothesis testing over convex cones: generalized likelihood ratio tests and minimax radii
A statistician teaches deep learning
Unnamed Item
Reducing mechanism design to algorithm design via machine learning
Double-fold localized multiple matrixized learning machine
Structural multiple empirical kernel learning
Model selection with the loss rank principle
Estimation from nonlinear observations via convex programming with application to bilinear regression
Surrogate losses in passive and active learning
Approximation of frame based missing data recovery
Non-asymptotic quality assessment of generalised FIR models with periodic inputs
A survey of cross-validation procedures for model selection
Empirical minimization
Moment inequalities for functions of independent random variables
Unnamed Item
A novel multi-view learning developed from single-view patterns
Model selection in utility-maximizing binary prediction
Rademacher Chaos Complexities for Learning the Kernel Problem
Boosted ARTMAP: modifications to fuzzy ARTMAP motivated by boosting theory
A local Vapnik-Chervonenkis complexity
MREKLM: a fast multiple empirical kernel learning machine
Theory of Classification: a Survey of Some Recent Advances
A permutation approach to validation
Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas
Inference on covariance operators via concentration inequalities: \(k\)-sample tests, classification, and clustering via Rademacher complexities
Rademacher complexity in Neyman-Pearson classification
Square root penalty: Adaption to the margin in classification and in edge estimation
Convolutional spectral kernel learning with generalization guarantees
Unnamed Item
Local Rademacher complexities
Compressive sensing and neural networks from a statistical learning perspective






This page was built for publication: Rademacher penalties and structural risk minimization