Improving the sample complexity using global data
Publication: 4674555
DOI: 10.1109/TIT.2002.1013137 · zbMath: 1061.68128 · MaRDI QID: Q4674555
Publication date: 11 May 2005
Published in: IEEE Transactions on Information Theory
Mathematics Subject Classification: Sampling theory, sample surveys (62D05) · Learning and adaptive systems in artificial intelligence (68T05)
Related Items (21)
Deep learning: a statistical viewpoint
Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
Learning without concentration for general loss functions
Multi-kernel regularized classifiers
\(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
Unnamed Item
Classification with non-i.i.d. sampling
Learning Bounds for Kernel Regression Using Effective Data Dimensionality
Empirical minimization
SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
Fast rates for support vector machines using Gaussian kernels
Oracle inequalities for support vector machines that are based on random entropy numbers
Approximating \(L_p\) unit balls via random sampling
Measuring the Capacity of Sets of Functions in the Analysis of ERM
Direct importance estimation for covariate shift adaptation
Theory of Classification: a Survey of Some Recent Advances
On the Optimality of Sample-Based Estimates of the Expectation of the Empirical Minimizer
FAST RATES FOR ESTIMATION ERROR AND ORACLE INEQUALITIES FOR MODEL SELECTION
Fast generalization error bound of deep learning without scale invariance of activation functions
Local Rademacher complexities
Distribution-free robust linear regression