Gaining degrees of freedom in subsymbolic learning (Q5941074)

From MaRDI portal
Property / cites work: Q4737600
Property / cites work: PAC learning of concept classes through the boundaries of their items
Property / cites work: Learnability and the Vapnik-Chervonenkis dimension
Property / cites work: A general lower bound on the number of examples needed for learning
Property / cites work: Q5538132
Property / cites work: Tracking drifting concepts by minimizing disagreements
Property / cites work: Efficient noise-tolerant learning from statistical queries
Property / cites work: Q4755588
Property / cites work: Non-Parametric Estimation II. Statistically Equivalent Blocks and Tolerance Regions--The Continuous Case
Property / cites work: Nonparametric Estimation, III. Statistically Equivalent Blocks and Multivariate Tolerance Regions--The Discontinuous Case
Property / cites work: A theory of the learnable
Property / cites work: Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
Property / cites work: Q4856771
Property / cites work: Q4023358
Property / cites work: Some special Vapnik-Chervonenkis classes
Property / cites work: Q5561562
Property / full work available at URL: https://doi.org/10.1016/s0304-3975(99)00289-3
Property / OpenAlex ID: W2058481336


Language: English
Label: Gaining degrees of freedom in subsymbolic learning
Description: scientific article; zbMATH DE number 1635239

    Statements

    Gaining degrees of freedom in subsymbolic learning (English)
    20 August 2001
    We provide some theoretical results on the sample complexity of PAC learning when the hypotheses are computed by subsymbolic devices such as neural networks. In this framework we give new foundations to the notion of the degrees of freedom of a statistic and relate it to the complexity of a concept class. Thus, for a given concept class and a given sample size, we discuss the efficiency of subsymbolic learning algorithms in terms of the degrees of freedom of the computed statistic. In this setting we appraise the sample complexity overhead that comes from relying on approximate hypotheses, and we show an increase in the degrees of freedom yielded by embedding available formal knowledge into the algorithm. For a known sample distribution, these quantities are related to the learning approximation goal, and a special production prize is shown. Finally, we prove that testing the approximation capability of a neural network generally demands a smaller sample size than training it.
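    For orientation, the sample-complexity bounds the abstract refers to are of Vapnik-Chervonenkis type. A standard statement, taken from the cited works "Learnability and the Vapnik-Chervonenkis dimension" and "A general lower bound on the number of examples needed for learning" rather than from this paper's own refinement, reads:

    % (\epsilon,\delta)-PAC learning of a concept class of VC dimension d:
    % a learner outputting a consistent hypothesis achieves error at most
    % \epsilon with probability at least 1-\delta once the sample size m
    % satisfies
    \[
      m \;=\; O\!\left(\frac{d}{\epsilon}\,\log\frac{1}{\epsilon}
              \;+\; \frac{1}{\epsilon}\,\log\frac{1}{\delta}\right),
    \]
    % and, conversely, no algorithm can PAC learn the class from fewer than
    \[
      m \;=\; \Omega\!\left(\frac{d}{\epsilon}
              \;+\; \frac{1}{\epsilon}\,\log\frac{1}{\delta}\right)
    \]
    % examples. The paper relates its degrees-of-freedom notion to bounds
    % of this type when only approximate hypotheses are available.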
    Keywords:
    computational learning
    sentry functions
    nested concept classes
    approximate learning
    neural networks

    Identifiers