Local entropy in learning theory (Q2460502)

    Language: English
    Label: Local entropy in learning theory
    Description: scientific article

    Statements

    Local entropy in learning theory (English)
    12 November 2007
    Let \((S, \tau)\) be a metric space and let \(c > 1\) be a fixed constant. For \(\varepsilon > 0\) and \(A \subset S\), the local packing number is the quantity \[ {\overline P}_\varepsilon(c, A, S) := \sup \{ n : \exists\, x_1, \ldots, x_n \in A,\ \varepsilon \leq \tau(x_i, x_j) \leq c\,\varepsilon \text{ for all } i \neq j \}. \] Further, let \(X = {\mathbb R}^m\), \(Y = [-M, M]\), \(Z = X \times Y\), and suppose that \(\rho\) is a probability measure on \(Z\). The purpose of the paper is to approximate the regression function \(f_\rho(x) = \int_Y y\,d\rho(y\mid x)\), where \(\rho(y\mid x)\) is the conditional probability measure. It is assumed that a Borel probability measure \(\mu\) on \(X\) is fixed and that a set \(\Theta\) of admissible Borel functions is given. Under these conditions, the existence of an estimator \(f_z\), constructed from the sample \(z\), is proved for which a certain upper bound on the probability \(\rho^m\{z : \| f_z - f_\rho\|_{L_2(\mu)} > \eta\}\) holds for all \(\eta > 0\). The bound is expressed in terms of the local packing number \({\overline P}_\varepsilon(20, \Theta, L_2(\mu))\).
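    As an illustration of the quantity \({\overline P}_\varepsilon(c, A, S)\), the following Python sketch (not taken from the paper; the point set, the Euclidean metric, and the parameter values are assumptions made only for this example) greedily collects points whose pairwise distances all lie in \([\varepsilon, c\,\varepsilon]\). Since the definition asks for the largest such subset, which is a maximum-clique problem, the greedy count is only a lower bound on the local packing number.

    # Hypothetical sketch: a greedy lower bound on the local packing number
    # \overline{P}_eps(c, A, S) for a finite candidate set A in a metric space.
    # A subset whose pairwise distances all lie in [eps, c*eps] is a "local packing";
    # the true quantity is the size of the largest such subset, so the greedy
    # value returned below is only a lower bound, not the supremum itself.
    import numpy as np

    def local_packing_lower_bound(points, eps, c=20.0, metric=None):
        """Greedily build a subset whose pairwise distances lie in [eps, c*eps]."""
        if metric is None:
            metric = lambda u, v: np.linalg.norm(u - v)  # Euclidean distance by default
        chosen = []
        for x in points:
            # keep x only if it is neither too close to nor too far from all chosen points
            if all(eps <= metric(x, y) <= c * eps for y in chosen):
                chosen.append(x)
        return len(chosen)

    # Usage: random points in the plane; eps and c = 20 (the constant appearing
    # in the bound of the paper) are arbitrary choices for the demonstration.
    rng = np.random.default_rng(0)
    pts = rng.uniform(-1.0, 1.0, size=(200, 2))
    print(local_packing_lower_bound(list(pts), eps=0.1, c=20.0))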
    entropy
    learning
    accuracy confidence