Using the bootstrap to estimate mean squared error and select smoothing parameter in nonparametric problems (Q756327)

From MaRDI portal
Property / author: Peter Gavin Hall / rank
Normal rank

Property / reviewed by: Estate V. Khmaladze / rank
Normal rank

Property / MaRDI profile type: MaRDI publication profile / rank
Normal rank

Property / cites work: Distribution function inequalities for martingales / rank
Normal rank

Property / cites work: Kernel estimates of the tail index of a distribution / rank
Normal rank

Property / cites work: Q3670359 / rank
Normal rank

Property / cites work: Extent to which least-squares cross-validation minimises integrated square error in nonparametric density estimation / rank
Normal rank

Property / cites work: A simple general approach to inference about the tail of a distribution / rank
Normal rank

Property / cites work: Q4085018 / rank
Normal rank

Property / cites work: Q4727203 / rank
Normal rank


scientific article

Language: English
Label: Using the bootstrap to estimate mean squared error and select smoothing parameter in nonparametric problems
Description: scientific article

    Statements

    Using the bootstrap to estimate mean squared error and select smoothing parameter in nonparametric problems (English)
    1990
    The paper is concerned with the clever idea of using bootstrap samples of essentially smaller size \(n_1\) than the size \(n\) of the original sample. More precisely, the author considers a bootstrap version \(\hat f^*(\cdot \mid n_1,h_1)\) of the kernel density estimate \(\hat f(\cdot \mid n,h)\) and proves, in particular, that quantities like \[ (1)\quad E[\hat f(x\mid n_1,h_1)-f(x)]^p \quad\text{and}\quad (2)\quad E[\hat f^*(x\mid n_1,h_1)-\hat f(x\mid n,h)]^p \] (which obviously take account of both the variance and the bias of \(\hat f\) and \(\hat f^*\)) are close to each other for \(n_1 < c\,n^{1-\delta}\). The same statement is proved for the integrated (in \(x\)) versions of (1) and (2). Some emphasis is placed on the problem of bootstrap estimation of bias.
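    The resampling scheme described above lends itself to a brief illustration. The following is a minimal sketch, assuming a Gaussian kernel and an ordinary resampling of the smaller size \(n_1\) with replacement; it is not code from the paper. It approximates quantity (2) by Monte Carlo: resamples of size \(n_1 < n\) are drawn, the kernel estimate with bandwidth \(h_1\) is computed on each resample and compared with the full-sample estimate, and the smoothing parameter is chosen to minimise the integrated bootstrap estimate of mean squared error. The pilot bandwidth h_pilot, the evaluation grid, and the names kde, bootstrap_mse and select_bandwidth are illustrative choices, not notation from the paper.

    import numpy as np

    def kde(x_eval, sample, h):
        # Gaussian kernel density estimate \hat f(x | n, h) evaluated on the grid x_eval.
        u = (x_eval[:, None] - sample[None, :]) / h
        return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(sample) * h * np.sqrt(2.0 * np.pi))

    def bootstrap_mse(sample, x_eval, h1, n1, h_pilot, n_boot=200, seed=0):
        # Monte Carlo approximation of quantity (2): the mean squared distance between
        # the bootstrap estimate on a resample of size n1 < n and the full-sample estimate.
        rng = np.random.default_rng(seed)
        f_hat = kde(x_eval, sample, h_pilot)          # full-sample reference estimate
        sq_err = np.zeros(len(x_eval))
        for _ in range(n_boot):
            resample = rng.choice(sample, size=n1, replace=True)
            sq_err += (kde(x_eval, resample, h1) - f_hat) ** 2
        return sq_err / n_boot

    def select_bandwidth(sample, h_grid, n1, h_pilot, x_eval):
        # Choose the smoothing parameter minimising the integrated bootstrap MSE
        # (a Riemann sum over the equally spaced grid x_eval).
        dx = x_eval[1] - x_eval[0]
        imse = [bootstrap_mse(sample, x_eval, h, n1, h_pilot).sum() * dx for h in h_grid]
        return h_grid[int(np.argmin(imse))]

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        data = rng.normal(size=400)                   # toy sample of size n = 400
        x = np.linspace(-4.0, 4.0, 201)
        h1 = select_bandwidth(data, h_grid=np.linspace(0.1, 1.0, 10),
                              n1=100, h_pilot=0.4, x_eval=x)
        print("selected bandwidth h1:", h1)

    As the review indicates, taking \(n_1\) substantially smaller than \(n\) (of order \(n^{1-\delta}\)) is what makes quantity (2) track quantity (1), so a criterion of this kind reflects the bias of the estimator as well as its variance.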
    mean squared error
    smoothing parameter
    density estimation
    bootstrap sample size
    Lp-distances
    nonparametric regression
    tail parameter estimation
    kernel density estimate
    bootstrap estimation of a bias

    Identifiers