Estimating a smooth monotone regression function (Q1175381)

From MaRDI portal
Property / DOI: 10.1214/aos/1176348117

Property / full work available at URL: https://doi.org/10.1214/aos/1176348117

Property / OpenAlex ID: W2034818769
Property / Recommended article: A simple nonparametric estimator of a strictly monotone regression function (Similarity Score: 0.82176167)

Property / Recommended article: A comparative study of monotone nonparametric kernel estimates (Similarity Score: 0.81292725)

Property / Recommended article: Nonparametric kernel regression subject to monotonicity constraints (Similarity Score: 0.8110243)

Property / Recommended article: Recent progress in the nonparametric estimation of monotone curves -- with applications to bioassay and environmental risk assessment (Similarity Score: 0.80224633)

Property / Recommended article: Isotonic regression under Lipschitz constraint (Similarity Score: 0.79637843)

Property / Recommended article: Sharp asymptotics for isotonic regression (Similarity Score: 0.78540075)

scientific article

    Language: English
    Label: Estimating a smooth monotone regression function
    Description: scientific article

    Statements

    Estimating a smooth monotone regression function (English)
    25 June 1992
    The problem of estimating a smooth monotone regression function \(m\) is studied. Two estimators, \(m_{SI}\) and \(m_{IS}\), are compared. \(m_{SI}\) consists of two steps: (i) smoothing of the data by a kernel estimator, (ii) isotonisation of the resulting estimate by the pool adjacent violator algorithm. The estimator \(m_{IS}\) is constructed by interchanging these two steps. The author considers the asymptotic behaviour of these estimators at a fixed point \(x_0\) where the function \(m\) is assumed to be strictly monotone and smooth.
    It is shown that if the bandwidth of the kernel estimator is chosen of the optimal order \(n^{-1/5}\), the estimation errors of \(m_{SI}(x_0)\) and \(m_{IS}(x_0)\) are of order \(n^{-2/5}\), and the two estimators are asymptotically equivalent to first order. The difference \(m_{SI}(x_0)-m_{IS}(x_0)\), however, is of the only slightly smaller order \(n^{-8/15}\). Theorem 3 of the paper gives stochastic higher order expansions for \(m_{SI}(x_0)\) and \(m_{IS}(x_0)\). These expansions entail that \(m_{IS}(x_0)\) always has a smaller variance and a larger bias than \(m_{SI}(x_0)\). Furthermore, it is shown that the kernel function \(K\) of the chosen kernel estimator mainly determines whether one should prefer \(m_{SI}\) or \(m_{IS}\): if the bandwidth of the kernel estimator \(m_S\) is chosen such that the mean squared error is asymptotically minimized, then \(m_{IS}(x_0)\) has asymptotically smaller mean squared error than \(m_{SI}(x_0)\) if and only if \[ \int K^2(t)\,dt\left[\int t^2K(t)\,dt\int K'(t)^2\,dt\right]^{-1} \] is smaller than a universal constant.
    For related literature see \textit{K. Cheng} and \textit{P. Lin}, Z. Wahrscheinlichkeitstheor. Verw. Geb. 57, 223-233 (1981; Zbl 0443.62029), and \textit{R. E. Barlow}, \textit{D. J. Bartholomew}, \textit{J. M. Bremner} and \textit{H. D. Brunk}, Statistical inference under order restrictions. The theory and application of isotonic regression (1972; Zbl 0246.62038).
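    To make the construction concrete, the following is a minimal sketch (an illustration, not the paper's code) of the two composite estimators discussed above: \(m_{SI}\) smooths the data and then isotonises the result with the pool adjacent violator algorithm, while \(m_{IS}\) applies the two steps in the opposite order. The Nadaraya-Watson form of the kernel smoother, the Epanechnikov kernel, the simulated data and the bandwidth \(h=n^{-1/5}\) are assumptions made for this sketch; the numerically evaluated ratio at the end is the kernel criterion displayed above.

# Minimal sketch of m_SI (smooth, then isotonise) and m_IS (isotonise, then
# smooth).  Illustration only: the Nadaraya-Watson smoother, the Epanechnikov
# kernel, the simulated data and the bandwidth h = n^(-1/5) are assumptions.
import numpy as np


def epanechnikov(t):
    """Epanechnikov kernel K(t) = 3/4 (1 - t^2) on [-1, 1]."""
    return np.where(np.abs(t) <= 1, 0.75 * (1 - t ** 2), 0.0)


def kernel_smooth(x, y, x_eval, h):
    """Kernel smoother m_S (Nadaraya-Watson form) with bandwidth h."""
    w = epanechnikov((x_eval[:, None] - x[None, :]) / h)
    return (w @ y) / np.maximum(w.sum(axis=1), 1e-12)


def pava(y):
    """Pool adjacent violator algorithm: non-decreasing fit to y."""
    blocks = []  # each block is [pooled mean, number of pooled points]
    for v in np.asarray(y, dtype=float):
        blocks.append([v, 1])
        # pool adjacent blocks while they violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, n2 = blocks.pop()
            v1, n1 = blocks.pop()
            blocks.append([(n1 * v1 + n2 * v2) / (n1 + n2), n1 + n2])
    return np.concatenate([np.full(k, v) for v, k in blocks])


# Illustrative data: strictly increasing m(x) = sin(pi x / 2) plus noise.
rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(np.pi * x / 2) + 0.2 * rng.standard_normal(n)
h = n ** (-1 / 5)                       # bandwidth of the optimal order

m_SI = pava(kernel_smooth(x, y, x, h))  # smooth, then isotonise
m_IS = kernel_smooth(x, pava(y), x, h)  # isotonise, then smooth

# Kernel criterion from the review, evaluated numerically for this K
# (K'(t) = -3t/2 on [-1, 1]); the exact value for the Epanechnikov kernel is 2.
t = np.linspace(-1.0, 1.0, 200001)
dt = t[1] - t[0]
ratio = np.sum(epanechnikov(t) ** 2) * dt / (
    (np.sum(t ** 2 * epanechnikov(t)) * dt) * (np.sum((-1.5 * t) ** 2) * dt)
)
print(round(float(ratio), 3))

    Which of the two estimators has the smaller asymptotic mean squared error at \(x_0\) then depends on how this ratio compares with the universal constant mentioned in the review.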
    nonparametric regression
    isotonic regression
    estimating a smooth monotone regression function
    kernel estimator
    pool adjacent violator algorithm
    bandwidth
    stochastic higher order expansions
    mean squared error

    Identifiers