Local linear regression with reciprocal inverse Gaussian kernel (Q2312036)

From MaRDI portal
scientific article

    Statements

    Local linear regression with reciprocal inverse Gaussian kernel (English)
    4 July 2019
    The regression model \(Y=m(X)+\varepsilon\) is considered, with scalar response \(Y\) and scalar covariate \(X\) supported on \((0,\infty)\), where the error term satisfies the standard assumptions \(E(\varepsilon|X=x)=0\) and \(E(\varepsilon^2|X=x)>0\) for almost all \(x\in(0,\infty)\), and \(m(x)=E(Y|X=x)\) is the regression function to be estimated. Based on the reciprocal inverse Gaussian kernel, local linear estimators of \(m(x)\) and \(m'(x)\) are constructed whose support matches the support of \(X\). The conditional mean-squared error of the estimators is derived, and their asymptotic properties are studied, including asymptotic normality and uniform almost sure convergence on any closed subinterval of \((0,\infty)\). As with other nonparametric estimators, the asymptotic normality involves a non-negligible bias term; this bias can be removed by undersmoothing, i.e., choosing the bandwidth \(h\) such that \(nh^{5/2}\to 0\). The finite-sample performance of the proposed estimators is evaluated via simulation studies and a real-data application. The proposed estimators are compared with local linear estimators based on the normal kernel, and the situations are indicated where the proposed estimators perform better and where they perform worse.
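    The local linear construction described above can be sketched in a few lines: at each point \(x\), a weighted least-squares line is fitted to the data, with weights supplied by a reciprocal inverse Gaussian density evaluated at the observed covariates; the fitted intercept estimates \(m(x)\) and the fitted slope estimates \(m'(x)\). The sketch below is a minimal illustration, not the authors' implementation; the particular RIG kernel parametrization follows the form popularized for asymmetric-kernel density estimation (valid for \(t>0\) and \(x>b\)), and the function names are hypothetical.

    ```python
    import numpy as np

    def rig_kernel(t, x, b):
        # Reciprocal inverse Gaussian kernel weight at observations t,
        # centered at evaluation point x with bandwidth b (assumed form,
        # valid for t > 0 and x > b).
        c = (x - b) / (2.0 * b)
        return np.exp(-c * (t / (x - b) - 2.0 + (x - b) / t)) / np.sqrt(
            2.0 * np.pi * b * t)

    def local_linear_rig(x, X, Y, b):
        """Return (m_hat, m_prime_hat): the local linear estimates of the
        regression function and its derivative at a point x > b."""
        w = rig_kernel(X, x, b)          # kernel weights
        d = X - x                        # centered covariates
        # Weighted sums entering the normal equations of the local fit.
        S0, S1, S2 = w.sum(), (w * d).sum(), (w * d**2).sum()
        T0, T1 = (w * Y).sum(), (w * d * Y).sum()
        det = S0 * S2 - S1**2
        m_hat = (S2 * T0 - S1 * T1) / det        # intercept -> m(x)
        m_prime_hat = (S0 * T1 - S1 * T0) / det  # slope -> m'(x)
        return m_hat, m_prime_hat
    ```

    A convenient sanity check: on exactly linear, noise-free data the local linear fit reproduces the line for any valid kernel and bandwidth, so `local_linear_rig(1.5, X, 2*X + 1, 0.2)` should return values near \(4.0\) and \(2.0\).
    
    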
    reciprocal inverse Gaussian
    asymptotic normality
    uniform almost sure convergence
    local linear smoothers

    Identifiers