The minimum \(S\)-divergence estimator under continuous models: the Basu-Lindsay approach (Q2359161)

From MaRDI portal
Identifiers

    MaRDI profile type: MaRDI publication profile
    OpenAlex ID: W1786325621
    arXiv ID: 1408.1239

Cited works

    Dual divergence estimators and tests: robustness results
    Robust and efficient estimation by minimising a density power divergence
    Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
    Statistical Inference
    Minimum Hellinger distance estimates for parametric models
    Q3687500
    Q5580053
    Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
    Q5737389
    The ``automatic'' robustness of minimum distance functionals
    Robust estimation for independent non-homogeneous observations using density power divergence with applications to linear regression
    Robust Bayes estimation using the density power divergence
    Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach
    On the robustness of a divergence based test of simple statistical hypotheses
    A generalized divergence for statistical inference
    A comparison of related density-based minimum divergence estimators
    Selection of tuning parameters in bridge regression models via Bayesian information criterion
    On Information and Sufficiency
    Divergence-based estimation and testing with misclassified data
    Consistency of minimizing a penalized density power divergence estimator for mixing distribution
    Efficiency versus robustness: The case for minimum Hellinger distance and related methods
    Minimum phi-divergence estimators for loglinear models with linear constraints and multinomial sampling
    Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study
    Two approaches to grouping of data and related disparity statistics
    Preliminary phi-divergence test estimators for linear restrictions in a logistic regression model
    Robust Blind Source Separation by Beta Divergence
    Minimum \(\phi\)-divergence estimator in logistic regression models
    Robust sparse regression and tuning parameter selection via the efficient bootstrap information criteria
    The power divergence and the density power divergence families: the mathematical connection
    Robust asymptotic statistics
    Minimum Hellinger Distance Estimation for the Analysis of Count Data
    Do robust estimators work with real data?
    Minimum Hellinger Distance Estimation for Multivariate Location and Covariance
    Choosing a robustness tuning parameter
    Consistency of the kernel density estimator: a survey

Latest revision as of 00:33, 14 July 2024

scientific article

    Language: English
    Title: The minimum \(S\)-divergence estimator under continuous models: the Basu-Lindsay approach
    Publication date: 27 June 2017
    Keywords: minimum \(S\)-divergence estimator; robustness; continuous model; Basu-Lindsay approach