Conditional inferential models: combining information for prior-free probabilistic inference

Publication: 5379906

DOI: 10.1111/RSSB.12070
zbMATH Open: 1414.62093
arXiv: 1211.1530
OpenAlex: W1965975525
MaRDI QID: Q5379906
FDO: Q5379906


Authors: Ryan Martin, Chuanhai Liu


Publication date: 14 June 2019

Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology

Abstract: The inferential model (IM) framework provides valid prior-free probabilistic inference by focusing on predicting unobserved auxiliary variables. However, efficient IM-based inference can be challenging when the auxiliary variable is of higher dimension than the parameter. Here we show that features of the auxiliary variable are often fully observed and that, in such cases, a simultaneous dimension reduction and information aggregation can be achieved by conditioning. The proposed conditioning strategy leads to efficient IM inference and casts new light on Fisher's notions of sufficiency and conditioning, as well as on Bayesian inference. A differential equation-driven selection of a conditional association is developed, and validity of the conditional IM is proved under some conditions. For problems that do not admit a valid conditional IM of the standard form, we propose a more flexible class of conditional IMs based on localization. Examples of local conditional IMs in a bivariate normal model and a normal variance components model are also given.
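The dimension-reduction idea in the abstract can be sketched in symbols. This is an illustrative reconstruction under standard IM notation and the textbook normal-mean example, not an excerpt from the paper itself:

```latex
% Baseline association: data X, parameter \theta, auxiliary variable U \sim P_U.
\[
  X = a(\theta, U), \qquad U \sim \mathsf{P}_U .
\]
% If \dim(U) > \dim(\theta), some feature of U may be fully observed. For
% instance, with X_1,\dots,X_n iid N(\theta, 1), write X_i = \theta + U_i,
% U_i iid N(0,1); then the contrasts
\[
  X_i - \bar{X} = U_i - \bar{U}, \qquad i = 1, \dots, n,
\]
% are determined by the data alone, so U_i - \bar{U} is observed exactly.
% Conditioning on these contrasts reduces the prediction problem to the
% one-dimensional \bar{U} in the association
\[
  \bar{X} = \theta + \bar{U}, \qquad \bar{U} \sim \mathsf{N}(0,\, n^{-1}).
\]
```

In this example the conditioning step recovers the usual sufficiency-based reduction to the sample mean, which is the sense in which the abstract says the strategy casts new light on Fisher's notions of sufficiency and conditioning.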


Full work available at URL: https://arxiv.org/abs/1211.1530

Cited In (20)