Objective Bayesian inference with proper scoring rules
From MaRDI portal
Publication:2273176
Abstract: Standard Bayesian analyses can be difficult to perform when the full likelihood, and consequently the full posterior distribution, is too complex or difficult to specify, or if robustness with respect to data or to model misspecification is required. In these situations, we suggest resorting to a posterior distribution for the parameter of interest based on proper scoring rules. Scoring rules are loss functions designed to measure the quality of a probability distribution for a random variable, given its observed value. Important examples are the Tsallis score and the Hyvärinen score, which allow us to deal with model misspecification or with complex models. The full and the composite likelihoods are both special instances of scoring rules. The aim of this paper is twofold. Firstly, we discuss the use of scoring rules in the Bayes formula in order to compute a posterior distribution, named the SR-posterior distribution, and we derive its asymptotic normality. Secondly, we propose a procedure for building default priors for the unknown parameter of interest that can be used to update the information provided by the scoring rule in the SR-posterior distribution. In particular, a reference prior is obtained by maximizing the average α-divergence from the SR-posterior distribution. For 0 ≤ |α| < 1, the result is a Jeffreys-type prior that is proportional to the square root of the determinant of the Godambe information matrix associated with the scoring rule. Some examples are discussed.
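The SR-posterior described in the abstract replaces the likelihood in Bayes' formula with exp(−total score). A minimal numerical sketch of this idea, using the Tsallis score for a normal location model with known scale: the tuning constant γ = 1.5, the flat prior, the grid approximation, and all function names are illustrative choices for this sketch, not details taken from the paper.

```python
import numpy as np

def tsallis_score(y, mu, sigma=1.0, gamma=1.5):
    """Tsallis score of a N(mu, sigma^2) density f at observation y:
    S(y, f) = (gamma - 1) * int f^gamma dx - gamma * f(y)^(gamma - 1).
    For the normal density, int f^gamma dx has the closed form below."""
    f_y = np.exp(-(y - mu) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    int_f_gamma = (2 * np.pi * sigma**2) ** ((1 - gamma) / 2) / np.sqrt(gamma)
    return (gamma - 1) * int_f_gamma - gamma * f_y ** (gamma - 1)

def sr_posterior(data, mu_grid, gamma=1.5):
    """Unnormalized SR-posterior on a grid under a flat prior:
    pi(mu | y) proportional to exp(-sum_i S(y_i, mu))."""
    log_post = -np.array(
        [tsallis_score(data, mu, gamma=gamma).sum() for mu in mu_grid]
    )
    log_post -= log_post.max()  # stabilise before exponentiating
    post = np.exp(log_post)
    return post / (post.sum() * (mu_grid[1] - mu_grid[0]))  # numeric normalisation

# Data from N(0, 1) contaminated with 5% gross outliers at 10.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])
mu_grid = np.linspace(-2.0, 4.0, 601)
post = sr_posterior(data, mu_grid)
mode = mu_grid[post.argmax()]
print(f"sample mean: {data.mean():.2f}, SR-posterior mode: {mode:.2f}")
```

Because the Tsallis score downweights observations where the fitted density is tiny, the SR-posterior mode stays near the bulk of the data, while the sample mean (the posterior mode under the full Gaussian likelihood with a flat prior) is dragged towards the outliers. A calibration weight multiplying the total score, as discussed in the composite-likelihood literature, is omitted here for brevity.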
Cites work
- scientific article; zbMATH DE number 5849508 (title unavailable)
- scientific article; zbMATH DE number 5555137 (title unavailable)
- scientific article; zbMATH DE number 3954047 (title unavailable)
- scientific article; zbMATH DE number 1375577 (title unavailable)
- scientific article; zbMATH DE number 3667770 (title unavailable)
- scientific article; zbMATH DE number 3626416 (title unavailable)
- scientific article; zbMATH DE number 3189754 (title unavailable)
- A characterization of monotone and regular divergences
- A general divergence criterion for prior selection
- A parametric framework for the comparison of methods of very robust regression
- Adjusting composite likelihood ratio statistics
- An Optimum Property of Regular Maximum Likelihood Estimation
- An extended Gaussian max-stable process model for spatial extremes
- Approximate Bayesian computation with composite score functions
- Bayesian and frequentist confidence intervals arising from empirical-type likelihoods
- Bayesian composite marginal likelihoods
- Bayesian empirical likelihood
- Bayesian empirical likelihood for quantile regression
- Bayesian exponentially tilted empirical likelihood
- Bayesian inference from composite likelihoods, with an application to spatial extremes
- Bayesian information in an experiment and the Fisher information distance
- Bayesian model selection based on proper scoring rules
- Bootstrap adjustments of signed scoring rule root statistics
- Characterization of priors under which Bayesian and frequentist Bartlett corrections are equivalent in the multiparameter case
- Contrasting probabilistic scoring rules
- Default prior distributions from quasi- and quasi-profile likelihoods
- Estimation of non-normalized statistical models by score matching
- Higher-order asymptotics for scoring rules
- Inference for clustered data using the independence loglikelihood
- Minimum scoring rule inference
- Objective priors: an introduction for frequentists
- On divergence measures leading to Jeffreys and other reference priors
- Overall objective priors
- Possible generalization of Boltzmann-Gibbs statistics.
- Probability matching priors: Higher order asymptotics
- Proper local scoring rules
- Pseudo-Likelihoods for Bayesian Inference
- Quasi Bayesian likelihood
- Rejoinder: ``The case for objective Bayesian analysis
- Robust Bayes estimation using the density power divergence
- Robust Statistics
- Robust and efficient estimation by minimising a density power divergence
- Robust likelihood functions in Bayesian inference
- Some extensions of score matching
- The formal definition of reference priors
- The geometry of proper scoring rules
- Theory and applications of proper scoring rules
Cited in (10)
- Bayesian model selection based on proper scoring rules
- Objective Bayes, conditional inference and the signed root likelihood ratio statistic
- Recent advances in directional statistics
- Minimum scoring rule inference
- Generalized Bayesian Inference for Discrete Intractable Likelihood
- Generalized Bayesian likelihood-free inference
- On a class of objective priors from scoring rules (with discussion)
- Preposterior expected loss as a scoring rule for prior distributions
- Robust confidence distributions from proper scoring rules
- Robust inference for non-linear regression models from the Tsallis score: application to coronavirus disease 2019 Contagion in Italy