Pages that link to "Item:Q3492635"
From MaRDI portal
The following pages link to Information-theoretic asymptotics of Bayes methods (Q3492635):
Displaying 50 items.
- Information optimality and Bayesian modelling (Q280206)
- Metabolic cost of neuronal information in an empirical stimulus-response model (Q353894)
- Moment matching priors (Q354215)
- A philosophical treatise of universal induction (Q400871)
- Coincidences and estimation of entropies of random variables with large cardinalities (Q400965)
- Objective priors: an introduction for frequentists (Q449810)
- Discussion on "Objective priors: an introduction for frequentists" by M. Ghosh (Q449813)
- Jeffreys versus Shtarkov distributions associated with some natural exponential families (Q537447)
- Bernstein-von Mises theorems for Gaussian regression with increasing number of regressors (Q661171)
- Reference optimality criterion for planning accelerated life testing (Q897620)
- On divergence measures leading to Jeffreys and other reference priors (Q899027)
- A general divergence criterion for prior selection (Q907083)
- Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss (Q947252)
- Consistency of discrete Bayesian learning (Q950201)
- Universal approximation of multi-copy states and universal quantum lossless data compression (Q981691)
- Flexible covariance estimation in graphical Gaussian models (Q1000308)
- A minimum description length approach to hidden Markov models with Poisson and Gaussian emissions. Application to order identification (Q1007478)
- Decision theoretic generalizations of the PAC model for neural net and other learning applications (Q1198550)
- Reference priors for prediction (Q1299366)
- Jeffreys' prior is asymptotically least favorable under entropy risk (Q1333128)
- On-line maximum likelihood prediction with respect to general loss functions (Q1370862)
- Mutual information, metric entropy and cumulative relative entropy risk (Q1383090)
- Information-theoretic determination of minimax rates of convergence (Q1578277)
- Combining different procedures for adaptive regression (Q1582634)
- Accuracy of latent-variable estimation in Bayesian semi-supervised learning (Q1669141)
- Asymptotic accuracy of Bayesian estimation for a single latent variable (Q1669142)
- Ensuring privacy with constrained additive noise by minimizing Fisher information (Q1716667)
- Minimum message length inference of the Poisson and geometric models using heavy-tailed prior distributions (Q1799691)
- Mixing strategies for density estimation. (Q1848770)
- Partial information reference priors: Derivation and interpretations (Q1877838)
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory (Q1879959)
- Information tradeoff (Q1906305)
- A Bernstein-von Mises theorem for discrete probability distributions (Q1951969)
- Reference priors for exponential families with increasing dimension (Q1952080)
- Shannon optimal priors on independent identically distributed statistical experiments converge weakly to Jeffrey's prior (Q1962695)
- A note on the confidence properties of reference priors for the calibration model (Q1962699)
- Comparative noninformativities of quantum priors based on monotone metrics (Q1966800)
- Price probabilities: a class of Bayesian and non-Bayesian prediction rules (Q2059056)
- When does ambiguity fade away? (Q2208852)
- Discrepancy risk model selection test theory for comparing possibly misspecified or nonnested models (Q2259869)
- Effects of additional data on Bayesian clustering (Q2292225)
- Special feature: Information theory and statistics (Q2303490)
- On universal prediction and Bayesian confirmation (Q2382281)
- Universal coding for classical-quantum channel (Q2391144)
- Stochastic complexity for mixture of exponential families in generalized variational Bayes (Q2465031)
- Entropy bounds on Bayesian learning (Q2468506)
- Nonsubjective priors via predictive relative entropy regret (Q2493559)
- An empirical study of minimum description length model selection with infinite parametric complexity (Q2507908)
- Predictability, Complexity, and Learning (Q2784814)
- The Newsvendor under Demand Ambiguity: Combining Data with Moment and Tail Information (Q2806068)