Nonparametric learning from Bayesian models with randomized objective functions (Q70214)
scientific article from arXiv
Language | Label | Description | Also known as |
---|---|---|---|
English | Nonparametric learning from Bayesian models with randomized objective functions | scientific article from arXiv | |
Statements
29 June 2018
stat.ML
cs.LG
stat.ME
Understanding Uncertainty with Math: Imagine you have a math model that tries to guess how some data was created. To make the model better, you update its "best guess" (called a parameter) based on new data. This process uses a special tool called a "mixture of Dirichlet processes" to handle uncertainty about the true source of the data. When new data comes in, this tool helps adjust the model's guess by balancing old assumptions with new information. Essentially, it's a way to refine a math model's accuracy by embracing and updating its uncertainties as more data becomes available. (English)
This article discusses a nonparametric Bayesian approach to updating a model parameter's posterior distribution given data. The parameter of interest $\theta$ is defined by minimizing the KL divergence between the true data-generating mechanism and a parametric family $\mathcal{F}_{\Theta}$. A mixture of Dirichlet processes (MDP) prior, $[F|\theta] \sim \text{DP}(c, f_\theta(\cdot)); \quad \theta \sim \pi(\theta)$, is used to specify beliefs about the true mechanism $F_0$. Given data $x_{1:n}$, the posterior for $F$ is updated using the DP's conjugacy. The posterior distribution of $\theta$, $\tilde{\pi}(\theta|x_{1:n})$, is then obtained by integrating over the possible $F$. (English)
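A minimal sketch of the two steps referenced in the summary, assuming the standard Dirichlet process conjugate update; the empirical distribution $F_n(\cdot) = \frac{1}{n}\sum_{i=1}^{n} \delta_{x_i}(\cdot)$ and the map $\theta^{*}(F)$ are notation introduced here for illustration only:

$$
[F \mid \theta, x_{1:n}] \sim \text{DP}\!\left(c + n,\ \frac{c\, f_\theta(\cdot) + n\, F_n(\cdot)}{c + n}\right),
\qquad
\theta^{*}(F) = \arg\min_{\theta \in \Theta} \text{KL}\big(F \,\|\, f_\theta\big),
$$

so that, under this reading, $\tilde{\pi}(\theta \mid x_{1:n})$ is the distribution of $\theta^{*}(F)$ when $F$ is drawn from the resulting posterior law of $F$, i.e. the "integration over possible $F$" mentioned above.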