examples/howto/marginalizing-models.myst.md
# Automatic marginalization of discrete variables
PyMC is very amenable to sampling models with discrete latent variables. But if you insist on using the NUTS sampler, you will need to get rid of your discrete variables somehow. The best way to do this is by marginalizing them out, as then you benefit from the Rao-Blackwell theorem and get a lower-variance estimate of your parameters.
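To see what marginalization buys you, note that if a discrete variable $z$ takes values $1, \dots, K$, the marginal likelihood is simply a weighted sum over its support (a general identity, written here in generic notation rather than this notebook's variable names):

$$
p(y \mid \theta) = \sum_{k=1}^{K} p(z = k \mid \theta)\, p(y \mid z = k, \theta)
$$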
Unfortunately, the computation to do this is often tedious and unintuitive. Luckily, `pmx` now supports a way to do this work automatically!
```{code-cell} ipython3
import arviz as az
import pymc as pm
import pymc_experimental as pmx
```

```{code-cell} ipython3
with pmx.MarginalModel() as mixture_model:
    # (priors for the component means `mu` and the discrete choice `idx`
    # are defined here, along with the rest of the model)
    y = pm.Normal("y", mu=mu[idx], sigma=1.0)
```
As we can see, there are already two ways to specify the same model: one where the choice of mixture component is explicit, and the other where we use the built-in `NormalMixture` distribution, so that the choice is no longer part of our model. There is nothing unique about the first model other than that we initialize it with `pmx.MarginalModel` instead of `pm.Model`. This different class is what will allow us to marginalize out variables later.
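To make the equivalence concrete, here is a minimal sketch of the two specifications side by side. The priors and the mixing weight below are hypothetical stand-ins (the notebook's actual values are elided above); the structure, not the numbers, is the point:

```{code-cell} ipython3
# Hypothetical priors -- stand-ins for the notebook's elided values.
w = 0.5  # assumed mixing weight

with pmx.MarginalModel() as explicit_model:
    mu = pm.Normal("mu", mu=0.0, sigma=3.0, shape=2)
    idx = pm.Bernoulli("idx", p=w)  # the explicit discrete choice
    y = pm.Normal("y", mu=mu[idx], sigma=1.0)

with pm.Model() as builtin_model:
    mu = pm.Normal("mu", mu=0.0, sigma=3.0, shape=2)
    # NormalMixture sums over the component choice internally
    y = pm.NormalMixture("y", w=[1 - w, w], mu=mu, sigma=1.0)

# Marginalizing the explicit model is then a single call
# (MarginalModel.marginalize, from pymc-experimental):
explicit_model.marginalize(["idx"])
```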
One important thing to notice is that this discrete variable has a lower ESS, particularly in the tails. This means `idx` might not be estimated well there. If this is important, I recommend using `lp_idx` instead, which is the log-probability of `idx` given the sampled values on each iteration.
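For inspecting this, here is a hedged sketch, assuming `idata` holds the posterior sampled from the marginalized model and using `MarginalModel.recover_marginals`, the pymc-experimental method that produces the `idx` and `lp_idx` draws discussed above:

```{code-cell} ipython3
# Assumes `idata` came from pm.sample() on the marginalized model.
mixture_model.recover_marginals(idata)  # adds `idx` and `lp_idx` draws

# The discrete `idx` draws typically show a lower ESS than the
# smooth `lp_idx` values, especially in the tails.
az.ess(idata, var_names=["idx", "lp_idx"])
```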