We are now ready to perform inference on our HMM with PyMC. We will define priors for each model parameter and use {class}`~pymc.Potential` to add the joint log-likelihood term to our model.
```{code-cell} ipython3
with pm.Model() as model:
    ...  # priors for each parameter and the pm.Potential log-likelihood term (not shown in this excerpt)
```
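As a loose illustration of that pattern only (not the model defined in this notebook), the sketch below builds a toy two-state Gaussian-emission HMM: priors for the emission and transition parameters, a forward-algorithm log-likelihood computed in log space, and a {class}`~pymc.Potential` term that adds it to the model. All names here (`toy_model`, `hmm_loglike`, the toy `obs` sequence) are placeholders.

```python
import numpy as np
import pymc as pm

# Toy observed sequence standing in for the data used earlier in the notebook.
obs = np.array([0.1, -0.2, 1.3, 1.1, 0.0, 1.4])
n_states = 2

with pm.Model() as toy_model:
    # Priors for the emission parameters of each hidden state.
    mu = pm.Normal("mu", mu=0.0, sigma=3.0, shape=n_states)
    sigma = pm.HalfNormal("sigma", sigma=1.0)

    # Priors for the initial-state distribution and the rows of the transition matrix.
    p_init = pm.Dirichlet("p_init", a=np.ones(n_states))
    p_trans = pm.Dirichlet("p_trans", a=np.ones((n_states, n_states)))

    # Per-observation, per-state emission log-densities, shape (T, n_states).
    emission_logp = pm.logp(pm.Normal.dist(mu=mu, sigma=sigma), obs[:, None])

    # Forward algorithm in log space; a Python loop is fine for a short toy sequence.
    log_alpha = pm.math.log(p_init) + emission_logp[0]
    for t in range(1, len(obs)):
        log_alpha = (
            pm.math.logsumexp(
                log_alpha[:, None] + pm.math.log(p_trans), axis=0, keepdims=False
            )
            + emission_logp[t]
        )

    # Add the joint log-likelihood of the whole sequence to the model.
    pm.Potential("hmm_loglike", pm.math.logsumexp(log_alpha, axis=0, keepdims=False))
```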
Before we start sampling, we check the log-probability of each variable at the model's initial point. Bugs tend to manifest as `nan` or `-inf` values in these initial log-probabilities.
```{code-cell} ipython3
initial_point = model.initial_point()
initial_point
```
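One way to carry out that check (a minimal usage sketch, reusing the `initial_point` computed above) is {meth}`~pymc.Model.point_logps`, which reports the log-probability contributed by each variable:

```python
# Log-probability of each model variable evaluated at the initial point;
# nan or -inf values here usually point to a model specification bug.
model.point_logps(initial_point)
```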
We can also compile a JAX function that computes the log probability of each variable in our PyMC model, similar to {meth}`~pymc.Model.point_logps`. We will use the helper method {meth}`~pymc.Model.compile_fn`.
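A minimal sketch of how that might look, assuming PyMC's JAX backend is available; the function name and the exact outputs passed to `compile_fn` are illustrative:

```python
# Compile the per-variable model log-probability terms with the JAX backend
# and evaluate them at the initial point computed earlier.
point_logps_fn = model.compile_fn(model.logp(sum=False), mode="JAX")
point_logps_fn(initial_point)
```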