Bug in `get_domain_of_finite_discrete_rv` of `Categorical` #331
Comments
Hello @ricardoV94 @jessegrabowski! Is this still an issue? I would like to help :D I tried to reproduce the error, but the example is incomplete. Nonetheless, I tried with the following, adapted from this tutorial:

```python
import numpy as np
import pandas as pd
import pymc as pm
import pymc_extras as pmx

rng = np.random.default_rng(32)

# fmt: off
disaster_data = pd.Series(
    [4, 5, 4, 0, 1, 4, 3, 4, 0, 6, 3, 3, 4, 0, 2, 6,
     3, 3, 5, 4, 5, 3, 1, 4, 4, 1, 5, 5, 3, 4, 2, 5,
     2, 2, 3, 4, 2, 1, 3, 0, 2, 1, 1, 1, 1, 3, 0, 0,
     1, 0, 1, 1, 0, 0, 3, 1, 0, 3, 2, 2, 0, 1, 1, 1,
     0, 1, 0, 1, 0, 0, 0, 2, 1, 0, 0, 0, 1, 1, 0, 2,
     3, 3, 1, 0, 2, 1, 1, 1, 1, 2, 4, 2, 0, 0, 1, 4,
     0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1]
)
# fmt: on
years = np.arange(1851, 1962)

with pm.Model() as disaster_model:
    switchpoint = pm.DiscreteUniform("switchpoint", lower=years.min(), upper=years.max())
    early_rate = pm.Exponential("early_rate", 1.0)
    late_rate = pm.Exponential("late_rate", 1.0)
    rate = pm.math.switch(switchpoint >= years, early_rate, late_rate)
    disasters = pm.Poisson("disasters", rate, observed=disaster_data)

with disaster_model:
    before_marg = pm.sample(random_seed=rng)

# Note: marginalize takes the model, not the data
# (the snippet above originally passed disaster_data here).
disaster_model_marginalized = pmx.marginalize(disaster_model, [switchpoint])

with disaster_model_marginalized:
    after_marg = pm.sample(random_seed=rng)
```

Some observations:
Am I doing something wrong?

Error 1

This is actually a separate bug that is being tracked here. You can avoid it for now by setting …
Reported by @jessegrabowski:

Instead of trying to get the vector length of `p_param` (which assumes `p` is always a vector), we should be constant folding `p_param.shape[-1]`.