Implemented get_first_level_conditionals to try to get rid of the conditional_on attribute that is added to every distribution. The function does a breadth-first search on the node's logpt (or transformed.logpt) graph, looking for named nodes that are different from the root node and from the node's transformed counterpart, and are not a TensorConstant or SharedVariable. Each branch is searched only until its first named node is found, so the search returns the parent conditionals of the root node, i.e. the nodes that are one step away from it in the Bayesian network.

However, this ran into a problem with Mixture classes. These splice the logpt graph of their comp_dists into their own logpt graph, so the first-level conditionals of the component distributions are also reported as first-level conditionals of the root. Furthermore, many copies of nodes created by the added logpt ended up being inserted into the computed conditional_on. This led to a very strange error in which loops appeared in the DAG and node depths became wrong; in particular, there were no depth-0 nodes. My view is that the explicit conditional_on attribute prevents problems like this one from happening, so I left it as is, to be discussed.

Other changes in this commit: test_exact_step for SMC now uses draw_values on a hierarchy, and since the behavior of draw_values changed in hierarchical situations, the exact trace values had to be adjusted as well. Finally, test_bad_init was changed to run on one core, so that parallel exception chaining does not change the exception type.
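The search described above can be sketched roughly as follows. This is not the actual implementation (which walks a Theano computation graph); the Node class and its name/parents/is_constant attributes are illustrative stand-ins for named graph variables, graph ancestors, and TensorConstant/SharedVariable checks:

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Optional

@dataclass(eq=False)
class Node:
    # Named nodes stand in for model variables; unnamed ones for
    # intermediate ops in the logpt graph.
    name: Optional[str] = None
    parents: list = field(default_factory=list)
    is_constant: bool = False  # stands in for TensorConstant / SharedVariable

def get_first_level_conditionals(root, transformed=None):
    """Breadth-first search that stops each branch at the first named
    node that is not the root, its transformed counterpart, or a
    constant/shared variable."""
    found = []
    queue = deque(root.parents)
    seen = set()
    while queue:
        node = queue.popleft()
        if id(node) in seen:
            continue
        seen.add(id(node))
        named = (node.name is not None and node is not root
                 and node is not transformed and not node.is_constant)
        if named:
            found.append(node)  # stop this branch: do not look past it
        else:
            queue.extend(node.parents)  # keep searching this branch
    return found

# Toy hierarchy: tau -> mu -> x and sigma -> (unnamed op) -> x.
# tau is two steps away from x, so it must not be returned.
tau = Node("tau")
mu = Node("mu", parents=[tau])
sigma = Node("sigma")
op = Node(parents=[sigma])  # unnamed intermediate node
x = Node("x", parents=[mu, op])
print([n.name for n in get_first_level_conditionals(x)])  # -> ['mu', 'sigma']
```

The Mixture failure mode follows directly from this sketch: if a mixture's logpt graph contains the logpt of its comp_dists, the named parents of those component graphs sit within one "named-node hop" of the root and get collected as if they were its own first-level conditionals.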