@@ -105,7 +105,7 @@ elementary. As long as you have a well-behaved density function, we can
use it in the model to build the model log-likelihood function. Random
number generation is great to have, but sometimes there might not be an
efficient random number generator for some densities. Since a function
- is all you need, you can wrap almost any thenao function into a
+ is all you need, you can wrap almost any Theano function into a
distribution using ``pm.DensityDist``
https://docs.pymc.io/Probability_Distributions.html#custom-distributions
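
For intuition, a minimal sketch of this pattern (the model and data below are made up for illustration; any callable returning a Theano log-density tensor can be handed to ``pm.DensityDist``)::

    import numpy as np
    import pymc3 as pm
    import theano.tensor as tt

    data = np.random.randn(100)  # fake observations, illustration only

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sd=10.0)

        # Any Theano expression for the log-density can act as a distribution.
        def logp(value):
            return tt.sum(-0.5 * (value - mu) ** 2 - 0.5 * np.log(2 * np.pi))

        obs = pm.DensityDist("obs", logp, observed=data)
        trace = pm.sample(1000)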
@@ -919,7 +919,7 @@ kernels in TFP do not flatten the tensors, see eg docstring of
Dynamic HMC
^^^^^^^^^^^
- We love NUTS, or to be more precise Dynamic HMC with complex stoping
+ We love NUTS, or to be more precise Dynamic HMC with complex stopping
rules. This part is actually all done outside of Theano, for NUTS, it
includes: the leapfrog, dual averaging, tuning of mass matrix and step
size, the tree building, sampler related statistics like divergence and
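
As a rough illustration of one item on this list, a toy leapfrog step in plain NumPy (an assumed sketch with an identity mass matrix, not PyMC3's actual implementation) could look like::

    import numpy as np

    def leapfrog(q, p, grad_logp, step_size, n_steps):
        """Integrate Hamiltonian dynamics for ``n_steps`` leapfrog steps.

        q, p      : position / momentum arrays
        grad_logp : callable returning d log p(q) / dq
        """
        p = p + 0.5 * step_size * grad_logp(q)   # initial half step for momentum
        for _ in range(n_steps - 1):
            q = q + step_size * p                # full step for position
            p = p + step_size * grad_logp(q)     # full step for momentum
        q = q + step_size * p                    # last full step for position
        p = p + 0.5 * step_size * grad_logp(q)   # final half step for momentum
        return q, p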
@@ -976,7 +976,7 @@ Multivariate Gaussian. In another word, we are approximating each elements in
super(ADVI, self).__init__(MeanField(*args, **kwargs))
# ==> In the super class KLqp
super(KLqp, self).__init__(KL, MeanField(*args, **kwargs), None, beta=beta)
- # ==> In the super class Inferece
+ # ==> In the super class Inference
...
self.objective = KL(MeanField(*args, **kwargs))(None)
...