Commit 1ead0a6

Use float dtype for numpyro (#418)

1 parent: 12d2294

1 file changed: +9, -18 lines

lectures/bayes_nonconj.md (9 additions, 18 deletions)
@@ -1,10 +1,10 @@
 ---
 jupytext:
   text_representation:
-    extension: .myst
+    extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.13.8
+    jupytext_version: 1.16.4
 kernelspec:
   display_name: Python 3 (ipykernel)
   language: python
@@ -43,7 +43,6 @@ The two Python modules are
 
 As usual, we begin by importing some Python code.
 
-
 ```{code-cell} ipython3
 :tags: [hide-output]
 
@@ -80,10 +79,8 @@ from numpyro.infer import SVI as nSVI
 from numpyro.infer import ELBO as nELBO
 from numpyro.infer import Trace_ELBO as nTrace_ELBO
 from numpyro.optim import Adam as nAdam
-
 ```
 
-
 ## Unleashing MCMC on a Binomial Likelihood
 
 This lecture begins with the binomial example in the {doc}`quantecon lecture <prob_meaning>`.
@@ -252,7 +249,6 @@ We will use the following priors:
 
 - The truncated Laplace can be created using `Numpyro`'s `TruncatedDistribution` class.
 
-
 ```{code-cell} ipython3
 # used by Numpyro
 def TruncatedLogNormal_trans(loc, scale):
@@ -560,19 +556,17 @@ class BayesianInference:
         Computes numerically the posterior distribution with beta prior parametrized by (alpha0, beta0)
         given data using MCMC
         """
-        # tensorize
-        data = torch.tensor(data)
-
         # use pyro
         if self.solver=='pyro':
+            # tensorize
+            data = torch.tensor(data)
             nuts_kernel = NUTS(self.model)
             mcmc = MCMC(nuts_kernel, num_samples=num_samples, warmup_steps=num_warmup, disable_progbar=True)
             mcmc.run(data)
 
         # use numpyro
         elif self.solver=='numpyro':
-
+            data = np.array(data, dtype=float)
             nuts_kernel = nNUTS(self.model)
             mcmc = nMCMC(nuts_kernel, num_samples=num_samples, num_warmup=num_warmup, progress_bar=False)
             mcmc.run(self.rng_key, data=data)
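The hunk above is the heart of the commit: instead of unconditionally building a torch tensor, each solver branch now converts the raw data to the container its backend expects, and the numpyro branch casts to a float NumPy array. A minimal sketch of that cast, using only NumPy (the surrounding `torch`/`numpyro` machinery is omitted; `data` here is a hypothetical list of Bernoulli draws, not data from the lecture):

```python
import numpy as np

# Hypothetical integer observations (Bernoulli draws), as in the lecture's
# binomial example.
data = [1, 0, 1, 1, 0]

# What the numpyro branch now does: cast to a float NumPy array before
# handing the data to the NUTS kernel.  Presumably this avoids JAX tracing
# the observations as integers, which the commit title suggests was the
# source of the bug being fixed.
data_np = np.array(data, dtype=float)

print(data_np.dtype)  # float64
```

Note that `dtype=float` maps to NumPy's `float64`; JAX itself may still downcast to 32-bit floats internally unless 64-bit mode is enabled.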
@@ -655,15 +649,15 @@ class BayesianInference:
         params : the learned parameters for guide
         losses : a vector of loss at each step
         """
-        # tensorize data
-        if not torch.is_tensor(data):
-            data = torch.tensor(data)
 
         # initiate SVI
         svi = self.SVI_init(guide_dist=guide_dist)
 
         # do gradient steps
         if self.solver=='pyro':
+            # tensorize data
+            if not torch.is_tensor(data):
+                data = torch.tensor(data)
             # store loss vector
             losses = np.zeros(n_steps)
             for step in range(n_steps):
@@ -676,6 +670,7 @@ class BayesianInference:
             }
 
         elif self.solver=='numpyro':
+            data = np.array(data, dtype=float)
             result = svi.run(self.rng_key, n_steps, data, progress_bar=False)
             params = dict(
                 (key, np.asarray(value)) for key, value in result.params.items()
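The SVI method gets the same treatment as the MCMC method: data conversion moves out of the shared preamble and into the per-solver branches, so the numpyro path never touches torch. The dispatch pattern can be sketched in isolation as follows (`prepare_data` is a hypothetical helper, not part of the lecture; the pyro branch's `torch.tensor` call is stood in for by a plain float list to keep the sketch dependency-free):

```python
import numpy as np

def prepare_data(data, solver):
    """Convert raw observations to the container each backend expects.

    Mirrors the commit's pattern: convert inside each solver branch
    rather than once up front, so each backend only sees its own dtype.
    """
    if solver == 'pyro':
        # Stand-in for `torch.tensor(data)` in the actual lecture code.
        return [float(x) for x in data]
    elif solver == 'numpyro':
        # The cast the commit adds for the JAX-based backend.
        return np.array(data, dtype=float)
    raise ValueError(f'unknown solver: {solver}')

print(prepare_data([1, 0, 1], 'numpyro').dtype)  # float64
print(prepare_data([1, 0, 1], 'pyro'))           # [1.0, 0.0, 1.0]
```

One design consequence worth noting: with the conversion inside the branches, adding a third backend later only requires touching its own branch.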
@@ -898,7 +893,6 @@ For the same Beta prior, we shall
 
 Let's start with the analytical method that we described in this quantecon lecture <https://python.quantecon.org/prob_meaning.html>
 
-
 ```{code-cell} ipython3
 # First examine Beta priors
 BETA_pyro = BayesianInference(param=(5,5), name_dist='beta', solver='pyro')
@@ -952,12 +946,10 @@ will be more accurate, as we shall see next.
 
 (Increasing the step size increases computational time though).
 
-
 ```{code-cell} ipython3
 BayesianInferencePlot(true_theta, num_list, BETA_numpyro).SVI_plot(guide_dist='beta', n_steps=100000)
 ```
 
-
 ## Non-conjugate Prior Distributions
 
 Having assured ourselves that our MCMC and VI methods can work well when we have conjugate prior and so can also compute analytically, we
@@ -1052,7 +1044,6 @@ SVI_num_steps = 50000
 example_CLASS = BayesianInference(param=(0,1), name_dist='uniform', solver='numpyro')
 print(f'=======INFO=======\nParameters: {example_CLASS.param}\nPrior Dist: {example_CLASS.name_dist}\nSolver: {example_CLASS.solver}')
 BayesianInferencePlot(true_theta, num_list, example_CLASS).SVI_plot(guide_dist='normal', n_steps=SVI_num_steps)
-
 ```
 
 ```{code-cell} ipython3

0 commit comments