* Updated ADVI minibatch and API quickstart
* notebook: GLM robust (run with pymc v5) (#499)
* run with pymc v5
* add myst file
* notebook: glm hierarchical, run pymc v5 (#498)
* run pymc v5
* add myst file
* Updated ADVI minibatch and API quickstart
* Added headers and footers to quickstart notebook
* Update empirical approximation to v5
* Added entry to update history in empirical VI notebook
* Minor language tweak; remove gaussian mixture ADVI example
* Removed instances of pymc3
`examples/variational_inference/GLM-hierarchical-advi-minibatch.myst.md` (+34 −16)

````diff
@@ -5,7 +5,7 @@ jupytext:
     format_name: myst
     format_version: 0.13
 kernelspec:
-  display_name: Python 3
+  display_name: pie
   language: python
   name: python3
 ---
@@ -22,22 +22,22 @@ kernelspec:
 Unlike Gaussian mixture models, (hierarchical) regression models have independent variables. These variables affect the likelihood function, but are not random variables. When using mini-batch, we should take care of that.
````
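The caveat about independent variables is easier to see in code. Below is a minimal sketch of the minibatch-ADVI pattern in PyMC v5, not the notebook's actual model: the data, priors, batch size, and iteration count are all illustrative assumptions. The point is that the independent variable `X` is wrapped in `pm.Minibatch` together with the observations so that matching rows are drawn per batch, and `total_size` rescales the minibatch likelihood to the full dataset.

```python
# Sketch only (PyMC v5); data, priors, and sizes are illustrative assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
N = 1_000
X = rng.normal(size=N)
y = 2.0 * X + rng.normal(scale=0.5, size=N)

# Wrap the independent variable and the observations together so each
# minibatch draws matching rows from both arrays.
X_mb, y_mb = pm.Minibatch(X, y, batch_size=100)

with pm.Model():
    beta = pm.Normal("beta", 0, 10)
    sigma = pm.HalfNormal("sigma", 1)
    # total_size tells PyMC to rescale the minibatch log-likelihood
    # as if it came from the full N observations.
    pm.Normal("obs", mu=beta * X_mb, sigma=sigma, observed=y_mb, total_size=N)
    approx = pm.fit(10_000, method="advi")
```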
`examples/variational_inference/empirical-approx-overview.myst.md` (+38 −24)

````diff
@@ -5,32 +5,40 @@ jupytext:
     format_name: myst
     format_version: 0.13
 kernelspec:
-  display_name: Python PyMC3 (Dev)
+  display_name: pie
   language: python
-  name: pymc3-dev-py38
+  name: python3
 ---
 
+(empirical-approx-overview)=
+
 # Empirical Approximation overview
 
-For most models we use sampling MCMC algorithms like Metropolis or NUTS. In PyMC3 we got used to store traces of MCMC samples and then do analysis using them. There is a similar concept for the variational inference submodule in PyMC3: *Empirical*. This type of approximation stores particles for the SVGD sampler. There is no difference between independent SVGD particles and MCMC samples. *Empirical* acts as a bridge between MCMC sampling output and full-fledged VI utils like `apply_replacements` or `sample_node`. For the interface description, see [variational_api_quickstart](variational_api_quickstart.ipynb). Here we will just focus on `Emprical` and give an overview of specific things for the *Empirical* approximation
+For most models we use sampling MCMC algorithms like Metropolis or NUTS. In PyMC we got used to store traces of MCMC samples and then do analysis using them. There is a similar concept for the variational inference submodule in PyMC: *Empirical*. This type of approximation stores particles for the SVGD sampler. There is no difference between independent SVGD particles and MCMC samples. *Empirical* acts as a bridge between MCMC sampling output and full-fledged VI utils like `apply_replacements` or `sample_node`. For the interface description, see [variational_api_quickstart](variational_api_quickstart.ipynb). Here we will just focus on `Emprical` and give an overview of specific things for the *Empirical* approximation.
+
+:::{post} Jan 13, 2023
+:tags: variational inference, approximation
+:category: advaned, how-to
+:author: Maxim Kochurov, Raul Maldonado, Chris Fonnesbeck
+:::
 
 ```{code-cell} ipython3
 import arviz as az
 import matplotlib.pyplot as plt
 import numpy as np
-import pymc3 as pm
-import theano
+import pymc as pm
+import pytensor
+import seaborn as sns
 
 from pandas import DataFrame
 
-print(f"Running on PyMC3 v{pm.__version__}")
+print(f"Running on PyMC v{pm.__version__}")
 ```
 
 ```{code-cell} ipython3
 %config InlineBackend.figure_format = 'retina'
 az.style.use("arviz-darkgrid")
 np.random.seed(42)
-pm.set_tt_rng(42)
 ```
 
 ## Multimodal density
@@ -42,12 +50,14 @@ mu = pm.floatX([-0.3, 0.5])
 sd = pm.floatX([0.1, 0.1])
 
 with pm.Model() as model:
-    x = pm.NormalMixture("x", w=w, mu=mu, sigma=sd, dtype=theano.config.floatX)
````
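To make the end state of this diff concrete, here is a minimal sketch of the updated multimodal example in PyMC v5. The mixture weights `w`, the draw count, and the seed are assumptions (the diff truncates before showing them), and it assumes `pm.Empirical` accepts the `InferenceData` returned by `pm.sample`; the resulting `Empirical` object is the MCMC-to-VI bridge described in the notebook's intro paragraph above.

```python
# Sketch only (PyMC v5); `w`, the draw count, and the seed are assumptions.
import pymc as pm

w = pm.floatX([0.2, 0.8])  # assumed mixture weights (not shown in the diff)
mu = pm.floatX([-0.3, 0.5])
sd = pm.floatX([0.1, 0.1])

with pm.Model() as model:
    # The dtype=theano.config.floatX argument is gone; pytensor manages dtypes.
    x = pm.NormalMixture("x", w=w, mu=mu, sigma=sd)
    # Seeding moves from the removed global pm.set_tt_rng(42) to random_seed.
    idata = pm.sample(1_000, random_seed=42)
    # Empirical stores the posterior draws as particles, bridging MCMC output
    # into the variational API.
    approx = pm.Empirical(idata)
```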