
Update Jupyter style in lasso notebook #279

Merged 4 commits on Mar 4, 2022
320 changes: 175 additions & 145 deletions examples/pymc3_howto/lasso_block_update.ipynb


103 changes: 74 additions & 29 deletions myst_nbs/pymc3_howto/lasso_block_update.myst.md
@@ -6,74 +6,119 @@ jupytext:
format_version: 0.13
jupytext_version: 1.13.7
kernelspec:
display_name: Python 3 (ipykernel)
language: python
name: python3
---

(lasso_block_update)=
# Lasso regression with block updating

:::{post} Feb 10, 2022
:tags: pymc3.Exponential, pymc3.Laplace, pymc3.Metropolis, pymc3.Model, pymc3.Normal, pymc3.Slice, pymc3.Uniform, regression
:category: beginner
:author: Chris Fonnesbeck, Raul Maldonado, Michael Osthege, Thomas Wiecki, Lorenzo Toniazzi
:::

```{code-cell} ipython3
:tags: []

%matplotlib inline
import arviz as az
import matplotlib.pyplot as plt
import numpy as np
import pymc as pm

print(f"Running on PyMC v{pm.__version__}")
```

```{code-cell} ipython3
RANDOM_SEED = 8927
rng = np.random.default_rng(RANDOM_SEED)
az.style.use("arviz-darkgrid")
```

Sometimes it is useful to update a set of parameters together. For example, highly correlated variables are often best updated jointly. In PyMC, block updating is simple; this will be demonstrated using the `step` parameter of {class}`pymc.sample`.

Here we have a [LASSO regression model](https://en.wikipedia.org/wiki/Lasso_(statistics)#Bayesian_interpretation) where the two coefficients are strongly correlated. Normally, we would define the coefficient parameters as a single random variable, but here we define them separately to show how to do block updates.

First we generate some fake data.

```{code-cell} ipython3
x = rng.standard_normal(size=(3, 30))
x1 = x[0] + 4
x2 = x[1] + 4
noise = x[2]
y_obs = x1 * 0.2 + x2 * 0.3 + noise
```
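As a quick aside (this check is not part of the notebook), it is worth seeing *why* the two coefficients end up strongly correlated: `x1` and `x2` are independent draws, but both have mean 4, so as columns of the design matrix they point in nearly the same direction. A minimal sketch, reusing the same seed and data-generating code:

```python
import numpy as np

rng = np.random.default_rng(8927)
x = rng.standard_normal(size=(3, 30))
x1 = x[0] + 4
x2 = x[1] + 4
y_obs = x1 * 0.2 + x2 * 0.3 + x[2]

# Uncentered cosine similarity between the two predictor columns: the
# shared mean of 4 dominates, so the columns are nearly collinear, which
# is what induces the strong posterior correlation between the coefficients.
cos = x1 @ x2 / np.sqrt((x1 @ x1) * (x2 @ x2))
print(f"cosine similarity of predictor columns: {cos:.2f}")

# Ordinary least-squares point estimates (no intercept, matching the model
# below); collinearity inflates their sampling variance.
beta_hat, *_ = np.linalg.lstsq(np.column_stack([x1, x2]), y_obs, rcond=None)
print(f"OLS estimates: {beta_hat}")
```

With collinear predictors like these, many pairs of coefficient values fit the data almost equally well, and that trade-off is exactly what a blocked sampler exploits.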

Then we define the random variables.

```{code-cell} ipython3
:tags: []

lam = 3000

with pm.Model() as model:
    sigma = pm.Exponential("sigma", 1)
    tau = pm.Uniform("tau", 0, 1)
    b = lam * tau
    beta1 = pm.Laplace("beta1", 0, b)
    beta2 = pm.Laplace("beta2", 0, b)

    mu = x1 * beta1 + x2 * beta2

    y = pm.Normal("y", mu=mu, sigma=sigma, observed=y_obs)
```
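The Laplace priors are what make this a Bayesian lasso: the Laplace log-density falls off linearly in `|beta|`, i.e. an L1 penalty, whereas a normal prior penalizes quadratically. A small sketch of the difference (using `scipy.stats`, which is not imported in the notebook itself):

```python
import numpy as np
from scipy import stats

beta_grid = np.array([0.0, 1.0, 2.0, 3.0])
lap = stats.laplace.logpdf(beta_grid, loc=0, scale=1)
norm = stats.norm.logpdf(beta_grid, loc=0, scale=1)

# The Laplace log-density drops by a constant 1/scale per unit of |beta|
# (an L1 penalty), while the normal log-density drops quadratically.
print(np.diff(lap))   # constant steps of -1
print(np.diff(norm))  # steps of -0.5, -1.5, -2.5
```

That constant per-unit penalty, together with the sharp peak at zero, is what pushes small coefficients toward zero and gives lasso-style shrinkage.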

For most samplers, including {class}`pymc.Metropolis` and {class}`pymc.HamiltonianMC`, simply pass a list of variables to sample as a block. This works with both scalar and array parameters.

```{code-cell} ipython3
with model:
    step1 = pm.Metropolis([beta1, beta2])

    step2 = pm.Slice([sigma, tau])

    idata = pm.sample(draws=10000, step=[step1, step2])
```
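To make the idea of a "block" concrete, here is a minimal NumPy sketch of what a blocked random-walk Metropolis step does: it proposes a move for all variables in the block at once and accepts or rejects them jointly. This is an illustration only, not PyMC's actual implementation; the target, covariance numbers, and function names are invented for the example.

```python
import numpy as np


def blocked_metropolis(logp, x0, n_steps, scale, rng):
    """Random-walk Metropolis that proposes and accepts a whole block at once."""
    x = np.asarray(x0, dtype=float)
    lp = logp(x)
    samples = np.empty((n_steps, x.size))
    n_accept = 0
    for i in range(n_steps):
        proposal = x + scale * rng.standard_normal(x.size)  # joint proposal
        lp_prop = logp(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject the block jointly
            x, lp = proposal, lp_prop
            n_accept += 1
        samples[i] = x
    return samples, n_accept / n_steps


# Toy target: a strongly correlated 2D Gaussian standing in for the
# beta1/beta2 posterior (the covariance numbers are illustrative only).
cov = np.array([[1.0, 0.95], [0.95, 1.0]])
prec = np.linalg.inv(cov)
rng = np.random.default_rng(0)
samples, rate = blocked_metropolis(
    lambda z: -0.5 * z @ prec @ z, np.zeros(2), 5000, 0.5, rng
)
print(f"acceptance rate: {rate:.2f}")
print(f"sample correlation: {np.corrcoef(samples.T)[0, 1]:.2f}")
```

Because the joint proposal can move along the correlated ridge of the target, the block update mixes far better than updating each coordinate on its own would.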

We conclude by plotting the sampled marginals and the joint distribution of `beta1` and `beta2`.

```{code-cell} ipython3
:tags: []

az.plot_trace(idata);
```

```{code-cell} ipython3
az.plot_pair(
    idata,
    var_names=["beta1", "beta2"],
    kind="hexbin",
    marginals=True,
    figsize=(10, 10),
    gridsize=50,
)
```
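The hexbin makes the coefficient correlation visible; it can also be quantified with a one-liner. The sketch below uses stand-in draws (a hypothetical multivariate normal with a strong negative correlation, as nearly collinear predictors typically induce), since the real numbers come from the `idata` object sampled above, where you would instead ravel `idata.posterior["beta1"]` and `idata.posterior["beta2"]`:

```python
import numpy as np

# Hypothetical stand-in for the posterior draws of beta1 and beta2;
# the mean and covariance values here are illustrative only.
rng = np.random.default_rng(1)
draws = rng.multivariate_normal(
    mean=[0.2, 0.3],
    cov=[[0.01, -0.009], [-0.009, 0.01]],
    size=1000,
)
corr = np.corrcoef(draws[:, 0], draws[:, 1])[0, 1]
print(f"correlation between the coefficients: {corr:.2f}")
```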

## Authors

* Authored by [Chris Fonnesbeck](https://github.com/fonnesbeck) in Dec, 2020
* Updated by [Raul Maldonado](https://github.com/CloudChaoszero) in Jan, 2021
* Updated by Raul Maldonado in Mar, 2021
* Reexecuted by [Thomas Wiecki](https://github.com/twiecki) and [Michael Osthege](https://github.com/michaelosthege) with PyMC v4 in Jan, 2022 ([pymc-examples#264](https://github.com/pymc-devs/pymc-examples/pull/264))
* Updated by [Lorenzo Toniazzi](https://github.com/ltoniazzi) in Feb, 2022 ([pymc-examples#279](https://github.com/pymc-devs/pymc-examples/pull/279))

+++

## Watermark

```{code-cell} ipython3
:tags: []

%load_ext watermark
%watermark -n -u -v -iv -w -p aesara,aeppl,xarray
```

:::{include} ../page_footer.md
:::