
Commit 95be142

Rerun Latent GP with method="svd" for MvNormals
MvNormal was switched from SVD to Cholesky decomposition by default. This is brittle for many GPs, so PyMC > 5.22.0 will default Latent GP conditionals to `method="svd"`. The example notebook includes manually created `MvNormal`s that also need `method="svd"` to work. I also switched from numpyro back to the default sampler, because I couldn't install numpyro on my Windows machine. See pymc-devs/pymc#7754.
1 parent 8db7cc4 commit 95be142
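For context, a minimal standalone sketch of the pattern this commit applies (the amplitude and lengthscale values here are illustrative, not taken from the notebook): densely spaced GP inputs produce a covariance matrix that is numerically near-singular, so drawing from a manually constructed `MvNormal` is done with `method="svd"` instead of the default Cholesky factorization, which can fail on such matrices.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)

# Densely spaced inputs give a GP covariance matrix that is numerically
# near-singular, which can make the default Cholesky factorization fail.
X = np.linspace(0, 10, 100)[:, None]
cov_func = 2.0**2 * pm.gp.cov.ExpQuad(1, ls=1.0)  # illustrative values
mean_func = pm.gp.mean.Zero()

# method="svd" draws via an SVD factorization of the covariance,
# which tolerates the near-singular matrices that GPs often produce.
f_true = pm.draw(
    pm.MvNormal.dist(mu=mean_func(X), cov=cov_func(X), method="svd"),
    1,
    random_seed=rng,
)
```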

File tree

2 files changed: +129 -154 lines changed

examples/gaussian_processes/GP-Latent.ipynb (+120 -147)

Large diffs are not rendered by default.

examples/gaussian_processes/GP-Latent.myst.md (+9 -7)
@@ -5,9 +5,9 @@ jupytext:
     format_name: myst
     format_version: 0.13
 kernelspec:
-  display_name: pymc-examples
+  display_name: ptdev
   language: python
-  name: pymc-examples
+  name: python3
 myst:
   substitutions:
     extra_dependencies: jax numpyro
@@ -119,7 +119,9 @@ cov_func = eta_true**2 * pm.gp.cov.ExpQuad(1, ell_true)
 mean_func = pm.gp.mean.Zero()
 
 # The latent function values are one sample from a multivariate normal
-f_true = pm.draw(pm.MvNormal.dist(mu=mean_func(X), cov=cov_func(X)), 1, random_seed=rng)
+f_true = pm.draw(
+    pm.MvNormal.dist(mu=mean_func(X), cov=cov_func(X), method="svd"), 1, random_seed=rng
+)
 
 # The observed data is the latent function plus a small amount of T distributed noise
 # The standard deviation of the noise is `sigma`, and the degrees of freedom is `nu`
@@ -163,7 +165,7 @@ with pm.Model() as model:
     )  # add one because student t is undefined for degrees of freedom less than one
     y_ = pm.StudentT("y", mu=f, lam=1.0 / sigma, nu=nu, observed=y)
 
-    idata = pm.sample(nuts_sampler="numpyro")
+    idata = pm.sample()
 ```
 
 ```{code-cell} ipython3
@@ -313,7 +315,7 @@ K = cov_func(x[:, None]).eval()
 mean = np.zeros(n)
 
 # sample from the gp prior
-f_true = pm.draw(pm.MvNormal.dist(mu=mean, cov=K), 1, random_seed=rng)
+f_true = pm.draw(pm.MvNormal.dist(mu=mean, cov=K, method="svd"), 1, random_seed=rng)
 
 # Sample the GP through the likelihood
 y = pm.Bernoulli.dist(p=pm.math.invlogit(f_true)).eval()
@@ -354,7 +356,7 @@ with pm.Model() as model:
     p = pm.Deterministic("p", pm.math.invlogit(f))
     y_ = pm.Bernoulli("y", p=p, observed=y)
 
-    idata = pm.sample(1000, chains=2, cores=2, nuts_sampler="numpyro")
+    idata = pm.sample(1000, chains=2, cores=2)
 ```
 
 ```{code-cell} ipython3
@@ -442,7 +444,7 @@ plt.legend(loc=(0.32, 0.65), frameon=True);
 
 ```{code-cell} ipython3
 %load_ext watermark
-%watermark -n -u -v -iv -w -p pytensor,aeppl,xarray
+%watermark -n -u -v -iv -w -p pytensor,xarray
 ```
 
 :::{include} ../page_footer.md
