
Commit a473028

Merge branch 'BART' of https://github.com/aloctavodia/pymc3 into BART
2 parents: bb69a76 + 2050958

File tree

76 files changed: +7127 / -3122 lines

.pre-commit-config.yaml

Lines changed: 21 additions & 14 deletions
@@ -1,28 +1,35 @@
 repos:
 - repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: v3.2.0
+  rev: v3.3.0
   hooks:
     - id: end-of-file-fixer
     - id: check-toml
 - repo: https://github.com/nbQA-dev/nbQA
-  rev: 0.2.3
+  rev: 0.3.5
   hooks:
-    - id: nbqa
-      args: ['isort']
-      name: nbqa-isort
-      alias: nbqa-isort
-      additional_dependencies: ['isort']
-    - id: nbqa
-      args: ['pyupgrade']
-      name: nbqa-pyupgrade
-      alias: nbqa-pyupgrade
-      additional_dependencies: ['pyupgrade']
+    - id: nbqa-black
+    - id: nbqa-isort
+    - id: nbqa-pyupgrade
 - repo: https://github.com/asottile/pyupgrade
-  rev: v2.7.2
+  rev: v2.7.3
   hooks:
     - id: pyupgrade
-      args: ['--py36-plus']
+      args: [--py36-plus]
 - repo: https://github.com/psf/black
   rev: 20.8b1
   hooks:
     - id: black
+- repo: local
+  hooks:
+    - id: watermark
+      name: Check notebooks have watermark (see Jupyter style guide from PyMC3 Wiki)
+      types: [jupyter]
+      entry: '%load_ext watermark.*%watermark -n -u -v -iv -w'
+      language: pygrep
+      args: [--negate, --multiline]
+      minimum_pre_commit_version: 2.8.0
+    - id: check-toc
+      name: Check all notebooks appear in table of contents
+      types: [jupyter]
+      entry: python scripts/check_toc_is_complete.py
+      language: python
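
The new local watermark hook runs pygrep with --negate --multiline, so it flags any notebook that does not contain a cell matching the entry pattern. A minimal sketch of a final notebook cell that would satisfy that regex (the PyMC3 Jupyter style guide may ask for more, but these two lines are what the pattern checks for):

%load_ext watermark
%watermark -n -u -v -iv -w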

README.rst

Lines changed: 14 additions & 1 deletion
@@ -13,6 +13,19 @@ large suite of problems.
 Check out the `getting started guide <http://docs.pymc.io/notebooks/getting_started>`__, or
 `interact with live examples <https://mybinder.org/v2/gh/pymc-devs/pymc3/master?filepath=%2Fdocs%2Fsource%2Fnotebooks>`__
 using Binder!
+For questions on PyMC3, head on over to our `PyMC Discourse <https://discourse.pymc.io/>`__ forum.
+
+The future of PyMC3 & Theano
+============================
+
+There have been many questions and uncertainty around the future of PyMC3 since Theano
+stopped getting developed by the original authors, and we started experiments with PyMC4.
+
+We are happy to announce that PyMC3 on Theano (which we are `developing further <https://github.com/pymc-devs/Theano-PyMC>`__)
+with a new JAX backend is the future. PyMC4 will not be developed further.
+
+See the `full announcement <https://pymc-devs.medium.com/the-future-of-pymc3-or-theano-is-dead-long-live-theano-d8005f8a0e9b>`__
+for more details.
 
 Features
 ========
@@ -167,7 +180,7 @@ Contributors
 ============
 
 See the `GitHub contributor
-page <https://github.com/pymc-devs/pymc3/graphs/contributors>`__
+page <https://github.com/pymc-devs/pymc3/graphs/contributors>`__. Also read our `Code of Conduct <https://github.com/pymc-devs/pymc3/blob/master/CODE_OF_CONDUCT.md>`__ guidelines for a better contributing experience.
 
 Support
 =======

RELEASE-NOTES.md

Lines changed: 4 additions & 0 deletions
@@ -8,15 +8,19 @@
 - Fixed numerical instability in ExGaussian's logp by preventing `logpow` from returning `-inf` (see [#4050](https://github.com/pymc-devs/pymc3/pull/4050)).
 - Use dill to serialize user defined logp functions in `DensityDist`. The previous serialization code fails if it is used in notebooks on Windows and Mac. `dill` is now a required dependency. (see [#3844](https://github.com/pymc-devs/pymc3/issues/3844)).
 - Numerically improved stickbreaking transformation - e.g. for the `Dirichlet` distribution. [#4129](https://github.com/pymc-devs/pymc3/pull/4129)
+- Enabled the `Multinomial` distribution to handle batch sizes that have more than 2 dimensions. [#4169](https://github.com/pymc-devs/pymc3/pull/4169)
 
 ### Documentation
 
 ### New features
 - `sample_posterior_predictive_w` can now feed on `xarray.Dataset` - e.g. from `InferenceData.posterior`. (see [#4042](https://github.com/pymc-devs/pymc3/pull/4042))
+- Added `pymc3.gp.cov.Circular` kernel for Gaussian Processes on circular domains, e.g. the unit circle (see [#4082](https://github.com/pymc-devs/pymc3/pull/4082)).
 - Add MLDA, a new stepper for multilevel sampling. MLDA can be used when a hierarchy of approximate posteriors of varying accuracy is available, offering improved sampling efficiency especially in high-dimensional problems and/or where gradients are not available (see [#3926](https://github.com/pymc-devs/pymc3/pull/3926))
 - Change SMC metropolis kernel to independent metropolis kernel [#4115](https://github.com/pymc-devs/pymc3/pull/4115))
 - Add alternative parametrization to NegativeBinomial distribution in terms of n and p (see [#4126](https://github.com/pymc-devs/pymc3/issues/4126))
 - Add Bayesian Additive Regression Trees (BARTs) [#4183](https://github.com/pymc-devs/pymc3/pull/4183))
+- Added a new `MixtureSameFamily` distribution to handle mixtures of arbitrary dimensions in vectorized form (see [#4185](https://github.com/pymc-devs/pymc3/issues/4185)).
+
 
 
 ## PyMC3 3.9.3 (11 August 2020)
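
As context for the `pymc3.gp.cov.Circular` entry above, here is a minimal sketch of how the new kernel could be used on angular inputs; the `period` and `tau` argument names are taken from the pull-request summary and are an assumption, not verified against the released API:

import numpy as np
import pymc3 as pm

angles = np.linspace(0, 2 * np.pi, 50)[:, None]  # inputs living on the unit circle

with pm.Model() as model:
    # assumed signature: Circular(input_dim, period, tau=...)
    cov = pm.gp.cov.Circular(1, period=2 * np.pi, tau=4)
    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=angles)  # GP prior over the circular domain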

docs/source/api/distributions/mixture.rst

Lines changed: 1 addition & 0 deletions
@@ -6,6 +6,7 @@ Mixture
 .. autosummary::
    Mixture
    NormalMixture
+   MixtureSameFamily
 
 .. automodule:: pymc3.distributions.mixture
    :members:
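
A hedged sketch of what the newly listed `MixtureSameFamily` might look like in use, assuming it mirrors `Mixture`'s `w`/`comp_dists` interface plus a `mixture_axis` argument selecting the component axis (names inferred from the release note and PR discussion, not verified):

import numpy as np
import pymc3 as pm

data = np.random.randn(200, 2)  # illustrative 2-dimensional observations

with pm.Model():
    w = pm.Dirichlet("w", a=np.ones(3))                     # weights over 3 components
    mu = pm.Normal("mu", mu=0.0, sigma=5.0, shape=(3, 2))   # one mean vector per component
    comp_dists = pm.Normal.dist(mu=mu, sigma=1.0, shape=(3, 2))
    # mixture_axis=-2 (assumed keyword) mixes over the 3-component axis, leaving a 2-dimensional event
    obs = pm.MixtureSameFamily(
        "obs", w=w, comp_dists=comp_dists, mixture_axis=-2, observed=data
    )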

docs/source/learn.rst

Lines changed: 2 additions & 2 deletions
@@ -99,7 +99,7 @@
 
   <div class="card">
     <div class="image">
-      <img src="https://camo.githubusercontent.com/dbb5f926b3bfe7e2196044462ce2411ee3a001ee/68747470733a2f2f39623865303033322d612d36326362336131612d732d73697465732e676f6f676c6567726f7570732e636f6d2f736974652f646f696e67626179657369616e64617461616e616c797369732f776861742d732d6e65772d696e2d326e642d65642f436f7665724442444132452d46726f6e744f6e6c792d363030776964652e706e67">
+      <img src="https://lh5.googleusercontent.com/R1T8ZXbMi4vSO0JlnLQMkEQNvd2ncBb23OmHOsmw-t_oEOF6jJlfWuJoOK0MMmECSDymhUdfTS2yoMgkR2TY-xIiBTHCpeuYjzXqD3xhZ-MuIhs2ARcJ=w1280">
     </div>
     <div class="content">
       <div class="header">Doing Bayesian Data Analysis</div>
@@ -112,7 +112,7 @@
       <tbody>
         <tr>
           <td>
-            <a href="http://doingbayesiandataanalysis.blogspot.com/">
+            <a href="https://sites.google.com/site/doingbayesiandataanalysis/home">
               <i class="linkify icon"></i> Book website
             </a>
           </td>

docs/source/notebooks/GLM-hierarchical.ipynb

Lines changed: 1 addition & 1 deletion
@@ -36,7 +36,7 @@
     "Gelman et al.'s (2007) radon dataset is a classic for hierarchical modeling. In this dataset the amount of the radioactive gas radon has been measured among different households in all counties of several states. Radon gas is known to be the highest cause of lung cancer in non-smokers. It is believed to be more strongly present in households containing a basement and to differ in amount present among types of soil.\n",
     "Here we'll investigate this differences and try to make predictions of radonlevels in different counties based on the county itself and the presence of a basement. In this example we'll look at Minnesota, a state that contains 85 counties in which different measurements are taken, ranging from 2 to 116 measurements per county. \n",
     "\n",
-    "![radon](http://www.fix-your-radon.com/images/how_radon_enters.jpg)"
+    "![radon](https://upload.wikimedia.org/wikipedia/commons/b/b9/CNX_Chem_21_06_RadonExpos.png)"
    ]
   },
   {

docs/source/notebooks/GLM-negative-binomial-regression.ipynb

Lines changed: 2 additions & 6 deletions
@@ -383,9 +383,7 @@
    }
   ],
   "source": [
-    "g = sns.catplot(\n",
-    "    x=\"nsneeze\", row=\"nomeds\", col=\"alcohol\", data=df, kind=\"count\", aspect=1.5\n",
-    ")\n",
+    "g = sns.catplot(x=\"nsneeze\", row=\"nomeds\", col=\"alcohol\", data=df, kind=\"count\", aspect=1.5)\n",
     "\n",
     "# Make x-axis ticklabels less crowded\n",
     "ax = g.axes[1, 0]\n",
@@ -464,9 +462,7 @@
     "fml = \"nsneeze ~ alcohol + nomeds + alcohol:nomeds\"\n",
     "\n",
     "with pm.Model() as model:\n",
-    "    pm.glm.GLM.from_formula(\n",
-    "        formula=fml, data=df, family=pm.glm.families.NegativeBinomial()\n",
-    "    )\n",
+    "    pm.glm.GLM.from_formula(formula=fml, data=df, family=pm.glm.families.NegativeBinomial())\n",
     "    trace = pm.sample(1000, tune=1000, cores=2, return_inferencedata=True)"
    ]
   },

docs/source/notebooks/GP-Circular.ipynb

Lines changed: 650 additions & 0 deletions
Large diffs are not rendered by default.

docs/source/notebooks/GP-MaunaLoa2.ipynb

Lines changed: 10 additions & 26 deletions
@@ -68,7 +68,7 @@
     "%config InlineBackend.figure_format = 'retina'\n",
     "RANDOM_SEED = 8927\n",
     "np.random.seed(RANDOM_SEED)\n",
-    "az.style.use('arviz-darkgrid')"
+    "az.style.use(\"arviz-darkgrid\")"
    ]
   },
   {
@@ -1310,9 +1310,7 @@
     "    ηq = pm.HalfNormal(\"ηq\", sigma=5)\n",
     "\n",
     "    cov1 = η ** 2 * pm.gp.cov.ExpQuad(1, ℓ)\n",
-    "    cov2 = η ** 2 * pm.gp.cov.ExpQuad(1, ℓ) + ηq ** 2 * pm.gp.cov.Polynomial(\n",
-    "        1, x0, 2, c\n",
-    "    )\n",
+    "    cov2 = η ** 2 * pm.gp.cov.ExpQuad(1, ℓ) + ηq ** 2 * pm.gp.cov.Polynomial(1, x0, 2, c)\n",
     "\n",
     "    # construct changepoint cov\n",
     "    sc_cov1 = pm.gp.cov.ScaledCov(1, cov1, logistic, (-a, x0))\n",
@@ -1799,10 +1797,10 @@
   "outputs": [],
   "source": [
     "class CustomWhiteNoise(pm.gp.cov.Covariance):\n",
-    "    \"\"\" Custom White Noise covariance\n",
+    "    \"\"\"Custom White Noise covariance\n",
     "    - sigma1 is applied to the first n1 points in the data\n",
     "    - sigma2 is applied to the next n2 points in the data\n",
-    "    \n",
+    "\n",
     "    The total number of data points n = n1 + n2\n",
     "    \"\"\"\n",
     "\n",
@@ -1922,11 +1920,7 @@
     "    ℓp_decay = pm.Gamma(\"ℓp_decay\", alpha=40, beta=0.1)\n",
     "    ℓp_smooth = pm.Normal(\"ℓp_smooth \", mu=1.0, sigma=0.05)\n",
     "    period = 1 * 0.01  # we know the period is annual\n",
-    "    cov_p = (\n",
-    "        ηp ** 2\n",
-    "        * pm.gp.cov.Periodic(1, period, ℓp_smooth)\n",
-    "        * pm.gp.cov.ExpQuad(1, ℓp_decay)\n",
-    "    )\n",
+    "    cov_p = ηp ** 2 * pm.gp.cov.Periodic(1, period, ℓp_smooth) * pm.gp.cov.ExpQuad(1, ℓp_decay)\n",
     "    gp_p = pm.gp.Marginal(cov_func=cov_p)\n",
     "\n",
     "    gp = gp_c + gp_m + gp_s + gp_p\n",
@@ -1943,9 +1937,7 @@
     "    σ2 = pm.Gamma(\"σ2\", alpha=3, beta=50)\n",
     "    η_noise = pm.HalfNormal(\"η_noise\", sigma=1)\n",
     "    ℓ_noise = pm.Gamma(\"ℓ_noise\", alpha=2, beta=200)\n",
-    "    cov_noise = η_noise ** 2 * pm.gp.cov.Matern32(1, ℓ_noise) + CustomWhiteNoise(\n",
-    "        σ1, σ2, 111, 545\n",
-    "    )\n",
+    "    cov_noise = η_noise ** 2 * pm.gp.cov.Matern32(1, ℓ_noise) + CustomWhiteNoise(σ1, σ2, 111, 545)\n",
     "\n",
     "    y_ = gp.marginal_likelihood(\"y\", X=t_combined[:, None], y=y_n, noise=cov_noise)"
    ]
@@ -2141,9 +2133,7 @@
   ],
   "source": [
     "plt.figure(figsize=(12, 5))\n",
-    "plt.plot(\n",
-    "    tnew * 100, y_sd * ppc[\"fnew\"][0:200:5, :].T + y_mu, color=\"lightblue\", alpha=0.8\n",
-    ")\n",
+    "plt.plot(tnew * 100, y_sd * ppc[\"fnew\"][0:200:5, :].T + y_mu, color=\"lightblue\", alpha=0.8)\n",
     "plt.plot(\n",
     "    [-1000, -1001],\n",
     "    [-1000, -1001],\n",
@@ -2198,9 +2188,7 @@
   ],
   "source": [
     "plt.figure(figsize=(12, 5))\n",
-    "plt.plot(\n",
-    "    tnew * 100, y_sd * ppc[\"fnew\"][0:200:5, :].T + y_mu, color=\"lightblue\", alpha=0.8\n",
-    ")\n",
+    "plt.plot(tnew * 100, y_sd * ppc[\"fnew\"][0:200:5, :].T + y_mu, color=\"lightblue\", alpha=0.8)\n",
     "plt.plot(\n",
     "    [-1000, -1001],\n",
     "    [-1000, -1001],\n",
@@ -2325,9 +2313,7 @@
   "source": [
     "plt.figure(figsize=(12, 5))\n",
     "\n",
-    "plt.plot(\n",
-    "    tnew * 100, y_sd * ppc[\"fnew2\"][0:200:10, :].T + y_mu, color=\"lightblue\", alpha=0.8\n",
-    ")\n",
+    "plt.plot(tnew * 100, y_sd * ppc[\"fnew2\"][0:200:10, :].T + y_mu, color=\"lightblue\", alpha=0.8)\n",
     "plt.plot(\n",
     "    [-1000, -1001],\n",
     "    [-1000, -1001],\n",
@@ -2452,9 +2438,7 @@
   "source": [
     "plt.figure(figsize=(12, 5))\n",
     "\n",
-    "plt.plot(\n",
-    "    tnew * 100, y_sd * ppc[\"fnew3\"][0:200:10, :].T + y_mu, color=\"lightblue\", alpha=0.8\n",
-    ")\n",
+    "plt.plot(tnew * 100, y_sd * ppc[\"fnew3\"][0:200:10, :].T + y_mu, color=\"lightblue\", alpha=0.8)\n",
     "plt.plot(\n",
     "    [-1000, -1001],\n",
     "    [-1000, -1001],\n",

docs/source/notebooks/GP-MeansAndCovs.ipynb

Lines changed: 359 additions & 207 deletions
Large diffs are not rendered by default.

docs/source/notebooks/MLDA_gravity_surveying.ipynb

Lines changed: 540 additions & 817 deletions
Large diffs are not rendered by default.
