
Commit 648617d

Add GitHubAction to auto-update hooks, and other cleanups (#4314)
* ⬆️ update pre-commit hooks
* try restarting (flaky test?)
1 parent 6a299a3 · commit 648617d

16 files changed (+129 −85 lines)

.codecov.yml (+2 −2)

@@ -13,13 +13,13 @@ coverage:
         # basic
         target: auto
         threshold: 1%
-        base: auto
+        base: auto
     patch:
       default:
         # basic
         target: 50%
         threshold: 1%
-        base: auto
+        base: auto

 comment:
   layout: "reach, diff, flags, files"

.github/ISSUE_TEMPLATE.md (+1 −1)

@@ -1,4 +1,4 @@
-If you have questions about a specific use case, or you are not sure whether this is a bug or not, please post it to our discourse channel: https://discourse.pymc.io
+If you have questions about a specific use case, or you are not sure whether this is a bug or not, please post it to our discourse channel: https://discourse.pymc.io

 ## Description of your problem

.github/workflows/ (new GitHub Actions workflow file, +33)

@@ -0,0 +1,33 @@
+name: "Update pre-commit config"
+
+on:
+  schedule:
+    - cron: "0 7 * * 1"  # At 07:00 on each Monday.
+  workflow_dispatch:
+
+jobs:
+  update-pre-commit:
+    if: github.repository_owner == 'pymc-devs'
+    name: Autoupdate pre-commit config
+    runs-on: ubuntu-latest
+    steps:
+      - name: Set up Python
+        uses: actions/setup-python@v2
+      - name: Cache multiple paths
+        uses: actions/cache@v2
+        with:
+          path: |
+            ~/.cache/pre-commit
+            ~/.cache/pip
+          key: pre-commit-autoupdate-${{ runner.os }}-build
+      - name: Update pre-commit config packages
+        uses: technote-space/create-pr-action@v2
+        with:
+          GITHUB_TOKEN: ${{ secrets.ACTION_TRIGGER_TOKEN }}
+          EXECUTE_COMMANDS: |
+            pip install pre-commit
+            pre-commit autoupdate || (exit 0);
+            pre-commit run -a || (exit 0);
+          COMMIT_MESSAGE: "⬆️ UPGRADE: Autoupdate pre-commit config"
+          PR_BRANCH_NAME: "pre-commit-config-update-${PR_ID}"
+          PR_TITLE: "⬆️ UPGRADE: Autoupdate pre-commit config"

.github/workflows/pre-commit.yml (+2)

@@ -8,6 +8,8 @@ on:
 jobs:
   pre-commit:
     runs-on: ubuntu-latest
+    env:
+      SKIP: no-commit-to-branch
     steps:
       - uses: actions/checkout@v2
       - uses: actions/setup-python@v2

.github/workflows/pytest.yml (+2)

@@ -29,6 +29,7 @@ jobs:
             --ignore=pymc3/tests/test_shape_handling.py
             --ignore=pymc3/tests/test_shared.py
             --ignore=pymc3/tests/test_smc.py
+            --ignore=pymc3/tests/test_step.py
             --ignore=pymc3/tests/test_updates.py
             --ignore=pymc3/tests/test_variational_inference.py
           - |
@@ -47,6 +48,7 @@ jobs:
           - |
             pymc3/tests/test_distributions_timeseries.py
             pymc3/tests/test_shape_handling.py
+            pymc3/tests/test_step.py
             pymc3/tests/test_updates.py
             pymc3/tests/test_variational_inference.py
           - |

.pre-commit-config.yaml (+11 −4)

@@ -1,24 +1,31 @@
+exclude: ^(docs/logos|pymc3/examples/data)/
 repos:
 - repo: https://github.com/pre-commit/pre-commit-hooks
   rev: v3.3.0
   hooks:
-  - id: end-of-file-fixer
+  - id: check-merge-conflict
   - id: check-toml
+  - id: check-yaml
+  - id: debug-statements
+  - id: end-of-file-fixer
+  - id: no-commit-to-branch
+    args: [--branch, master]
+  - id: requirements-txt-fixer
+  - id: trailing-whitespace
 - repo: https://github.com/nbQA-dev/nbQA
-  rev: 0.4.1
+  rev: 0.5.4
   hooks:
   - id: nbqa-black
     additional_dependencies: [black==20.8b1]
   - id: nbqa-isort
     additional_dependencies: [isort==5.6.4]
   - id: nbqa-pyupgrade
     additional_dependencies: [pyupgrade==2.7.4]
-
 - repo: https://github.com/PyCQA/isort
   rev: 5.6.4
   hooks:
   - id: isort
-    name: isort (python)
+    name: isort
 - repo: https://github.com/asottile/pyupgrade
   rev: v2.7.4
   hooks:

GOVERNANCE.md (+2 −2)

@@ -13,7 +13,7 @@ developed openly and hosted in public GitHub repositories under the
 [GitHub organization](https://github.com/pymc-devs/pymc3). Examples of
 Project Software include the PyMC3 code and the Documentation, etc. The Services run by the
 Project consist of public websites and web-services that are hosted
-at [http://pymc-devs.github.io/pymc3/](http://pymc-devs.github.io/pymc3/)
+at [http://pymc-devs.github.io/pymc3/](http://pymc-devs.github.io/pymc3/)
 The Project is developed by a team of distributed developers, called
 Contributors. Contributors are individuals who have contributed code,
 documentation, designs or other work to one or more Project repositories.
@@ -131,7 +131,7 @@ The current Steering Council membership comprises:
 - Junpeng Lao
 - Osvaldo Martin
 - Austin Rochford
-- Adrian Seyboldt
+- Adrian Seyboldt
 - Thomas Wiecki

 ### Council membership

README.rst (+4 −4)

@@ -21,11 +21,11 @@ The future of PyMC3 & Theano
 There have been many questions and uncertainty around the future of PyMC3 since Theano
 stopped getting developed by the original authors, and we started experiments with PyMC4.

-We are happy to announce that PyMC3 on Theano (which we are `developing further <https://github.com/pymc-devs/Theano-PyMC>`__)
-with a new JAX backend is the future. PyMC4 will not be developed further.
+We are happy to announce that PyMC3 on Theano (which we are `developing further <https://github.com/pymc-devs/Theano-PyMC>`__)
+with a new JAX backend is the future. PyMC4 will not be developed further.

 See the `full announcement <https://pymc-devs.medium.com/the-future-of-pymc3-or-theano-is-dead-long-live-theano-d8005f8a0e9b>`__
-for more details.
+for more details.

 Features
 ========
@@ -119,7 +119,7 @@ Another option is to clone the repository and install PyMC3 using
 Dependencies
 ============

-PyMC3 is tested on Python 3.6, 3.7, and 3.8 and depends on `Theano-PyMC <https://github.com/pymc-devs/Theano-PyMC>`__,
+PyMC3 is tested on Python 3.6, 3.7, and 3.8 and depends on `Theano-PyMC <https://github.com/pymc-devs/Theano-PyMC>`__,
 NumPy, SciPy, and Pandas
 (see `requirements.txt <https://github.com/pymc-devs/pymc3/blob/master/requirements.txt>`__ for version
 information).

binder/requirements.txt (+1 −1)

@@ -1,4 +1,4 @@
 -r ../requirements-dev.txt
-# this installs pymc3 itself. it is funny that this is an absolute path,
+# this installs pymc3 itself. it is funny that this is an absolute path,
 # but reqirements-dev.txt is relative.
 .

docs/source/Gaussian_Processes.rst (+36 −36; the changed lines differ only in trailing whitespace)

@@ -21,17 +21,17 @@ choice as priors over functions due to the marginalization and conditioning
 properties of the multivariate normal distribution. Usually, the marginal
 distribution over :math:`f(x)` is evaluated during the inference step. The
 conditional distribution is then used for predicting the function values
-:math:`f(x_*)` at new points, :math:`x_*`.
+:math:`f(x_*)` at new points, :math:`x_*`.

 The joint distribution of :math:`f(x)` and :math:`f(x_*)` is multivariate
 normal,

 .. math::

    \begin{bmatrix} f(x) \\ f(x_*) \\ \end{bmatrix} \sim
-   \text{N}\left(
+   \text{N}\left(
      \begin{bmatrix} m(x) \\ m(x_*) \\ \end{bmatrix} \,,
-     \begin{bmatrix} k(x,x') & k(x_*, x) \\
+     \begin{bmatrix} k(x,x') & k(x_*, x) \\
                      k(x_*, x) & k(x_*, x_*') \\ \end{bmatrix}
    \right) \,.

@@ -41,21 +41,21 @@ distribution is

 .. math::

-   f(x_*) \mid f(x) \sim \text{N}\left( k(x_*, x) k(x, x)^{-1} [f(x) - m(x)] + m(x_*) ,\,
+   f(x_*) \mid f(x) \sim \text{N}\left( k(x_*, x) k(x, x)^{-1} [f(x) - m(x)] + m(x_*) ,\,
      k(x_*, x_*) - k(x, x_*) k(x, x)^{-1} k(x, x_*) \right) \,.

 .. note::

    For more information on GPs, check out the book `Gaussian Processes for
    Machine Learning <http://www.gaussianprocess.org/gpml/>`_ by Rasmussen &
-   Williams, or `this introduction <https://www.ics.uci.edu/~welling/teaching/KernelsICS273B/gpB.pdf>`_
+   Williams, or `this introduction <https://www.ics.uci.edu/~welling/teaching/KernelsICS273B/gpB.pdf>`_
    by D. Mackay.

 PyMC3 is a great environment for working with fully Bayesian Gaussian Process
-models. GPs in PyMC3 have a clear syntax and are highly composable, and many
-predefined covariance functions (or kernels), mean functions, and several GP
+models. GPs in PyMC3 have a clear syntax and are highly composable, and many
+predefined covariance functions (or kernels), mean functions, and several GP
 implementations are included. GPs are treated as distributions that can be
-used within larger or hierarchical models, not just as standalone regression
+used within larger or hierarchical models, not just as standalone regression
 models.

 Mean and covariance functions
@@ -83,7 +83,7 @@ specify :code:`input_dim`, the total number of columns of :code:`X`, and
 :code:`active_dims`, which of those columns or dimensions the covariance
 function will act on, is because :code:`cov_func` hasn't actually seen the
 input data yet. The :code:`active_dims` argument is optional, and defaults to
-all columns of the matrix of inputs.
+all columns of the matrix of inputs.

 Covariance functions in PyMC3 closely follow the algebraic rules for kernels,
 which allows users to combine covariance functions into new ones, for example:
@@ -97,13 +97,13 @@ which allows users to combine covariance functions into new ones, for example:


     cov_func = pm.gp.cov.ExpQuad(...) * pm.gp.cov.Periodic(...)
-
+
 - The product (or sum) of a covariance function with a scalar is a
   covariance function::

-
+
     cov_func = eta**2 * pm.gp.cov.Matern32(...)
-
+


 After the covariance function is defined, it is now a function that is
@@ -133,7 +133,7 @@ is::
 The first argument is the mean function and the second is the covariance
 function. We've made the GP object, but we haven't made clear which function
 it is to be a prior for, what the inputs are, or what parameters it will be
-conditioned on.
+conditioned on.

 .. note::

@@ -145,18 +145,18 @@ conditioned on.

 Calling the `prior` method will create a PyMC3 random variable that represents
 the latent function :math:`f(x) = \mathbf{f}`::
-
+
     f = gp.prior("f", X)

 :code:`f` is a random variable that can be used within a PyMC3 model like any
 other type of random variable. The first argument is the name of the random
-variable representing the function we are placing the prior over.
-The second argument is the inputs to the function that the prior is over,
+variable representing the function we are placing the prior over.
+The second argument is the inputs to the function that the prior is over,
 :code:`X`. The inputs are usually known and present in the data, but they can
-also be PyMC3 random variables. If the inputs are a Theano tensor or a
+also be PyMC3 random variables. If the inputs are a Theano tensor or a
 PyMC3 random variable, the :code:`shape` needs to be given.

-Usually at this point, inference is performed on the model. The
+Usually at this point, inference is performed on the model. The
 :code:`conditional` method creates the conditional, or predictive,
 distribution over the latent function at arbitrary :math:`x_*` input points,
 :math:`f(x_*)`. To construct the conditional distribution we write::
@@ -166,7 +166,7 @@ distribution over the latent function at arbitrary :math:`x_*` input points,
 Additive GPs
 ============

-The GP implementation in PyMC3 is constructed so that it is easy to define
+The GP implementation in PyMC3 is constructed so that it is easy to define
 additive GPs and sample from individual GP components. We can write::

     gp1 = pm.gp.Marginal(mean_func1, cov_func1)
@@ -183,18 +183,18 @@ Consider two independent GP distributed functions, :math:`f_1(x) \sim

 .. math::

-   \begin{bmatrix} f_1 \\ f_1^* \\ f_2 \\ f_2^*
+   \begin{bmatrix} f_1 \\ f_1^* \\ f_2 \\ f_2^*
      \\ f_1 + f_2 \\ f_1^* + f_2^* \end{bmatrix} \sim
-   \text{N}\left(
+   \text{N}\left(
      \begin{bmatrix} m_1 \\ m_1^* \\ m_2 \\ m_2^* \\
      m_1 + m_2 \\ m_1^* + m_2^* \\ \end{bmatrix} \,,\,
-   \begin{bmatrix}
+   \begin{bmatrix}
      K_1 & K_1^* & 0 & 0 & K_1 & K_1^* \\
      K_1^{*^T} & K_1^{**} & 0 & 0 & K_1^* & K_1^{**} \\
      0 & 0 & K_2 & K_2^* & K_2 & K_2^{*} \\
      0 & 0 & K_2^{*^T} & K_2^{**} & K_2^{*} & K_2^{**} \\
      K_1 & K_1^{*} & K_2 & K_2^{*} & K_1 + K_2 & K_1^{*} + K_2^{*} \\
-     K_1^{*^T} & K_1^{**} & K_2^{*^T} & K_2^{**} & K_1^{*^T}+K_2^{*^T} & K_1^{**}+K_2^{**}
+     K_1^{*^T} & K_1^{**} & K_2^{*^T} & K_2^{**} & K_1^{*^T}+K_2^{*^T} & K_1^{**}+K_2^{**}
    \end{bmatrix}
    \right) \,.

@@ -220,42 +220,42 @@ other implementations. The first block fits the GP prior. We denote
     with pm.Model() as model:
         gp1 = pm.gp.Marginal(mean_func1, cov_func1)
         gp2 = pm.gp.Marginal(mean_func2, cov_func2)
-
-        # gp represents f1 + f2.
+
+        # gp represents f1 + f2.
         gp = gp1 + gp2
-
+
         f = gp.marginal_likelihood("f", X, y, noise)
-
+
         trace = pm.sample(1000)


-To construct the conditional distribution of :code:`gp1` or :code:`gp2`, we
-also need to include the additional arguments, :code:`X`, :code:`y`, and
+To construct the conditional distribution of :code:`gp1` or :code:`gp2`, we
+also need to include the additional arguments, :code:`X`, :code:`y`, and
 :code:`noise`::

     with model:
         # conditional distributions of f1 and f2
-        f1_star = gp1.conditional("f1_star", X_star,
+        f1_star = gp1.conditional("f1_star", X_star,
                                   given={"X": X, "y": y, "noise": noise, "gp": gp})
-        f2_star = gp2.conditional("f2_star", X_star,
+        f2_star = gp2.conditional("f2_star", X_star,
                                   given={"X": X, "y": y, "noise": noise, "gp": gp})

         # conditional of f1 + f2, `given` not required
         f_star = gp.conditional("f_star", X_star)

-This second block produces the conditional distributions. Notice that extra
+This second block produces the conditional distributions. Notice that extra
 arguments are required for conditionals of :math:`f1` and :math:`f2`, but not
-:math:`f`. This is because those arguments are cached when
+:math:`f`. This is because those arguments are cached when
 :code:`.marginal_likelihood` is called on :code:`gp`.

 .. note::
     When constructing conditionals, the additional arguments :code:`X`, :code:`y`,
     :code:`noise` and :code:`gp` must be provided as a dict called `given`!

-Since the marginal likelihoood method of :code:`gp1` or :code:`gp2` weren't called,
-their conditionals need to be provided with the required inputs. In the same
+Since the marginal likelihoood method of :code:`gp1` or :code:`gp2` weren't called,
+their conditionals need to be provided with the required inputs. In the same
 fashion as the prior, :code:`f_star`, :code:`f1_star` and :code:`f2_star` are random
-variables that can now be used like any other random variable in PyMC3.
+variables that can now be used like any other random variable in PyMC3.

 Check the notebooks for detailed demonstrations of the usage of GP functionality
 in PyMC3.
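
The documentation hunks above quote several fragments (kernel algebra, gp.prior / marginal_likelihood, and conditionals of additive GPs). As a rough, self-contained sketch of how those pieces fit together — the toy data, hyperparameter values, and variable names here are illustrative assumptions, not part of the commit:

    import numpy as np
    import pymc3 as pm

    # Toy stand-ins for the X, y, X_star used throughout the docs above.
    X = np.linspace(0, 10, 50)[:, None]
    y = np.sin(X).ravel() + 0.1 * np.random.randn(50)
    X_star = np.linspace(0, 12, 30)[:, None]

    with pm.Model() as model:
        # Covariance functions follow kernel algebra: scale and add them.
        eta = pm.HalfNormal("eta", sigma=1.0)
        cov1 = eta ** 2 * pm.gp.cov.ExpQuad(1, ls=1.0)
        cov2 = pm.gp.cov.Matern32(1, ls=3.0)

        gp1 = pm.gp.Marginal(cov_func=cov1)
        gp2 = pm.gp.Marginal(cov_func=cov2)
        gp = gp1 + gp2  # gp represents f1 + f2

        sigma = pm.HalfNormal("sigma", sigma=1.0)
        f = gp.marginal_likelihood("f", X=X, y=y, noise=sigma)
        trace = pm.sample(1000)

        # Conditional of the sum: `given` was cached by marginal_likelihood.
        f_star = gp.conditional("f_star", X_star)
        # Conditional of one component: `given` must be passed explicitly.
        f1_star = gp1.conditional(
            "f1_star", X_star, given={"X": X, "y": y, "noise": sigma, "gp": gp}
        )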
