Add GitHubAction to auto-update hooks, and other cleanups #4314

Merged · 2 commits · Dec 8, 2020
Changes from 1 commit

4 changes: 2 additions & 2 deletions .codecov.yml
@@ -13,13 +13,13 @@ coverage:
# basic
target: auto
threshold: 1%
base: auto
base: auto
patch:
default:
# basic
target: 50%
threshold: 1%
base: auto
base: auto

comment:
layout: "reach, diff, flags, files"
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE.md
@@ -1,4 +1,4 @@
If you have questions about a specific use case, or you are not sure whether this is a bug or not, please post it to our discourse channel: https://discourse.pymc.io
If you have questions about a specific use case, or you are not sure whether this is a bug or not, please post it to our discourse channel: https://discourse.pymc.io

## Description of your problem

33 changes: 33 additions & 0 deletions .github/workflows/autoupdate-pre-commit-config.yml
@@ -0,0 +1,33 @@
name: "Update pre-commit config"

on:
schedule:
- cron: "0 7 * * 1" # At 07:00 on each Monday.
workflow_dispatch:

jobs:
update-pre-commit:
if: github.repository_owner == 'pymc-devs'
name: Autoupdate pre-commit config
runs-on: ubuntu-latest
steps:
- name: Set up Python
uses: actions/setup-python@v2
- name: Cache multiple paths
uses: actions/cache@v2
with:
path: |
~/.cache/pre-commit
~/.cache/pip
key: pre-commit-autoupdate-${{ runner.os }}-build
- name: Update pre-commit config packages
uses: technote-space/create-pr-action@v2
with:
GITHUB_TOKEN: ${{ secrets.ACTION_TRIGGER_TOKEN }}
EXECUTE_COMMANDS: |
pip install pre-commit
pre-commit autoupdate || (exit 0);
pre-commit run -a || (exit 0);
COMMIT_MESSAGE: "⬆️ UPGRADE: Autoupdate pre-commit config"
PR_BRANCH_NAME: "pre-commit-config-update-${PR_ID}"
PR_TITLE: "⬆️ UPGRADE: Autoupdate pre-commit config"
2 changes: 2 additions & 0 deletions .github/workflows/pre-commit.yml
@@ -8,6 +8,8 @@ on:
jobs:
pre-commit:
runs-on: ubuntu-latest
env:
SKIP: no-commit-to-branch
Comment on lines +11 to +12 (Contributor Author):

This hook is just so that you don't accidentally commit to the master branch when developing locally; it needs to be skipped during CI because, after a PR is merged, the jobs run from the master branch.

steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
2 changes: 2 additions & 0 deletions .github/workflows/pytest.yml
@@ -29,6 +29,7 @@ jobs:
--ignore=pymc3/tests/test_shape_handling.py
--ignore=pymc3/tests/test_shared.py
--ignore=pymc3/tests/test_smc.py
--ignore=pymc3/tests/test_step.py
--ignore=pymc3/tests/test_updates.py
--ignore=pymc3/tests/test_variational_inference.py
- |
@@ -47,6 +48,7 @@ jobs:
- |
pymc3/tests/test_distributions_timeseries.py
pymc3/tests/test_shape_handling.py
pymc3/tests/test_step.py
pymc3/tests/test_updates.py
pymc3/tests/test_variational_inference.py
- |
15 changes: 11 additions & 4 deletions .pre-commit-config.yaml
@@ -1,24 +1,31 @@
exclude: ^(docs/logos|pymc3/examples/data)/
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.3.0
hooks:
- id: end-of-file-fixer
- id: check-merge-conflict
- id: check-toml
- id: check-yaml
- id: debug-statements
- id: end-of-file-fixer
- id: no-commit-to-branch
args: [--branch, master]
- id: requirements-txt-fixer
- id: trailing-whitespace
- repo: https://github.com/nbQA-dev/nbQA
rev: 0.4.1
rev: 0.5.4
hooks:
- id: nbqa-black
additional_dependencies: [black==20.8b1]
- id: nbqa-isort
additional_dependencies: [isort==5.6.4]
- id: nbqa-pyupgrade
additional_dependencies: [pyupgrade==2.7.4]

- repo: https://github.com/PyCQA/isort
rev: 5.6.4
hooks:
- id: isort
name: isort (python)
name: isort
- repo: https://github.com/asottile/pyupgrade
rev: v2.7.4
hooks:
4 changes: 2 additions & 2 deletions GOVERNANCE.md
@@ -13,7 +13,7 @@ developed openly and hosted in public GitHub repositories under the
[GitHub organization](https://github.com/pymc-devs/pymc3). Examples of
Project Software include the PyMC3 code and the Documentation, etc. The Services run by the
Project consist of public websites and web-services that are hosted
at [http://pymc-devs.github.io/pymc3/](http://pymc-devs.github.io/pymc3/)
at [http://pymc-devs.github.io/pymc3/](http://pymc-devs.github.io/pymc3/)
The Project is developed by a team of distributed developers, called
Contributors. Contributors are individuals who have contributed code,
documentation, designs or other work to one or more Project repositories.
@@ -131,7 +131,7 @@ The current Steering Council membership comprises:
- Junpeng Lao
- Osvaldo Martin
- Austin Rochford
- Adrian Seyboldt
- Adrian Seyboldt
- Thomas Wiecki

### Council membership
8 changes: 4 additions & 4 deletions README.rst
@@ -21,11 +21,11 @@ The future of PyMC3 & Theano
There have been many questions and uncertainty around the future of PyMC3 since Theano
stopped getting developed by the original authors, and we started experiments with PyMC4.

We are happy to announce that PyMC3 on Theano (which we are `developing further <https://github.com/pymc-devs/Theano-PyMC>`__)
with a new JAX backend is the future. PyMC4 will not be developed further.
We are happy to announce that PyMC3 on Theano (which we are `developing further <https://github.com/pymc-devs/Theano-PyMC>`__)
with a new JAX backend is the future. PyMC4 will not be developed further.

See the `full announcement <https://pymc-devs.medium.com/the-future-of-pymc3-or-theano-is-dead-long-live-theano-d8005f8a0e9b>`__
for more details.
for more details.

Features
========
@@ -119,7 +119,7 @@ Another option is to clone the repository and install PyMC3 using
Dependencies
============

PyMC3 is tested on Python 3.6, 3.7, and 3.8 and depends on `Theano-PyMC <https://github.com/pymc-devs/Theano-PyMC>`__,
PyMC3 is tested on Python 3.6, 3.7, and 3.8 and depends on `Theano-PyMC <https://github.com/pymc-devs/Theano-PyMC>`__,
NumPy, SciPy, and Pandas
(see `requirements.txt <https://github.com/pymc-devs/pymc3/blob/master/requirements.txt>`__ for version
information).
2 changes: 1 addition & 1 deletion binder/requirements.txt
@@ -1,4 +1,4 @@
-r ../requirements-dev.txt
# this installs pymc3 itself. it is funny that this is an absolute path,
# this installs pymc3 itself. it is funny that this is an absolute path,
# but reqirements-dev.txt is relative.
.
72 changes: 36 additions & 36 deletions docs/source/Gaussian_Processes.rst
@@ -21,17 +21,17 @@ choice as priors over functions due to the marginalization and conditioning
properties of the multivariate normal distribution. Usually, the marginal
distribution over :math:`f(x)` is evaluated during the inference step. The
conditional distribution is then used for predicting the function values
:math:`f(x_*)` at new points, :math:`x_*`.
:math:`f(x_*)` at new points, :math:`x_*`.

The joint distribution of :math:`f(x)` and :math:`f(x_*)` is multivariate
normal,

.. math::

\begin{bmatrix} f(x) \\ f(x_*) \\ \end{bmatrix} \sim
\text{N}\left(
\text{N}\left(
\begin{bmatrix} m(x) \\ m(x_*) \\ \end{bmatrix} \,,
\begin{bmatrix} k(x,x') & k(x_*, x) \\
\begin{bmatrix} k(x,x') & k(x_*, x) \\
k(x_*, x) & k(x_*, x_*') \\ \end{bmatrix}
\right) \,.

@@ -41,21 +41,21 @@ distribution is

.. math::

f(x_*) \mid f(x) \sim \text{N}\left( k(x_*, x) k(x, x)^{-1} [f(x) - m(x)] + m(x_*) ,\,
f(x_*) \mid f(x) \sim \text{N}\left( k(x_*, x) k(x, x)^{-1} [f(x) - m(x)] + m(x_*) ,\,
k(x_*, x_*) - k(x, x_*) k(x, x)^{-1} k(x, x_*) \right) \,.

.. note::

For more information on GPs, check out the book `Gaussian Processes for
Machine Learning <http://www.gaussianprocess.org/gpml/>`_ by Rasmussen &
Williams, or `this introduction <https://www.ics.uci.edu/~welling/teaching/KernelsICS273B/gpB.pdf>`_
Williams, or `this introduction <https://www.ics.uci.edu/~welling/teaching/KernelsICS273B/gpB.pdf>`_
by D. Mackay.

PyMC3 is a great environment for working with fully Bayesian Gaussian Process
models. GPs in PyMC3 have a clear syntax and are highly composable, and many
predefined covariance functions (or kernels), mean functions, and several GP
models. GPs in PyMC3 have a clear syntax and are highly composable, and many
predefined covariance functions (or kernels), mean functions, and several GP
implementations are included. GPs are treated as distributions that can be
used within larger or hierarchical models, not just as standalone regression
used within larger or hierarchical models, not just as standalone regression
models.

Mean and covariance functions
@@ -83,7 +83,7 @@ specify :code:`input_dim`, the total number of columns of :code:`X`, and
:code:`active_dims`, which of those columns or dimensions the covariance
function will act on, is because :code:`cov_func` hasn't actually seen the
input data yet. The :code:`active_dims` argument is optional, and defaults to
all columns of the matrix of inputs.
all columns of the matrix of inputs.
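
As a minimal sketch (the lengthscale value and column indices below are illustrative, not taken from this document), a covariance function that acts only on the first and third columns of a three-column input could be written as::

    import pymc3 as pm

    # ExpQuad kernel over a 3-column input, restricted to columns 0 and 2
    cov_func = pm.gp.cov.ExpQuad(input_dim=3, ls=1.0, active_dims=[0, 2])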

Covariance functions in PyMC3 closely follow the algebraic rules for kernels,
which allows users to combine covariance functions into new ones, for example:
@@ -97,13 +97,13 @@ which allows users to combine covariance functions into new ones, for example:


cov_func = pm.gp.cov.ExpQuad(...) * pm.gp.cov.Periodic(...)

- The product (or sum) of a covariance function with a scalar is a
covariance function::


cov_func = eta**2 * pm.gp.cov.Matern32(...)



After the covariance function is defined, it is now a function that is
@@ -133,7 +133,7 @@ is::
The first argument is the mean function and the second is the covariance
function. We've made the GP object, but we haven't made clear which function
it is to be a prior for, what the inputs are, or what parameters it will be
conditioned on.
conditioned on.
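
For reference, a minimal sketch of such an instantiation (assuming the :code:`Latent` implementation and previously defined :code:`mean_func` and :code:`cov_func`; these names are placeholders) might read::

    # hypothetical setup: mean_func and cov_func are assumed to be defined above
    gp = pm.gp.Latent(mean_func=mean_func, cov_func=cov_func)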

.. note::

@@ -145,18 +145,18 @@ conditioned on.

Calling the `prior` method will create a PyMC3 random variable that represents
the latent function :math:`f(x) = \mathbf{f}`::

f = gp.prior("f", X)

:code:`f` is a random variable that can be used within a PyMC3 model like any
other type of random variable. The first argument is the name of the random
variable representing the function we are placing the prior over.
The second argument is the inputs to the function that the prior is over,
variable representing the function we are placing the prior over.
The second argument is the inputs to the function that the prior is over,
:code:`X`. The inputs are usually known and present in the data, but they can
also be PyMC3 random variables. If the inputs are a Theano tensor or a
also be PyMC3 random variables. If the inputs are a Theano tensor or a
PyMC3 random variable, the :code:`shape` needs to be given.
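
A hedged sketch of the latter case (the variable names, shapes, and the enclosing model context are purely illustrative)::

    with model:
        # inputs that are themselves random variables: the number of rows
        # cannot be inferred from data, so the shape is passed explicitly
        X_rv = pm.Normal("X_rv", mu=0.0, sigma=1.0, shape=(20, 1))
        f_rv = gp.prior("f_rv", X_rv, shape=20)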

Usually at this point, inference is performed on the model. The
Usually at this point, inference is performed on the model. The
:code:`conditional` method creates the conditional, or predictive,
distribution over the latent function at arbitrary :math:`x_*` input points,
:math:`f(x_*)`. To construct the conditional distribution we write::
@@ -166,7 +166,7 @@ distribution over the latent function at arbitrary :math:`x_*` input points,
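
A minimal sketch of what such a conditional call typically looks like (assuming the enclosing model context and new input points :code:`X_star` are already defined) is::

    with model:
        # predictive distribution of the latent function at the new inputs X_star
        f_star = gp.conditional("f_star", X_star)
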
Additive GPs
============

The GP implementation in PyMC3 is constructed so that it is easy to define
The GP implementation in PyMC3 is constructed so that it is easy to define
additive GPs and sample from individual GP components. We can write::

gp1 = pm.gp.Marginal(mean_func1, cov_func1)
@@ -183,18 +183,18 @@ Consider two independent GP distributed functions, :math:`f_1(x) \sim

.. math::

\begin{bmatrix} f_1 \\ f_1^* \\ f_2 \\ f_2^*
\begin{bmatrix} f_1 \\ f_1^* \\ f_2 \\ f_2^*
\\ f_1 + f_2 \\ f_1^* + f_2^* \end{bmatrix} \sim
\text{N}\left(
\text{N}\left(
\begin{bmatrix} m_1 \\ m_1^* \\ m_2 \\ m_2^* \\
m_1 + m_2 \\ m_1^* + m_2^* \\ \end{bmatrix} \,,\,
\begin{bmatrix}
\begin{bmatrix}
K_1 & K_1^* & 0 & 0 & K_1 & K_1^* \\
K_1^{*^T} & K_1^{**} & 0 & 0 & K_1^* & K_1^{**} \\
0 & 0 & K_2 & K_2^* & K_2 & K_2^{*} \\
0 & 0 & K_2^{*^T} & K_2^{**} & K_2^{*} & K_2^{**} \\
K_1 & K_1^{*} & K_2 & K_2^{*} & K_1 + K_2 & K_1^{*} + K_2^{*} \\
K_1^{*^T} & K_1^{**} & K_2^{*^T} & K_2^{**} & K_1^{*^T}+K_2^{*^T} & K_1^{**}+K_2^{**}
K_1^{*^T} & K_1^{**} & K_2^{*^T} & K_2^{**} & K_1^{*^T}+K_2^{*^T} & K_1^{**}+K_2^{**}
\end{bmatrix}
\right) \,.

@@ -220,42 +220,42 @@ other implementations. The first block fits the GP prior. We denote
with pm.Model() as model:
gp1 = pm.gp.Marginal(mean_func1, cov_func1)
gp2 = pm.gp.Marginal(mean_func2, cov_func2)
# gp represents f1 + f2.

# gp represents f1 + f2.
gp = gp1 + gp2

f = gp.marginal_likelihood("f", X, y, noise)

trace = pm.sample(1000)


To construct the conditional distribution of :code:`gp1` or :code:`gp2`, we
also need to include the additional arguments, :code:`X`, :code:`y`, and
To construct the conditional distribution of :code:`gp1` or :code:`gp2`, we
also need to include the additional arguments, :code:`X`, :code:`y`, and
:code:`noise`::

with model:
# conditional distributions of f1 and f2
f1_star = gp1.conditional("f1_star", X_star,
f1_star = gp1.conditional("f1_star", X_star,
given={"X": X, "y": y, "noise": noise, "gp": gp})
f2_star = gp2.conditional("f2_star", X_star,
f2_star = gp2.conditional("f2_star", X_star,
given={"X": X, "y": y, "noise": noise, "gp": gp})

# conditional of f1 + f2, `given` not required
f_star = gp.conditional("f_star", X_star)

This second block produces the conditional distributions. Notice that extra
This second block produces the conditional distributions. Notice that extra
arguments are required for conditionals of :math:`f1` and :math:`f2`, but not
:math:`f`. This is because those arguments are cached when
:math:`f`. This is because those arguments are cached when
:code:`.marginal_likelihood` is called on :code:`gp`.

.. note::
When constructing conditionals, the additional arguments :code:`X`, :code:`y`,
:code:`noise` and :code:`gp` must be provided as a dict called `given`!

Since the marginal likelihood method of :code:`gp1` or :code:`gp2` wasn't called,
their conditionals need to be provided with the required inputs. In the same
Since the marginal likelihood method of :code:`gp1` or :code:`gp2` wasn't called,
their conditionals need to be provided with the required inputs. In the same
fashion as the prior, :code:`f_star`, :code:`f1_star` and :code:`f2_star` are random
variables that can now be used like any other random variable in PyMC3.
variables that can now be used like any other random variable in PyMC3.

Check the notebooks for detailed demonstrations of the usage of GP functionality
in PyMC3.