
Commit 3fd5c25

Azure Pipelines committed
Merge remote-tracking branch 'origin/main' into publication
2 parents: 74ff762 + e8a7308

File tree: 23 files changed (+71, -784 lines)


.actions/README.md

Lines changed: 0 additions & 11 deletions
This file was deleted.

.actions/assistant.py

Lines changed: 13 additions & 5 deletions
@@ -648,6 +648,7 @@ def copy_notebooks(
         path_docs_images: str = "_static/images",
         patterns: Sequence[str] = (".", "**"),
         ignore: Optional[Sequence[str]] = None,
+        strict: bool = True,
     ) -> None:
         """Copy all notebooks from a folder to doc folder.
@@ -658,12 +659,16 @@ def copy_notebooks(
             path_docs_images: destination path to the images' location relative to ``docs_root``
             patterns: patterns to use when glob-ing notebooks
            ignore: ignore some specific notebooks even when the given string is in path
+            strict: raise exception if copy fails

         """
-        all_ipynb = []
-        for pattern in patterns:
-            all_ipynb += glob.glob(os.path.join(path_root, DIR_NOTEBOOKS, pattern, "*.ipynb"))
         os.makedirs(os.path.join(docs_root, path_docs_ipynb), exist_ok=True)
+        all_ipynb = [
+            os.path.realpath(ipynb)
+            for pattern in patterns
+            for ipynb in glob.glob(os.path.join(path_root, DIR_NOTEBOOKS, pattern, "*.ipynb"))
+        ]
+        print(f"Copy following notebooks to docs folder: {all_ipynb}")
         if ignore and not isinstance(ignore, (list, set, tuple)):
             ignore = [ignore]
         elif not ignore:
@@ -683,8 +688,11 @@ def copy_notebooks(
                     path_docs_images=path_docs_images,
                 )
             except Exception as ex:
-                warnings.warn(f"Failed to copy notebook: {path_ipynb}\n{ex}", ResourceWarning)
-                continue
+                msg = f"Failed to copy notebook: {path_ipynb}\n{ex}"
+                if not strict:
+                    warnings.warn(msg, ResourceWarning)
+                    continue
+                raise FileNotFoundError(msg)
             ipynb_content.append(os.path.join(path_docs_ipynb, path_ipynb_in_dir))

     @staticmethod
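For orientation, a minimal sketch of how the updated helper might be driven from a docs build (not part of this commit): the `AssistantCLI.copy_notebooks` name and the `path_root`, `docs_root`, and `strict` parameters come from the diffs in this commit, while the import path and the example directories are assumptions.

# Hedged sketch: assumes .actions/assistant.py is importable as a module named `assistant`;
# the repository may instead invoke it through its Fire-based CLI.
from assistant import AssistantCLI  # hypothetical import path

# Default behaviour after this change (strict=True): a notebook that fails to copy
# raises FileNotFoundError, so the docs build stops instead of silently dropping content.
AssistantCLI.copy_notebooks(path_root="/repo", docs_root="/repo/_docs/source")

# Lenient mode: failures are downgraded to a ResourceWarning and the offending
# notebook is skipped, matching the previous behaviour.
AssistantCLI.copy_notebooks(path_root="/repo", docs_root="/repo/_docs/source", strict=False)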

.actions/git-diff-sync.sh

Lines changed: 0 additions & 1 deletion
@@ -30,4 +30,3 @@ git merge --ff -s resolve origin/$1
 python _TEMP/.actions/assistant.py group-folders target-diff.txt --fpath_actual_dirs "['dirs-$b1.txt', 'dirs-$b2.txt']"
 printf "\n================\nChanged folders:\n----------------\n" && cat changed-folders.txt
 printf "\n================\nDropped folders:\n----------------\n" && cat dropped-folders.txt
-

.actions/requires.txt

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 Fire
 tqdm
-PyYAML
+PyYAML <5.4 # todo: racing issue with cython compile
 wcmatch
 requests
 pip

.azure/ipynb-validate.yml

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ jobs:
           name: mtrx
         displayName: "Changed matrix"
       - bash: echo '$(mtrx.dirs)' | python -m json.tool
+        continueOnError: "true" # not crash if the matrix is empty
        displayName: "Show matrix"

  - job: ipython

.github/dependabot.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ updates:
   # Enable version updates for python
   - package-ecosystem: "pip"
     # Look for a `requirements` in the `root` directory
-    directory: "/_requirements"
+    directory: "/"
     # Check for updates once a week
     schedule:
       interval: "monthly"

.github/workflows/ci_docs.yml

Lines changed: 10 additions & 15 deletions
@@ -1,10 +1,10 @@
-name: validate Docs
+name: Docs validation

 on: # Trigger the workflow on push or pull request
   # push:
   #  branches: [main]
   pull_request: {}
-  #workflow_dispatch: {}
+  workflow_dispatch: {}

 concurrency:
   group: ${{ github.workflow }}-${{ github.head_ref }}
@@ -20,10 +20,11 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        check: ["html", "linkcheck"]
+        target: ["html", "linkcheck"]
     env:
       PUB_BRANCH: publication
       PATH_DATASETS: ${{ github.workspace }}/.datasets
+      TORCH_URL: "https://download.pytorch.org/whl/cpu/torch_stable.html"
     timeout-minutes: 20
     steps:
       - name: Checkout 🛎️
@@ -49,8 +50,8 @@ jobs:

       - name: Install dependencies
         run: |
-          pip --version
-          pip install -q -r requirements.txt -r _requirements/docs.txt
+          set -ex
+          pip install -q -r requirements.txt -r _requirements/docs.txt -f ${TORCH_URL}
           pip list

       - name: Process folders
@@ -97,23 +98,17 @@ jobs:
           tree changed-notebooks

       - uses: actions/upload-artifact@v3
-        if: ${{ matrix.check == 'html' && env.NB_DIRS != 0 }}
+        if: ${{ matrix.target == 'html' && env.NB_DIRS != 0 }}
         with:
           name: notebooks-${{ github.sha }}
           path: changed-notebooks/

-      - name: Link check
+      - name: Make ${{ matrix.target }}
         working-directory: ./_docs
-        if: ${{ matrix.check == 'linkcheck' }}
-        run: make linkcheck --jobs $(nproc) --debug SPHINXOPTS="--keep-going"
-
-      - name: Make Documentation
-        working-directory: ./_docs
-        if: ${{ matrix.check == 'html' }}
-        run: make html --jobs $(nproc) --debug SPHINXOPTS="-W --keep-going"
+        run: make ${{ matrix.target }} --jobs $(nproc) --debug SPHINXOPTS="-W --keep-going"

       - name: Upload built docs
-        if: ${{ matrix.check == 'html' }}
+        if: ${{ matrix.target == 'html' }}
         uses: actions/upload-artifact@v3
         with:
           name: docs-html-${{ github.sha }}

.github/workflows/ci_internal.yml

Lines changed: 2 additions & 14 deletions
@@ -30,26 +30,14 @@ jobs:
         uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}
-
-      # Note: This uses an internal pip API and may not always work
-      # https://github.com/actions/cache/blob/master/examples.md#multiple-oss-in-a-workflow
-      - name: Get pip cache dir
-        id: pip-cache
-        run: echo "::set-output name=dir::$(pip cache dir)"
-
-      - name: pip cache
-        uses: actions/cache@v3
-        with:
-          path: ${{ steps.pip-cache.outputs.dir }}
-          key: ${{ runner.os }}-pip-py${{ matrix.python-version }}-${{ hashFiles('.actions/requires.txt') }}-${{ hashFiles('requirements/default.txt') }}
-          restore-keys: ${{ runner.os }}-pip-py${{ matrix.python-version }}-
+          cache: pip

       - name: Install requirements
         run: |
-          pip --version
           pip install -q -r .actions/requires.txt -r _requirements/test.txt
           # this is needed to be able to run package version parsing test
           pip install -q -r _requirements/default.txt --find-links https://download.pytorch.org/whl/cpu/torch_stable.html
+          pip list

       - name: Prepare dummy inputs
         run: |

.github/workflows/docker-build.yml

Lines changed: 2 additions & 2 deletions
@@ -34,7 +34,7 @@ jobs:
           password: ${{ secrets.DOCKER_PASSWORD }}

       - name: Build (and Push) image
-        uses: docker/build-push-action@v5
+        uses: docker/build-push-action@v6
         with:
           #build-args: |
           #  UBUNTU_VERSION=${{ matrix.ubuntu }}
@@ -44,5 +44,5 @@ jobs:
           file: _dockers/ubuntu-cuda/Dockerfile
           push: ${{ env.PUSH_DOCKERHUB }}
           # todo: publish also tag YYYY.MM
-          tags: "pytorchlightning/tutorials"
+          tags: "pytorchlightning/tutorials:cuda"
       timeout-minutes: 55

.github/workflows/docs-deploy.yml

Lines changed: 37 additions & 13 deletions
@@ -2,6 +2,11 @@ name: Deploy Docs
 on:
   push:
     branches: [publication]
+  pull_request:
+    branches: [main]
+    paths:
+      - ".actions/assistant.py"
+      - ".github/workflows/docs-deploy.yml"
   workflow_dispatch: {}
   workflow_run:
     workflows: ["Publish notebook"]
@@ -14,39 +19,59 @@ jobs:
     runs-on: ubuntu-20.04
     env:
       PATH_DATASETS: ${{ github.workspace }}/.datasets
+      TORCH_URL: "https://download.pytorch.org/whl/cpu/torch_stable.html"
     steps:
-      - name: Checkout 🛎️
+      - name: Checkout 🛎️ Publication
+        if: ${{ github.event_name != 'pull_request' }}
         uses: actions/checkout@v4
         with:
           ref: publication
+      - name: Checkout 🛎️ PR
+        if: ${{ github.event_name == 'pull_request' }}
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
       - uses: actions/setup-python@v5
         with:
-          python-version: 3.8
+          python-version: "3.9"
+          cache: pip
+      - run: pip install -q py-tree

-      - name: Cache pip
-        uses: actions/cache@v3
-        with:
-          path: ~/.cache/pip
-          key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}-${{ hashFiles('_requirements/docs.txt') }}
-          restore-keys: ${{ runner.os }}-pip-
+      - name: pull notebooks from Publication
+        if: ${{ github.event_name == 'pull_request' }}
+        run: |
+          git checkout publication
+          py-tree .notebooks/
+          mkdir -p _notebooks
+          cp -r .notebooks/* _notebooks/
+          git checkout ${{ github.head_ref }}
+          cp -r _notebooks/* .notebooks/
+
+      - name: List notebooks
+        run: py-tree .notebooks/

       - name: Install dependencies
         run: |
           mkdir -p ${PATH_DATASETS}
-          # install Texlive, see https://linuxconfig.org/how-to-install-latex-on-ubuntu-20-04-focal-fossa-linux
-          sudo apt-get update
+          sudo apt-get update --fix-missing
           sudo apt-get install -y cmake pandoc
+          # install Texlive, see https://linuxconfig.org/how-to-install-latex-on-ubuntu-20-04-focal-fossa-linux
           sudo apt-get install -y texlive-latex-extra dvipng texlive-pictures
-          pip --version
-          pip install --quiet --requirement _requirements/docs.txt
+          pip install -q -r _requirements/docs.txt -f ${TORCH_URL}
           pip list
         shell: bash

       - name: Make Documentation
         working-directory: ./_docs
         run: make html --jobs $(nproc)

+      - name: Copied notebooks [debugging]
+        if: failure()
+        working-directory: ./_docs/source/
+        run: py-tree notebooks/
+
       - name: Deploy 🚀
+        if: ${{ github.event_name != 'pull_request' }}
         uses: JamesIves/[email protected]
         with:
           token: ${{ secrets.GITHUB_TOKEN }}
@@ -55,4 +80,3 @@ jobs:
           clean: true # Automatically remove deleted files from the deploy branch
           target-folder: docs # If you'd like to push the contents of the deployment folder into a specific directory
           single-commit: true # you'd prefer to have a single commit on the deployment branch instead of full history
-        if: success()

_docs/source/conf.py

Lines changed: 2 additions & 4 deletions
@@ -43,10 +43,8 @@
 # -- Project documents -------------------------------------------------------

 AssistantCLI.copy_notebooks(
-    _PATH_ROOT,
-    _PATH_HERE,
-    # ToDo: fix coping this specific notebooks, some JSON encode issue
-    ignore=["course_UvA-DL/13-contrastive-learning"],
+    path_root=_PATH_ROOT,
+    docs_root=_PATH_HERE,
 )

 # with open(os.path.join(_PATH_HERE, 'ipynb_content.rst'), 'w') as fp:

course_UvA-DL/01-introduction-to-pytorch/notebook.py

Lines changed: 1 addition & 1 deletion
@@ -373,7 +373,7 @@
 # <center style="width: 100%"><img src="comparison_CPU_GPU.png" width="700px"></center>
 #
 # CPUs and GPUs have both different advantages and disadvantages, which is why many computers contain both components and use them for different tasks.
-# In case you are not familiar with GPUs, you can read up more details in this [NVIDIA blog post](https://blogs.nvidia.com/blog/2009/12/16/whats-the-difference-between-a-cpu-and-a-gpu/) or [here](https://www.intel.com/content/www/us/en/products/docs/processors/what-is-a-gpu.html).
+# In case you are not familiar with GPUs, you can read up more details in this [NVIDIA blog post](https://blogs.nvidia.com/blog/2009/12/16/whats-the-difference-between-a-cpu-and-a-gpu/) or [here](https://blogs.nvidia.com/blog/whats-the-difference-between-a-cpu-and-a-gpu/).
 #
 # GPUs can accelerate the training of your network up to a factor of $100$ which is essential for large neural networks.
 # PyTorch implements a lot of functionality for supporting GPUs (mostly those of NVIDIA due to the libraries [CUDA](https://developer.nvidia.com/cuda-zone) and [cuDNN](https://developer.nvidia.com/cudnn)).

course_UvA-DL/07-deep-energy-based-generative-models/notebook.py

Lines changed: 1 addition & 1 deletion
@@ -227,7 +227,7 @@
 # if the hyperparameters are not well tuned.
 # We will rely on training tricks proposed in the paper
 # [Implicit Generation and Generalization in Energy-Based Models](https://arxiv.org/abs/1903.08689)
-# by Yilun Du and Igor Mordatch ([blog](https://openai.com/research/energy-based-models)).
+# by Yilun Du and Igor Mordatch ([blog](https://openai.com/index/energy-based-models/)).
 # The important part of this notebook is however to see how the theory above can actually be used in a model.
 #
 # ### Dataset

flash_tutorials/electricity_forecasting/.meta.yml

Lines changed: 0 additions & 24 deletions
This file was deleted.
