CI: Split CI workflow into code checks, docbuild & web, and arraymanager #45097

Merged: 10 commits, Dec 29, 2021
239 changes: 0 additions & 239 deletions .github/workflows/ci.yml

This file was deleted.

158 changes: 158 additions & 0 deletions .github/workflows/code-checks.yml
@@ -0,0 +1,158 @@
name: Code Checks

on:
  push:
    branches:
      - master
      - 1.3.x
  pull_request:
    branches:
      - master
      - 1.3.x

env:
  ENV_FILE: environment.yml
  PANDAS_CI: 1

jobs:
  pre_commit:
    name: pre-commit
    runs-on: ubuntu-latest
    concurrency:
      # https://github.community/t/concurrecy-not-work-for-push/183068/7
      group: ${{ github.event_name == 'push' && github.run_number || github.ref }}-pre-commit
      cancel-in-progress: true
    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: Install Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.9.7'

      - name: Run pre-commit
        uses: pre-commit/[email protected]
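The `concurrency.group` expression above uses the `cond && a || b` idiom as a ternary: push events get a unique group per run (so branch pushes never cancel each other), while pull requests share a group keyed on the ref (so a new push to a PR cancels its in-flight run). A minimal shell sketch of the same selection logic, with hypothetical stand-in values for the `github.*` context:

```shell
# Hypothetical values standing in for the github.* context.
event_name="push"
run_number="1234"
ref="refs/pull/45097/merge"

# Emulates `github.event_name == 'push' && github.run_number || github.ref`:
# pushes get a unique group (the run number), PRs share one (the ref).
if [ "$event_name" = "push" ]; then
  group="${run_number}-pre-commit"
else
  group="${ref}-pre-commit"
fi
echo "$group"
```

With `cancel-in-progress: true`, any two runs that compute the same group value race, and the newer one cancels the older.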

  typing_and_docstring_validation:
    name: Docstring and typing validation
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}

    concurrency:
      # https://github.community/t/concurrecy-not-work-for-push/183068/7
      group: ${{ github.event_name == 'push' && github.run_number || github.ref }}-code-checks
      cancel-in-progress: true

    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          fetch-depth: 0

      - name: Cache conda
        uses: actions/cache@v2
        with:
          path: ~/conda_pkgs_dir
          key: ${{ runner.os }}-conda-${{ hashFiles('${{ env.ENV_FILE }}') }}

      - uses: conda-incubator/setup-miniconda@v2
        with:
          mamba-version: "*"
          channels: conda-forge
          activate-environment: pandas-dev
          channel-priority: strict
          environment-file: ${{ env.ENV_FILE }}
          use-only-tar-bz2: true

      - name: Install node.js (for pyright)
        uses: actions/setup-node@v2
        with:
          node-version: "16"

      - name: Install pyright
        # note: keep version in sync with .pre-commit-config.yaml
        run: npm install -g [email protected]

      - name: Build Pandas
        id: build
        uses: ./.github/actions/build_pandas

      - name: Run checks on imported code
        run: ci/code_checks.sh code
        if: ${{ steps.build.outcome == 'success' }}

      - name: Run doctests
        run: ci/code_checks.sh doctests
        if: ${{ steps.build.outcome == 'success' }}

      - name: Run docstring validation
        run: ci/code_checks.sh docstrings
        if: ${{ steps.build.outcome == 'success' }}

      - name: Run typing validation
        run: ci/code_checks.sh typing
        if: ${{ steps.build.outcome == 'success' }}

      - name: Run docstring validation script tests
        run: pytest scripts
        if: ${{ steps.build.outcome == 'success' }}
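The five check steps above all call one entry point, `ci/code_checks.sh`, with a mode argument, each gated on `steps.build.outcome == 'success'` so they only run if the build step with `id: build` succeeded. A hypothetical sketch of that single-entry-point dispatch pattern (the real script's commands and flags are not reproduced here):

```shell
# Hypothetical sketch of a mode-dispatching check script; the actual
# ci/code_checks.sh in pandas runs real linters and validators per mode.
run_checks() {
  case "$1" in
    code)       echo "checking imported code" ;;
    doctests)   echo "running doctests" ;;
    docstrings) echo "validating docstrings" ;;
    typing)     echo "running typing validation" ;;
    *)          echo "unknown mode: $1" >&2; return 1 ;;
  esac
}

run_checks code
run_checks typing
```

Splitting the suites by mode lets the workflow surface each category as a separate named step in the job log while keeping all the check logic in one script.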

  asv-benchmarks:
    name: ASV Benchmarks
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}

    concurrency:
      # https://github.community/t/concurrecy-not-work-for-push/183068/7
      group: ${{ github.event_name == 'push' && github.run_number || github.ref }}-asv-benchmarks
      cancel-in-progress: true

    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          fetch-depth: 0

      - name: Cache conda
        uses: actions/cache@v2
        with:
          path: ~/conda_pkgs_dir
          key: ${{ runner.os }}-conda-${{ hashFiles('${{ env.ENV_FILE }}') }}

      - uses: conda-incubator/setup-miniconda@v2
        with:
          mamba-version: "*"
          channels: conda-forge
          activate-environment: pandas-dev
          channel-priority: strict
          environment-file: ${{ env.ENV_FILE }}
          use-only-tar-bz2: true

      - name: Build Pandas
        id: build
        uses: ./.github/actions/build_pandas

      - name: Run ASV benchmarks
        run: |
          cd asv_bench
          asv check -E existing
          git remote add upstream https://github.com/pandas-dev/pandas.git
          git fetch upstream
          asv machine --yes
          asv dev | sed "/failed$/ s/^/##[error]/" | tee benchmarks.log
          if grep "failed" benchmarks.log > /dev/null ; then
              exit 1
          fi
        if: ${{ steps.build.outcome == 'success' }}

      - name: Publish benchmarks artifact
        uses: actions/upload-artifact@master
        with:
          name: Benchmarks log
          path: asv_bench/benchmarks.log
        if: failure()
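The benchmark step's pipeline does three jobs at once: `sed` prefixes any line ending in `failed` with GitHub's `##[error]` log marker so it is highlighted in the Actions UI, `tee` keeps a copy of the output in `benchmarks.log` for the artifact upload (which only runs `if: failure()`), and `grep` over that log turns any failure into a nonzero exit. A minimal reproduction, using a fabricated `raw.log` as a stand-in for the output of `asv dev`:

```shell
# raw.log stands in for the streamed output of `asv dev`.
printf 'bench_a ok\nbench_b failed\n' > raw.log

# Annotate failing lines for the Actions UI and keep a log copy via tee.
sed "/failed$/ s/^/##[error]/" raw.log | tee benchmarks.log

# Gate the exit status on whether any failure line was logged;
# the workflow step runs `exit 1` here to fail the job.
if grep "failed" benchmarks.log > /dev/null; then
  status=1
else
  status=0
fi
echo "status=$status"
```

Because `asv dev` itself may exit zero even when individual benchmarks fail, the `grep` gate is what makes the job red, and the preserved `benchmarks.log` artifact is what lets a reviewer see which benchmark broke.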