Add building of pyodide universal wheels #918
Changes from all commits: 6d1546c, 2417282, 42b333e, 29b59bc, cc39616, 5e3bab6, b61d048, 1eb1586, 68e5338, 19cf45e
```diff
@@ -57,6 +57,31 @@ jobs:
           name: wheels-${{ matrix.platform }}
           path: ./wheelhouse/*.whl

+  build_universal_wheel:
+    name: Build universal wheel for Pyodide
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.11'
+
+      - name: Install dependencies
+        run: pip install numpy versioneer wheel
+
+      - name: Build universal wheel
+        run: |
+          PYODIDE=1 python setup.py bdist_wheel --universal
+
+      - uses: actions/upload-artifact@v4
+        with:
+          name: universal_wheel
+          path: dist/*.whl
+
   check_dist:
     name: Check dist
     needs: [make_sdist,build_wheels]
@@ -103,6 +128,11 @@ jobs:
           path: dist
           merge-multiple: true

+      - uses: actions/download-artifact@v4
+        with:
+          name: universal_wheel
+          path: dist
+
       - uses: pypa/[email protected]
         with:
           user: __token__
```
```diff
@@ -1,4 +1,6 @@
 #!/usr/bin/env python
+import os
+
 import numpy
 import versioneer
 from setuptools import Extension, setup
@@ -11,17 +13,26 @@

 NAME: str = dist.get_name()  # type: ignore

+# Check if building for Pyodide
+is_pyodide = os.getenv("PYODIDE", "0") == "1"
+
+if is_pyodide:
+    # For pyodide we build a universal wheel that must be pure-python
+    # so we must omit the cython-version of scan.
+    ext_modules = []
+else:
+    ext_modules = [
+        Extension(
+            name="pytensor.scan.scan_perform",
+            sources=["pytensor/scan/scan_perform.pyx"],
+            include_dirs=[numpy.get_include()],
+        ),
+    ]
```
Comment on lines +19 to +30:

> Depending on an environment variable to conditionally include an extension is very unorthodox, and I don't think that it will be robust in the future. As far as I know, there is no canonical way to conditionally include an extension or not when using pyproject.toml or the older setup.py. However, there is a way to get platform-conditional requirements following this PEP. Maybe we could look at the …

Reply:

> I'm extremely pro-declarative config, so I'd really like to do this better. Unfortunately, after taking a brief look through the environment markers, I don't see any particular variables that can help in our situation. This feels not so surprising to me, since "universal" vs. "platform-specific" seems more like a "build" parameter than an "environment" parameter, making it something to be handled at the build backend (… Also, unfortunately using …
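For context on the environment markers discussed above: PEP 508 markers can be evaluated programmatically, and doing so shows why they cannot express this condition. They describe the *running* interpreter's environment, not a requested build flavor. A sketch using the `packaging` library (assumed to be installed):

```python
from packaging.markers import Marker

# PEP 508 markers are evaluated against the current interpreter, so they
# can express platform conditions such as running under Emscripten/Pyodide:
marker = Marker('platform_system == "Emscripten"')
print(marker.evaluate())  # False unless actually running under Emscripten

# But no marker exists for "this build should produce a universal wheel",
# which is why the PR falls back to a PYODIDE environment variable instead.
print(Marker('python_version >= "3"').evaluate())
```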
```diff
 if __name__ == "__main__":
     setup(
         name=NAME,
         version=versioneer.get_version(),
         cmdclass=versioneer.get_cmdclass(),
-        ext_modules=[
-            Extension(
-                name="pytensor.scan.scan_perform",
-                sources=["pytensor/scan/scan_perform.pyx"],
-                include_dirs=[numpy.get_include()],
-            ),
-        ],
+        ext_modules=ext_modules,
     )
```
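The environment-variable gate in the diff can be exercised on its own. A minimal sketch of the same logic, with a string placeholder standing in for the real `Extension(...)` entry (building the actual Cython module requires the repository checkout):

```python
import os


def select_ext_modules(env=None):
    """Mimic the setup.py logic: return no extensions for a Pyodide build."""
    env = os.environ if env is None else env
    is_pyodide = env.get("PYODIDE", "0") == "1"
    if is_pyodide:
        # Pure-Python universal wheel: omit the compiled scan_perform module.
        return []
    # Placeholder for the Extension(...) entry used in the real setup.py.
    return ["pytensor.scan.scan_perform"]


print(select_ext_modules({"PYODIDE": "1"}))  # → []
print(select_ext_modules({}))                # → ['pytensor.scan.scan_perform']
```

Passing the environment as a parameter (rather than reading `os.environ` directly) keeps the gate easy to test, which partially addresses the robustness concern raised in the review.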