Commits
23 commits
ab509d2
Add support for multi-platform wheels and enhance tarball handling
cxzhong Nov 7, 2025
f6d35f6
Fix dependency specification for jsonschema by replacing 'pyrsistent'…
cxzhong Nov 7, 2025
1ac314f
Refactor tarball.py: remove unused import and improve logging messages
cxzhong Nov 7, 2025
838c4ca
Fix dependency specification for jsonschema by restoring 'pyrsistent'
cxzhong Nov 7, 2025
72a0c1c
Enhance package.py and tarball.py: implement subprocess calls to fetc…
cxzhong Nov 7, 2025
da8cbf3
Enhance tarball selection in package.py: utilize packaging.tags for i…
cxzhong Nov 7, 2025
aba2315
Update dependency specification for rpds_py to include 'packaging' fo…
cxzhong Nov 7, 2025
1479c77
Remove platform-specific wheel checks from Package and Tarball classes
cxzhong Nov 7, 2025
e9a145c
Refactor tarball parsing in package.py to utilize subprocess for pack…
cxzhong Nov 8, 2025
5e3fd07
Use compatible write for old python
cxzhong Nov 11, 2025
dc2ed41
write the comment
cxzhong Nov 11, 2025
b51fe59
Remove 'text=True' from subprocess calls
cxzhong Nov 11, 2025
475db59
Merge branch 'develop' into Add_rpds_py
cxzhong Nov 12, 2025
f102156
Add support for downloading all wheels for distribution in Tarball class
cxzhong Nov 13, 2025
3e7f0ff
Update release.yml
cxzhong Nov 13, 2025
0378ce7
Update release.yml
cxzhong Nov 13, 2025
af01d2c
Apply suggestions from code review
cxzhong Nov 13, 2025
d7ff54e
Refactor tarball selection logic to prioritize wheel tarballs and imp…
cxzhong Nov 13, 2025
5a6baed
Apply suggestions from code review
cxzhong Nov 13, 2025
d52bc6b
Merge branch 'Add_rpds_py' of github.com:cxzhong/sage into Add_rpds_py
cxzhong Nov 13, 2025
448f5ee
Remove redundant json import and add it at the top of the find_tarbal…
cxzhong Nov 13, 2025
7d67167
Reorder import statements in find_tarball_for_platform method for cla…
cxzhong Nov 13, 2025
a032ac8
pass package rpds_py in the disable-notebook
cxzhong Nov 16, 2025
17 changes: 3 additions & 14 deletions .github/workflows/release.yml
@@ -38,24 +38,13 @@ jobs:
run: |
sudo DEBIAN_FRONTEND=noninteractive apt-get update
sudo DEBIAN_FRONTEND=noninteractive apt-get install $(build/bin/sage-get-system-packages debian _bootstrap _prereq)
- name: make dist (--disable-download-from-upstream-url)
id: make_dist
run: |
./bootstrap -D && ./configure --disable-download-from-upstream-url && make dist
env:
MAKE: make -j8
- name: make download (--disable-download-from-upstream-url)
id: make_download
run: |
make -k download DOWNLOAD_PACKAGES=":all: --no-file huge"
env:
MAKE: make -j8
- name: Reconfigure with --enable-download-from-upstream-url
if: (success() || failure()) && (steps.make_dist.outcome != 'success' || steps.make_download.outcome != 'success')
run: |
make configure
./configure
- name: make dist (--enable-download-from-upstream-url)
if: (success() || failure()) && steps.make_dist.outcome != 'success'
id: make_dist
if: success() || failure()
run: |
make dist
env:
2 changes: 1 addition & 1 deletion build/make/Makefile.in
@@ -280,7 +280,7 @@ all-sage-docs: $(SAGE_DOCS_INSTALLED_PACKAGE_INSTS) $(SAGE_DOCS_UNINSTALLED_PAC
# Download all packages which should be inside an sdist tarball (the -B
# option to make forces all targets to be built unconditionally)
download-for-sdist:
+env SAGE_INSTALL_FETCH_ONLY=yes $(MAKE_REC) -B SAGERUNTIME= \
+env SAGE_INSTALL_FETCH_ONLY=yes SAGE_DOWNLOAD_ALL_WHEELS=yes $(MAKE_REC) -B SAGERUNTIME= \
$(SDIST_PACKAGES)

# TOOLCHAIN consists of dependencies determined by configure.
2 changes: 1 addition & 1 deletion build/pkgs/jsonschema/dependencies
@@ -1,4 +1,4 @@
jsonschema_specifications pyrsistent attrs fqdn isoduration jsonpointer uri_template webcolors | $(PYTHON_TOOLCHAIN) $(PYTHON)
jsonschema_specifications pyrsistent rpds_py attrs fqdn isoduration jsonpointer uri_template webcolors | $(PYTHON_TOOLCHAIN) $(PYTHON)

----------
All lines of this file are ignored except the first.
59 changes: 59 additions & 0 deletions build/pkgs/rpds_py/SPKG.rst
@@ -0,0 +1,59 @@
rpds_py: Python bindings to Rust's persistent data structures
==============================================================

Description
-----------

Python bindings to the Rust rpds crate for persistent data structures.

rpds-py provides efficient, immutable data structures including:

* ``HashTrieMap`` - Persistent hash map
* ``HashTrieSet`` - Persistent hash set
* ``List`` - Persistent list with efficient operations

These data structures are backed by Rust implementations for high performance
while maintaining a Pythonic API. They are particularly useful for functional
programming patterns and situations requiring immutable, persistent collections.

The library is used by projects like the referencing library (part of the
Python JSON Schema ecosystem) as a faster alternative to pyrsistent.

License
-------

MIT License

Upstream Contact
----------------

- Author: Julian Berman <[email protected]>
- Home page: https://github.com/crate-py/rpds
- PyPI: https://pypi.org/project/rpds-py/
- Documentation: https://rpds.readthedocs.io/
- Upstream Rust crate: https://github.com/orium/rpds

Dependencies
------------

Python (>= 3.10)

Build dependencies: Rust toolchain (automatically handled by pip when
installing from source)

Special Notes
-------------

This package provides platform-specific binary wheels for multiple Python
versions and platforms:

* Python 3.11, 3.12, 3.13, 3.14 (including free-threaded 3.13t and 3.14t)
* Linux (x86_64, aarch64, musllinux)
* macOS (x86_64, arm64)
* Windows (win32, win_amd64, win_arm64)

The Sage build system automatically selects and downloads the appropriate
wheel for your platform and Python version using the packaging library's compatibility tags.

When building from source, a Rust toolchain is required as rpds-py contains
Rust extensions for performance.
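
The wheel selection described in the Special Notes amounts to comparing each wheel's tags against the interpreter's compatibility tags. The following standalone sketch (not part of this PR; the filenames are illustrative) uses packaging.tags and packaging.utils and mirrors the approach taken in build/sage_bootstrap/package.py further down:

from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

def pick_wheel(filenames):
    # sys_tags() yields the interpreter's compatibility tags, most preferred first.
    priority = {tag: rank for rank, tag in enumerate(sys_tags())}
    best, best_rank = None, float('inf')
    for filename in filenames:
        _name, _version, _build, tags = parse_wheel_filename(filename)
        ranks = [priority[t] for t in tags if t in priority]
        if ranks and min(ranks) < best_rank:
            best, best_rank = filename, min(ranks)
    return best  # None if no candidate wheel is compatible with this interpreter

# Output depends on the platform and Python version running the snippet.
print(pick_wheel([
    "rpds_py-0.28.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
    "rpds_py-0.28.0-cp313-cp313-macosx_11_0_arm64.whl",
]))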
208 changes: 208 additions & 0 deletions build/pkgs/rpds_py/checksums.ini

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions build/pkgs/rpds_py/dependencies
@@ -0,0 +1 @@
pip packaging
1 change: 1 addition & 0 deletions build/pkgs/rpds_py/package-version.txt
@@ -0,0 +1 @@
0.28.0
1 change: 1 addition & 0 deletions build/pkgs/rpds_py/type
@@ -0,0 +1 @@
standard
202 changes: 196 additions & 6 deletions build/sage_bootstrap/package.py
@@ -241,6 +241,159 @@ def tarball_upstream_url(self):
else:
return None

@property
def tarballs_info(self):
"""
Return information about all tarballs for this package.

This supports packages with multiple platform-specific wheels.

OUTPUT:

List of dictionaries, each containing:
- 'tarball': tarball filename pattern
- 'sha256': SHA256 checksum
- 'sha1': SHA1 checksum (optional)
- 'upstream_url': upstream URL pattern
"""
return self.__tarballs_info

def find_tarball_for_platform(self):
"""
Find the appropriate tarball for the current platform.

For packages with multiple platform-specific wheels, this selects
the one matching the current platform and Python version using
the packaging.tags module to ensure compatibility.

Properly handles wheel ABI tags:
- cp313-cp313 (CPython 3.13 specific)
- cp313-cp313t (CPython 3.13 free-threaded/nogil)
- cp313-abi3 (stable ABI, forward compatible)
- py3-none-any (universal pure Python wheel)
- pp39-pypy39_pp73 (PyPy wheels)


OUTPUT:

Dictionary with tarball info, or None if no suitable tarball found.
The dictionary contains the same fields as tarballs_info entries.
"""
import json
import subprocess

from sage_bootstrap.env import SAGE_ROOT

if not self.__tarballs_info:
return None

# If only one tarball, return it
if len(self.__tarballs_info) == 1:
return self.__tarballs_info[0]

# Get compatible tags from Sage's Python using packaging.tags
sage_script = os.path.join(SAGE_ROOT, 'sage')
if not os.path.exists(sage_script):
raise RuntimeError('Sage script not found at: {0}'.format(sage_script))

try:
# Get all compatible tags from Sage's Python
result = subprocess.run(
[sage_script, '-python', '-c',
'import packaging.tags; import json; tags = [str(t) for t in packaging.tags.sys_tags()]; print(json.dumps(tags))'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True,
timeout=10,
cwd=SAGE_ROOT
)
if result.returncode != 0:
raise RuntimeError('Failed to get compatible tags from sage -python: {0}'.format(result.stderr))

compatible_tags = json.loads(result.stdout.strip())

except subprocess.TimeoutExpired:
raise RuntimeError('Timeout while querying compatible tags via ./sage -python')
except Exception as e:
raise RuntimeError('Error querying compatible tags via ./sage -python: {0}'.format(str(e)))

# Convert tags list to a set for fast lookup with priority
# Lower index = higher priority
tag_priority = {tag: idx for idx, tag in enumerate(compatible_tags)}

# Separate wheels from non-wheel tarballs
wheel_tarballs = []
source_tarballs = []
for tarball_info in self.__tarballs_info:
if tarball_info['tarball'].endswith('.whl'):
wheel_tarballs.append(tarball_info)
else:
source_tarballs.append(tarball_info)

# Batch parse all wheel filenames in a single subprocess call
# This avoids subprocess overhead for packages with many wheels
wheel_tags_map = {}
if wheel_tarballs:
try:
wheel_filenames = [info['tarball'] for info in wheel_tarballs]
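# The one-liner below maps each wheel filename passed on the command line to
# the list of its tag strings (or None if parsing fails) and prints it as JSON.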
python_code = 'import packaging.utils as pu,json,sys;d={};[exec(f"try: d[f]=[str(t)for t in pu.parse_wheel_filename(f)[3]]\\nexcept: d[f]=None",{"f":f,"d":d,"pu":pu})for f in sys.argv[1:]];print(json.dumps(d))'
result = subprocess.run(
[sage_script, '-python', '-c', python_code] + wheel_filenames,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True,
timeout=30,
cwd=SAGE_ROOT
)
if result.returncode != 0:
log.warning(f'Failed to parse wheel filenames: {result.stderr}')
else:
wheel_tags_map = json.loads(result.stdout.strip())

except subprocess.TimeoutExpired:
log.warning('Timeout while parsing wheel filenames')
except Exception as e:
log.warning(f'Error parsing wheel filenames: {e}')

# Find the best matching tarball
best_match = None
best_priority = float('inf')

# Check wheel tarballs first (they have higher priority than source)
for tarball_info in wheel_tarballs:
tarball = tarball_info['tarball']
wheel_tags_str = wheel_tags_map.get(tarball)

if wheel_tags_str is None:
log.debug(f'Could not parse wheel filename {tarball}')
continue

# Check each tag in the wheel (multi-platform wheels have multiple tags)
for wheel_tag_str in wheel_tags_str:
if wheel_tag_str in tag_priority:
priority = tag_priority[wheel_tag_str]
if priority < best_priority:
best_match = tarball_info
best_priority = priority
log.debug(f'Found compatible wheel: {tarball} with tag {wheel_tag_str} (priority: {priority})')
break # Found a match, no need to check other tags for this wheel

# If no wheel matched, consider source distributions
if best_match is None and source_tarballs:
best_match = source_tarballs[0]
best_priority = len(compatible_tags) + 1000

if best_match:
log.debug(f'Selected {best_match["tarball"]} with priority {best_priority}')
return best_match

# If no match found, return the first one (backward compatibility)
log.warning(f'No exact platform match found for {self.name}, using first tarball')
return self.__tarballs_info[0]

@property
def tarball_package(self):
"""
@@ -507,24 +660,61 @@ def line_count_file(self, filename):
def _init_checksum(self):
"""
Load the checksums from the appropriate ``checksums.ini`` file

Supports multiple tarballs with format:
tarball=package-VERSION-cp311-cp311-manylinux_2_17_x86_64.whl
sha256=abc123...
upstream_url=https://...
tarball=package-VERSION-cp311-cp311-macosx_11_0_arm64.whl
sha256=def456...
upstream_url=https://...
"""
checksums_ini = os.path.join(self.path, 'checksums.ini')
assignment = re.compile('(?P<var>[a-zA-Z0-9_]*)=(?P<value>.*)')
result = dict()

# Store all entries, supporting multiple values for tarball, sha256, upstream_url
tarballs = []
sha256s = []
sha1s = []
upstream_urls = []

try:
with open(checksums_ini, 'rt') as f:
for line in f.readlines():
match = assignment.match(line)
if match is None:
continue
var, value = match.groups()
result[var] = value

# Collect multiple entries
if var == 'tarball':
tarballs.append(value)
elif var == 'sha256':
sha256s.append(value)
elif var == 'sha1':
sha1s.append(value)
elif var == 'upstream_url':
upstream_urls.append(value)
except IOError:
pass
self.__sha1 = result.get('sha1', None)
self.__sha256 = result.get('sha256', None)
self.__tarball_pattern = result.get('tarball', None)
self.__tarball_upstream_url_pattern = result.get('upstream_url', None)

# Store all tarballs info
self.__tarballs_info = []
for i, tarball in enumerate(tarballs):
info = {
'tarball': tarball,
'sha256': sha256s[i] if i < len(sha256s) else None,
'sha1': sha1s[i] if i < len(sha1s) else None,
'upstream_url': upstream_urls[i] if i < len(upstream_urls) else None,
}
self.__tarballs_info.append(info)
Comment on lines +703 to +710
Copilot AI Nov 13, 2025
Potential bug: The code assumes that checksums.ini entries are ordered such that the i-th sha256 corresponds to the i-th tarball. However, if a line in checksums.ini has only a tarball without a corresponding sha256 (or vice versa), the indices will be misaligned. For example:

tarball=file1.whl
tarball=file2.whl
sha256=abc123

Would incorrectly assign sha256=abc123 to tarball=file1.whl (index 0) instead of file2.whl. Consider grouping entries by blank lines or using a state machine to ensure correct pairing of tarball/sha256/sha1/upstream_url entries.
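
One way to implement the grouping suggested here (a sketch only, not code from this PR): start a new record at each tarball= line, so that the sha256/sha1/upstream_url lines that follow are attached to that tarball instead of being paired by list index.

import re

ASSIGNMENT = re.compile(r'(?P<var>[a-zA-Z0-9_]*)=(?P<value>.*)')

def parse_checksums(lines):
    records = []
    current = None
    for line in lines:
        match = ASSIGNMENT.match(line)
        if match is None:
            continue
        var, value = match.group('var'), match.group('value')
        if var == 'tarball':
            # Each tarball= line opens a new record; later fields belong to it.
            current = {'tarball': value, 'sha256': None, 'sha1': None, 'upstream_url': None}
            records.append(current)
        elif current is not None and var in ('sha256', 'sha1', 'upstream_url'):
            current[var] = value
    return records

With the example above, abc123 would then be attached to file2.whl and file1.whl would simply have no checksum.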

# For backward compatibility, set the first tarball as primary
self.__sha1 = sha1s[0] if sha1s else None
self.__sha256 = sha256s[0] if sha256s else None
self.__tarball_pattern = tarballs[0] if tarballs else None
self.__tarball_upstream_url_pattern = upstream_urls[0] if upstream_urls else None

# Name of the directory containing the checksums.ini file
self.__tarball_package_name = os.path.realpath(checksums_ini).split(os.sep)[-2]
