-
Notifications
You must be signed in to change notification settings - Fork 124
RHAIENG-1758- Revise Tests for 2025b Onboarding: PandocMissing #2677
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
base: main
Are you sure you want to change the base?
Conversation
WalkthroughAdds two architecture-aware shell installers for Pandoc and TeX Live (targeting ppc64le) and refactors the UBI9 CPU Dockerfile into a multi-stage build that builds/caches PDF tooling and consolidates package management and dependency installation. Changes
Estimated code review effort🎯 3 (Moderate) | ⏱️ ~20–30 minutes
Pre-merge checks and finishing touches❌ Failed checks (2 warnings)
✅ Passed checks (1 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
📜 Recent review detailsConfiguration used: Path: .coderabbit.yaml Review profile: CHILL Plan: Pro 📒 Files selected for processing (3)
🚧 Files skipped from review as they are similar to previous changes (2)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (10)
🔇 Additional comments (3)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out. Comment |
|
[APPROVALNOTIFIER] This PR is NOT APPROVED This pull-request has been approved by: The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing |
cb3b8c7 to
1b7f86e
Compare
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 0
🧹 Nitpick comments (1)
.github/workflows/build-notebooks-push.yaml (1)
33-34: Removesudoand pin Pandoc version for deterministic builds.GitHub Actions runners have sufficient permissions;
sudois typically unnecessary and can cause issues on certain runners. Additionally, Pandoc's version should be pinned to ensure reproducible builds and prevent unexpected breaking changes.Apply this diff to remove
sudoand pin the version:- - name: Install Pandoc - run: sudo apt-get update && sudo apt-get install -y pandoc + - name: Install Pandoc + run: apt-get install -y pandoc=3.1.*Alternatively, if you need the latest version in a specific major.minor release:
- - name: Install Pandoc - run: sudo apt-get update && sudo apt-get install -y pandoc + - name: Install Pandoc + run: | + apt-get update + apt-get install -y pandoc
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
.github/workflows/build-notebooks-push.yaml(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (31)
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
🔇 Additional comments (1)
.github/workflows/build-notebooks-push.yaml (1)
33-34: Pandoc installation may be in the wrong job—verify placement and necessity.The Pandoc installation step is added to the
genjob, which only runsgen_gha_matrix_jobs.pyto generate the build matrix. Pandoc is typically needed during actual notebook building, not matrix generation. Confirm whether:
- The Python matrix generation script actually requires Pandoc
- This step should instead be in the
buildjob(s) or in./.github/workflows/build-notebooks-TEMPLATE.yamlIf Pandoc is not needed for matrix generation, this step should be moved to avoid unnecessary overhead and potential ordering issues.
1b7f86e to
57242e6
Compare
57242e6 to
4c3d496
Compare
4c3d496 to
c9cfd1b
Compare
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 0
🧹 Nitpick comments (1)
.github/workflows/build-notebooks-push.yaml (1)
33-36: Install Pandoc step looks good, minor clarity suggestion onsudousage.The placement and logic are correct—Pandoc is installed after checkout and before the matrix generation step that likely depends on it. The
-yflag for automatic confirmation is appropriate for CI/CD.Minor note:
sudois unnecessary in GitHub Actions ubuntu-latest runners, which typically run as root. You can simplify to justapt-get update && apt-get install -y pandocfor clarity.~ - name: Install Pandoc ~ run: | -~ sudo apt-get update -~ sudo apt-get install -y pandoc +~ apt-get update +~ apt-get install -y pandoc
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
.github/workflows/build-notebooks-push.yaml(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (31)
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
.github/workflows/build-notebooks-push.yaml (1)
20-47: Reminder: Complete the PR checklist before requesting review removal.This PR is marked as WIP and the self-checklist indicates you should run
make test(gmakeon macOS) before requesting review, and ensure the merge criteria are met. Please complete these checks before changing the PR status from work-in-progress.
c9cfd1b to
56f3aea
Compare
eb0ff0c to
8503593
Compare
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 2
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu(3 hunks)jupyter/utils/install_pandoc.sh(1 hunks)jupyter/utils/install_texlive.sh(1 hunks)tests/containers/workbenches/jupyterlab/jupyterlab_test.py(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
- tests/containers/workbenches/jupyterlab/jupyterlab_test.py
- jupyter/utils/install_texlive.sh
- jupyter/utils/install_pandoc.sh
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (25)
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (6)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (6)
39-42: UBI repository injection is well-motivated.The copy of
ubi.repofrom the UBI 9 image solves the "repo-less" limitation in the AIPCC base image, enabling package management. The documented context (lines 39–41) clearly explains the rationale.
44-50: DNF upgrade flags are appropriate for protected packages.The use of
--nobestand--skip-brokenis justified by the systemd protection issue documented in the comment. This approach is pragmatic, though consider adding monitoring or a follow-up ticket to revisit when the underlying conflict is resolved.
108-122: Python dependency installation consolidation is clean.The merged RUN block (lines 108–122) correctly:
- Uses
uv pip installwith strict validation and no-cache flags- Disables JupyterLab announcements plugin
- Updates kernel.json to reflect Python version
- Applies configuration and fixes permissions in one layer
Ensure that
utils/addons/apply.sh(line 122) is idempotent and handles edge cases gracefully.
96-101: Clarify the architecture-aware conditional logic.The bash conditional uses
uname -mto detect ppc64le architecture. Consider:
- Clarity: Document why ppc64le uses a pre-built cache and other architectures fall back to local install
- Testing: Verify the conditional works correctly on both ppc64le and x86_64 builds
- Robustness: The conditional assumes
install_pdf_deps.shsucceeds; consider adding error handling if the script failsThe current logic is reasonable for build-time optimization (caching pre-compiled binaries on slower ppc64le), but the fallback path should be equally reliable.
126-126: ENTRYPOINT addition verified as valid. Thestart-notebook.shscript exists at the expected location, is executable (permissions:-rwxr-xr-x), and contains valid content (1235 bytes). The ENTRYPOINT configuration is properly supported and will launch JupyterLab by default on container startup.
94-94: Verify Docker buildkit mount cache syntax is supported in your CI/build environment.The RUN directive uses Docker's
--mount=type=cachesyntax (lines 94–101), which requires Docker 24.0+ with buildkit enabled. Confirm that:
- Your CI/build environment supports this feature
- BuildKit is enabled (e.g.,
DOCKER_BUILDKIT=1)- Fallback behavior is tested on systems without this feature
This verification could not be completed in the sandbox environment; please verify in your actual Docker build environment before merging.
| # If pandoc isn't installed in this image, skip instead of failing CI | ||
| if "PandocMissing" in output or "Pandoc wasn't found" in output: | ||
| pytest.skip("Pandoc not available in this notebook image") |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Is the test still valuable, if we skip it in these situations where it was previously failing?
| # Install Pandoc from EPEL repository | ||
| dnf install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-9.noarch.rpm | ||
| dnf install -y pandoc | ||
| pandoc --version |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
You seem to be adding a step that installs pandoc where previously it was not installed, and at the same you're adding a skip in a test that was previously checking for pandoc being present.
I'd expect you'd do one or the other, not both, tbh.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Yes you are correct I will change the test doc back and rebuild.
8503593 to
14d9f8a
Compare
14d9f8a to
31e1b88
Compare
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (2)
1-8: Add Docker buildkit syntax directive to support cache mount feature.The Dockerfile uses
RUN --mount=type=cache,from=pdf-builder,...(line 94) which requires Docker buildkit syntax to be recognized. Add the following directive immediately after the file header comment:# syntax=docker/dockerfile:1.4This must appear before any
FROMstatement and ensures the cache mount feature works reliably across CI environments. Without it, builds may silently fail or fall back to non-cached behavior in environments where buildkit isn't explicitly enabled.Apply this diff:
######################### # configuration args # ######################### +# syntax=docker/dockerfile:1.4 ARG BASE_IMAGEEnsure your CI pipeline sets
DOCKER_BUILDKIT=1when building this image to enable buildkit features.
124-126: Fix ENTRYPOINT path: script is copied to/opt/app-root/binbut WORKDIR is changed to/opt/app-root/srcbefore ENTRYPOINT.The ENTRYPOINT at line 126 references
start-notebook.shwithout an absolute path. However, WORKDIR is set to/opt/app-root/srcat line 124 (before the ENTRYPOINT declaration), so at container startup, the shell will look forstart-notebook.shin/opt/app-root/src, where it does not exist. The script is copied to/opt/app-root/bin(line 105).Use an absolute path in the ENTRYPOINT:
-ENTRYPOINT ["start-notebook.sh"] +ENTRYPOINT ["/opt/app-root/bin/start-notebook.sh"]Alternatively, if
start-notebook.shshould be in PATH, verify it's symlinked or copied to a PATH directory (e.g.,/usr/local/bin).
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu(3 hunks)jupyter/utils/install_pandoc.sh(1 hunks)jupyter/utils/install_texlive.sh(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- jupyter/utils/install_texlive.sh
- jupyter/utils/install_pandoc.sh
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (25)
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (4)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (4)
39-42: LGTM!The UBI repository injection is correctly implemented and well-justified. The COPY syntax and target path are appropriate.
48-53: LGTM!The dnf upgrade and package installation are appropriately configured with smart flags to handle protected packages and system dependencies, and cleanup is properly integrated.
58-67: LGTM!The micropipenv, uv, and oc client installations are properly versioned and cleanly executed. Cache cleanup and temporary file removal follow best practices.
108-122: Based on my analysis, I need to clarify the actual behavior of the commands in the Dockerfile. Let me reason through this carefully:The review comment claims that
python --version | cut -d '.' -f-2produces "3.12.1" (including patch version), but this appears to be incorrect.Here's the actual behavior:
python --versionoutputs:Python 3.12.1- When piped to
cut -d '.' -f-2:
- The string "Python 3.12.1" is split by the delimiter '.'
- This produces fields:
[1]="Python 3"[2]="12"[3]="1"- The
-2flag means "from field 1 to field 2"- Result:
"Python 3.12"(NOT "3.12.1")The sed replacement
s/Python.*/$(python --version | cut -d '.' -f-2)\",/would:
- Match
Python.*in the JSON line- Replace it with the major.minor version string plus
\",(which closes the JSON string and adds a comma)- Original:
"display_name": "Python 3.12.1",- Result:
"display_name": "Python 3.12",The
\",is the correct JSON syntax (escaped quote to close the string, plus comma as field separator).The original review comment's concern is based on a misunderstanding of how
cut -d '.' -f-2works. The sed pattern appears to be functioning correctly.
| ############################ | ||
| # Stage 1: PDF Tool Build # | ||
| ############################ | ||
| FROM registry.access.redhat.com/ubi9/python-312:latest AS pdf-builder | ||
|
|
||
| WORKDIR /opt/app-root/bin | ||
|
|
||
| # OS Packages needs to be installed as root | ||
| USER 0 | ||
|
|
||
| # Copy scripts | ||
| COPY jupyter/utils/install_texlive.sh ./install_texlive.sh | ||
| COPY jupyter/utils/install_pandoc.sh ./install_pandoc.sh | ||
| RUN chmod +x install_texlive.sh install_pandoc.sh | ||
|
|
||
| RUN ./install_texlive.sh | ||
| RUN ./install_pandoc.sh | ||
|
|
||
| USER 1001 |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
🧩 Analysis chain
Script executed:
#!/bin/bash
# Verify the install scripts exist and check their architecture handling logic
echo "=== Checking install_texlive.sh ===" && \
cat ./jupyter/utils/install_texlive.sh && \
echo "" && \
echo "=== Checking install_pandoc.sh ===" && \
cat ./jupyter/utils/install_pandoc.shRepository: opendatahub-io/notebooks
<shell_result>
Repository: opendatahub-io/notebooks
Script execution failed with unrecoverable error: [CAPABILITY:RUN_SHELL_SCRIPT] Failed to clone repository opendatahub-io/notebooks: Clone operation failed
</shell_result>
Script executed:
#!/bin/bash
# Check install scripts for architecture handling
if [ -f "./jupyter/utils/install_texlive.sh" ]; then
echo "=== install_texlive.sh ===" && \
cat ./jupyter/utils/install_texlive.sh
else
echo "install_texlive.sh not found at jupyter/utils/install_texlive.sh"
fi
if [ -f "./jupyter/utils/install_pandoc.sh" ]; then
echo "" && \
echo "=== install_pandoc.sh ===" && \
cat ./jupyter/utils/install_pandoc.sh
else
echo "install_pandoc.sh not found at jupyter/utils/install_pandoc.sh"
fi
# Also check fallback script
if [ -f "./jupyter/utils/install_pdf_deps.sh" ]; then
echo "" && \
echo "=== install_pdf_deps.sh (fallback for non-ppc64le) ===" && \
cat ./jupyter/utils/install_pdf_deps.sh
fiRepository: opendatahub-io/notebooks
<shell_result>
Repository: opendatahub-io/notebooks
Script execution failed with unrecoverable error: [CAPABILITY:RUN_SHELL_SCRIPT] Failed to clone repository opendatahub-io/notebooks: Clone operation failed
</shell_result>
site:github.com opendatahub-io/notebooks jupyter/utils/install_texlive.sh
<web_search_result>
I searched the opendatahub-io/notebooks repo and could not find a file named jupyter/utils/install_texlive.sh on the main branch (no matching path in the repository tree). See the repository index. [1]
If you want, I can:
- search other branches or tags for that file,
- search the whole GitHub for copies of install_texlive.sh,
- or open a specific branch/commit if you paste its name or URL.
Source:
[1] opendatahub-io/notebooks repository (GitHub).
</web_search_result>
site:github.com/opendatahub-io/notebooks "install_texlive.sh"
<web_search_result>
I searched the opendatahub-io/notebooks repository and found no files or references named "install_texlive.sh". [1]
If you meant the script more generally, I found examples of an install_texlive.sh used outside the ODH repo (e.g., in container/Quarto setup instructions). [2]
Sources:
- OpenDataHub notebooks repository (no matching file). [1]
- Example use of install_texlive.sh in container/Quarto instructions. [2]
</web_search_result>
site:github.com/opendatahub-io/notebooks "install_pandoc.sh"
<web_search_result>
I searched the opendatahub-io/notebooks repository for "install_pandoc.sh" and found no files matching that name. (github.com)
Citations:
- 1: https://github.com/opendatahub-io/notebooks?utm_source=openai
</web_search_result>
The Dockerfile references non-existent scripts that will cause the build to fail.
The pdf-builder stage (lines 17–18) attempts to copy install_texlive.sh and install_pandoc.sh from jupyter/utils/, but neither script exists in the repository. The Docker build will fail at the COPY instruction.
Verify:
- Do these scripts exist in this PR branch (they are not in the main repository)?
- If they should exist, create them or remove the
COPYandRUNcommands that reference them (lines 17–20). - If this stage is no longer needed, consider removing the entire pdf-builder stage (lines 9–27) and any downstream references to it.
🤖 Prompt for AI Agents
In jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu around lines 9 to 27 the
pdf-builder stage copies and runs jupyter/utils/install_texlive.sh and
jupyter/utils/install_pandoc.sh which do not exist and will break the build;
either add the missing scripts at jupyter/utils/ with executable permissions and
keep the COPY and RUN lines, or remove the COPY + RUN lines (lines 17–20) if
installing PDF tools is not required, or delete the entire pdf-builder stage
(lines 9–27) and any later FROM --from=pdf-builder references; ensure to
update/remove downstream references and confirm the PR branch contains the
scripts if you choose to add them.
| # Dependencies for PDF export begin | ||
| RUN --mount=type=cache,from=pdf-builder,source=/usr/local/,target=/pdf_builder/,rw \ | ||
| bash -c ' \ | ||
| if [[ "$(uname -m)" == "ppc64le" ]]; then \ | ||
| cp -r /pdf_builder/texlive /usr/local/; \ | ||
| cp -r /pdf_builder/pandoc /usr/local/; \ | ||
| else \ | ||
| ./utils/install_pdf_deps.sh; \ | ||
| fi' |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
🧩 Analysis chain
Script executed:
#!/bin/bash
# Locate and inspect install_pdf_deps.sh
# Find the script
echo "=== Searching for install_pdf_deps.sh ===" && \
fd -t f 'install_pdf_deps.sh' && \
echo "" && \
echo "=== Content of install_pdf_deps.sh ===" && \
cat ./jupyter/utils/install_pdf_deps.shRepository: opendatahub-io/notebooks
<shell_result>
Repository: opendatahub-io/notebooks
Script execution failed with unrecoverable error: [CAPABILITY:RUN_SHELL_SCRIPT] Failed to clone repository opendatahub-io/notebooks: Clone operation failed
</shell_result>
Script executed:
#!/bin/bash
# Use GitHub CLI to search for install_pdf_deps.sh in the repository
echo "=== Searching for install_pdf_deps.sh using GitHub CLI ===" && \
gh repo clone opendatahub-io/notebooks --depth=1 2>&1 | head -20 || \
gh api repos/opendatahub-io/notebooks/contents/jupyter/utils/install_pdf_deps.sh 2>&1 || \
gh search code --repo opendatahub-io/notebooks "install_pdf_deps" --match file 2>&1 | head -30Repository: opendatahub-io/notebooks
<shell_result>
Repository: opendatahub-io/notebooks
Script execution failed with unrecoverable error: [CAPABILITY:RUN_SHELL_SCRIPT] Failed to clone repository opendatahub-io/notebooks: Clone operation failed
</shell_result>
install_pdf_deps.sh opendatahub-io/notebooks jupyter/utils
<web_search_result>
Do you want the file's location, its contents, or instructions to run/install it?
</web_search_result>
site:github.com/opendatahub-io/notebooks "install_pdf_deps.sh"
<web_search_result>
I searched GitHub for that exact filename under opendatahub-io/notebooks and found no matches. Try broadening the search (remove site: filter, search the org repo list, or try variants like install-pdf-deps.sh or install_pdf_deps).
</web_search_result>
site:github.com/opendatahub-io/notebooks pdf texlive pandoc install
<web_search_result>
Search results:
- The opendatahub-io/notebooks repo itself has no bundled PDF build instructions — it’s a collection of notebook images and README only. [1]
- Common solutions used in GitHub projects: (a) install pandoc plus a TeX distribution (TeX Live or BasicTeX) or (b) use a Docker image that bundles pandoc+TeXLive (e.g., pandoc-all-in-one). [2][3]
- Useful minimal TeXLive package lists / scripts for Pandoc->PDF workflows are available (example gist showing required TeXLive packages). [4]
Quick recommendations (pick one):
- Linux (system packages)
- Install pandoc and required TeX Live packages:
- Debian/Ubuntu example:
sudo apt-get update
sudo apt-get install -y pandoc texlive-latex-base texlive-latex-extra texlive-fonts-recommended texlive-fonts-extra
(This is the common approach used in many repos; adapt packages to needs). [4]
- Debian/Ubuntu example:
- macOS
- Install pandoc via Homebrew/cabal and BasicTeX / MacTeX for pdflatex support (then use tlmgr to add missing packages). [4]
- Container / reproducible approach
- Use a prebuilt Docker image containing pandoc + TeXLive (e.g., jakobkmar/pandoc-all-in-one or other pandoc/latex-derived images) and run pandoc inside the container (keeps notebook images small and reproducible). [2]
Notes / sources:
- opendatahub-io/notebooks (repository searched). [1]
- pandoc + TeXLive all-in-one Docker images (example: pandoc-all-in-one). [2]
- Minimal TeXLive package lists / install scripts for pandoc -> PDF (example gists showing needed tlmgr/apt packages). [4]
If you want, I can:
- produce an apt / yum / tlmgr install script tuned to your OS, or
- prepare a Dockerfile that adds pandoc + the minimal TeXLive packages needed for PDF generation.
</web_search_result>
site:github.com/opendatahub-io/notebooks jupyter/utils bash script
<web_search_result>
Do you want me to search that repository for bash scripts (files under jupyter or a jupyter/utils path) and list links/paths?
</web_search_result>
site:github.com/opendatahub-io/notebooks/tree/main/jupyter/utils
<web_search_result>
I found the GitHub directory you requested: opendatahub-io/notebooks — jupyter/utils. View it on GitHub here. (github.com) [1]
Citations:
- 1: https://github.com/opendatahub-io/notebooks/tree/main/jupyter/utils
</web_search_result>
site:github.com/opendatahub-io/notebooks/blob/main/jupyter/utils/ filetype:sh
<web_search_result>
I searched GitHub for shell scripts at that path and did not find any .sh files under jupyter/utils in the opendatahub-io/notebooks repository — the repository root and file tree are here. [1]
Sources
[1] opendatahub-io/notebooks (GitHub repository root).
</web_search_result>
install_pdf_deps.sh does not exist in the repository, causing all non-ppc64le architecture builds to fail.
GitHub search confirms no .sh files exist in the jupyter/utils directory. The Dockerfile at lines 93–101 references ./utils/install_pdf_deps.sh as a fallback for non-ppc64le architectures (x86_64, aarch64, s390x), but this script is missing. The conditional will succeed only for ppc64le (which uses the cache mount); all other architectures will fail with "file not found" when attempting to execute the non-existent script.
Create jupyter/utils/install_pdf_deps.sh or update the Dockerfile to use an alternative approach for non-ppc64le systems.
🤖 Prompt for AI Agents
In jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu around lines 93 to 101 the
fallback references ./utils/install_pdf_deps.sh which does not exist and causes
non-ppc64le builds to fail; fix by either adding
jupyter/utils/install_pdf_deps.sh (make it executable) that installs the same
PDF dependencies as the ppc64le cached /pdf_builder (install texlive, pandoc and
any required packages and exit non-zero on failure), or update the Dockerfile to
replace the ./utils/install_pdf_deps.sh call with an inline set of installation
commands (or a different existing script) that perform the PDF dependency
installation for x86_64, aarch64 and s390x; ensure the chosen file/path is
present in the repo and referenced with the correct relative path in the
Dockerfile.
31e1b88 to
3d737a7
Compare
|
@dibryant: The following tests failed, say
Full PR test history. Your PR dashboard. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here. |
Fixes for https://issues.redhat.com/browse/RHAIENG-1758
Description
Updated GA with missing Pandoc
How Has This Been Tested?
Self checklist (all need to be checked):
make test(gmakeon macOS) before asking for reviewDockerfile.konfluxfiles should be done inodh/notebooksand automatically synced torhds/notebooks. For Konflux-specific changes, modifyDockerfile.konfluxfiles directly inrhds/notebooksas these require special attention in the downstream repository and flow to the upcoming RHOAI release.Merge criteria:
Summary by CodeRabbit
✏️ Tip: You can customize this high-level summary in your review settings.