[WIP] hermetic build: Non binary #858
base: main
Conversation
Walkthrough

Adds a new PipelineRun parameter.

Sequence Diagram(s): skipped — criteria for generating a sequence diagram are not met.

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Pre-merge checks and finishing touches: ❌ 1 failed check (warning), ✅ 2 passed checks
Actionable comments posted: 2
🧹 Nitpick comments (3)
README.md (1)
18-94: Consider fixing markdown list indentation for consistency.

Markdownlint reports unordered list indentation inconsistencies in the table of contents. While not critical, consistent formatting improves maintainability. Expected indentation is 2 spaces per level, but some entries use 4. You can run `markdownlint-cli2 --fix README.md` to correct these automatically, or adjust manually:

- Lines 19, 55-57, 90-94: reduce indentation to match the expected levels (2/4/6 spaces instead of 4/8/12)
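As an illustration, MD007-compliant nesting at 2 spaces per level looks like the following (the entry names here are hypothetical, not taken from the actual README):

```markdown
<!-- 2 spaces per nesting level, as MD007 expects -->
- Installation
  - Prerequisites
    - Python and uv
- Configuration
  - LLM providers
```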
scripts/remove_torch_deps.sh (2)
6-6: Consider adding `set -u` for stricter error handling.

While `set -e` is present, adding `set -u` would also catch references to undefined variables. Apply this diff:

```diff
-set -e
+set -eu
```
16-19: Consider warning if the backup already exists.

The script silently overwrites any existing backup file. Consider adding a check to warn users. Apply this diff:

```diff
 # Create a backup
 BACKUP_FILE="${INPUT_FILE}.backup"
+if [ -f "$BACKUP_FILE" ]; then
+    echo "Warning: Overwriting existing backup: $BACKUP_FILE"
+fi
 cp "$INPUT_FILE" "$BACKUP_FILE"
 echo "Created backup: $BACKUP_FILE"
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `uv.lock` is excluded by `!**/*.lock`

📒 Files selected for processing (12)

- .tekton/lightspeed-stack-pull-request.yaml (2 hunks)
- .tekton/lightspeed-stack-push.yaml (2 hunks)
- Containerfile (1 hunks)
- Makefile (2 hunks)
- README.md (9 hunks)
- pyproject.toml (1 hunks)
- requirements.hermetic.txt (1 hunks)
- requirements.torch.txt (1 hunks)
- rpms.in.yaml (1 hunks)
- rpms.lock.yaml (1 hunks)
- scripts/remove_torch_deps.sh (1 hunks)
- ubi.repo (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
pyproject.toml: Configure pylint with `source-roots = "src"`
Exclude `src/auth/k8s.py` from pyright type checking
Files:
pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile, pyproject.toml
📚 Learning: 2025-08-18T11:45:59.961Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:77-77
Timestamp: 2025-08-18T11:45:59.961Z
Learning: torch==2.7.1 is available on PyPI and is a valid version that can be used in dependency specifications.
Applied to files:
requirements.torch.txt
📚 Learning: 2025-08-18T10:56:55.349Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:0-0
Timestamp: 2025-08-18T10:56:55.349Z
Learning: The lightspeed-stack project intentionally uses a "generic image" approach, bundling many dependencies directly in the base runtime image to work for everyone, rather than using lean base images with optional dependency groups.
Applied to files:
README.md
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use Python package manager `uv` with `uv run` prefix for all development commands
Applied to files:
requirements.hermetic.txt
🪛 markdownlint-cli2 (0.18.1)
README.md
19-19: Unordered list indentation
Expected: 2; Actual: 4
(MD007, ul-indent)
55-55: Unordered list indentation
Expected: 6; Actual: 12
(MD007, ul-indent)
56-56: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
57-57: Unordered list indentation
Expected: 6; Actual: 12
(MD007, ul-indent)
90-90: Unordered list indentation
Expected: 2; Actual: 4
(MD007, ul-indent)
91-91: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
92-92: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
93-93: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: build-pr
- GitHub Check: e2e_tests (ci)
- GitHub Check: e2e_tests (azure)
🔇 Additional comments (19)
pyproject.toml (1)
90-94: LGTM! Grouped PyTorch source configuration.

The addition of `group = "llslibdev"` correctly restricts the PyTorch CPU index to the llslibdev dependency group, aligning with the hermetic build strategy introduced in this PR.

requirements.torch.txt (1)

1-24: LGTM! Autogenerated PyTorch CPU requirements for hermetic builds.

This file correctly provides a pinned torch==2.7.1+cpu with integrity hashes for reproducible, hermetic builds across multiple platforms.
README.md (1)

940-985: LGTM! Comprehensive Konflux hermetic build documentation.

The new Konflux section clearly documents the hermetic build workflow and the dependency update process, and provides actionable commands for maintaining lock files. This aligns well with the hermetic build infrastructure introduced in this PR.

ubi.repo (1)

1-62: LGTM! Standard UBI 9 repository configuration.

This YUM repository configuration correctly defines the BaseOS, AppStream, and CodeReady Builder repositories for UBI 9, with appropriate GPG verification enabled and debug/source repos disabled by default.

rpms.in.yaml (1)

1-4: LGTM! Clean RPM dependency specification for multi-arch builds.

The package list (gcc, jq, patch) and architecture configuration (x86_64, aarch64) align well with the hermetic build strategy and multi-platform support introduced in this PR.
rpms.lock.yaml (1)
1-355: LGTM! Autogenerated RPM lockfile for hermetic builds.

This lockfile provides reproducible, verified package resolution for multi-arch builds. It should be regenerated using `rpm-lockfile-prototype` when dependencies change, as documented in the README.

.tekton/lightspeed-stack-push.yaml (4)
27-30: LGTM! ARM64 platform support added for multi-arch builds.

The addition of `linux-c6gd2xlarge/arm64` enables ARM64 builds alongside x86_64, aligning with the multi-architecture lockfiles (rpms.lock.yaml, requirements.aarch64.txt) introduced in this PR.

35-36: LGTM! Hermetic mode enabled for air-gapped builds.

Setting `hermetic: 'true'` enforces network isolation during builds, ensuring reproducibility with prefetched dependencies.

632-634: Extended timeouts for hermetic multi-arch builds.

The 4-hour timeout accommodates the increased duration of hermetic builds with dependency prefetching, multi-architecture compilation, and comprehensive security scanning. This aligns with the expanded build complexity introduced in this PR.

31-36: Verify all referenced requirements files exist before merging.

The prefetch configuration references four requirements files (requirements.x86_64.txt, requirements.aarch64.txt, requirements.hermetic.txt, requirements.torch.txt) that must exist at pipeline execution time. Confirm these files are present in the repository or are generated by the build system (e.g., via `make konflux-requirements` or a similar build target) before this configuration is used.

Containerfile (2)
25-25: LGTM!

The glob pattern `requirements.*.txt` correctly matches all platform-specific and hermetic requirements files needed for the multi-architecture build.

32-36: Verify requirements files exist for all target architectures.

The hermetic build path uses `$(uname -m)` to select architecture-specific requirements files. Ensure that `requirements.x86_64.txt`, `requirements.aarch64.txt`, and `requirements.torch.txt` exist in the repository root for all supported build architectures.

.tekton/lightspeed-stack-pull-request.yaml (3)
30-33: LGTM!

The addition of `linux-c6gd2xlarge/arm64` correctly adds ARM64 platform support for multi-architecture builds. The c6gd2xlarge instance type is appropriate for ARM64 builds.

635-637: Verify the 4-hour timeout is appropriate.

The pipeline and task timeouts are set to 4 hours, which is significantly longer than typical CI builds. While multi-arch builds with hermetic prefetching can be time-consuming, verify this duration is necessary and consider whether it can be optimized. Monitor actual build times to determine if this timeout can be reduced in future iterations.

36-37: Verify `allow_binary: false` compatibility with PyTorch wheels.

The `prefetch-input` configuration sets `"allow_binary": "false"` for pip dependencies. If this parameter restricts downloads to source distributions only, it may conflict with the PyTorch wheels referenced in requirements.torch.txt. Clarify whether cachi2's `allow_binary` setting permits wheel downloads, or whether the configuration should be adjusted to `"allow_binary": "true"` or removed to allow mixed binary/source packages for this use case.

scripts/remove_torch_deps.sh (1)
48-55: LGTM!

The atomic file replacement using a temporary file and the final diff output are well-implemented. The `|| true` on the diff command correctly prevents the script from failing when differences are found.

Makefile (3)
7-8: No action required. PyTorch version 2.7.1 is available on PyPI and supports CPU wheels.
104-109: Verify duplicate target definitions in the Makefile.

The review comment references a potential duplicate `konflux-requirements` target definition, but the provided code snippet shows only one definition (lines 104-109). Manual verification is needed to confirm whether duplicate target definitions exist elsewhere in the Makefile, as repository access for automated verification was unavailable.

104-109: The `--torch-backend cpu`, `--emit-index-url`, and `--no-deps` flags are all valid `uv pip compile` options.

`--torch-backend` is a documented uv feature for specifying PyTorch backend indexes (cpu, cu128, etc.). `--emit-index-url` is documented for including index URLs in generated requirements. `--no-deps` is a standard pip compatibility flag supported by uv.

The design here (generating platform-specific requirements with full dependencies, removing torch from those files, then generating a separate torch-only file with `--no-deps`) is intentional and correct. PyTorch dependencies not needed for the platform builds are isolated in the separate requirements.torch.txt file.

Likely an incorrect or invalid review comment.
```awk
# If we are in torch section and line starts with 4 spaces, skip it
in_torch_section == 1 && /^    / {
    next # Skip this line
}
# If we are in torch section and line does NOT start with 4 spaces, exit torch section
in_torch_section == 1 && !/^    / {
    in_torch_section = 0
    # Fall through to print this line
}
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Potential issue with empty lines in torch section.
The awk script exits the torch section when encountering a line that doesn't start with 4 spaces. This includes empty lines, which might appear between the torch== line and its hash lines in some requirements files.
Consider updating the pattern to handle empty lines:
```diff
 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next # Skip this line
 }
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }
```
🤖 Prompt for AI Agents
In scripts/remove_torch_deps.sh around lines 33 to 42, the awk logic treats any
line that doesn't start with four spaces (including blank lines) as the end of
the torch section; update the condition so blank lines are ignored (i.e., do not
terminate the section). Change the exit check to only leave the torch section
when the line is non-empty and does not start with four spaces (for example: if
in_torch_section == 1 && $0 !~ /^ / && $0 !~ /^$/ then set in_torch_section =
0), and ensure blank lines inside the torch block are either skipped or allowed
to fall through as part of the section so hash lines remain associated with the
torch entry.
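To see the failure mode and the fix concretely, here is a small self-contained demo (the sample file contents are made up; the awk program mirrors the suggested diff, with a blank line deliberately placed inside the torch block):

```shell
#!/bin/sh
# Build a sample requirements file with a blank line inside the torch block.
workdir=$(mktemp -d)
cat > "$workdir/reqs.txt" <<'EOF'
numpy==2.2.6 \
    --hash=sha256:aaaa
torch==2.7.1+cpu \
    --hash=sha256:bbbb

    --hash=sha256:cccc
requests==2.32.0 \
    --hash=sha256:dddd
EOF

# Fixed state machine: blank lines inside the torch block are skipped,
# so the orphaned hash line (cccc) is removed along with the torch pin.
filtered=$(awk '
/^torch==/ { in_torch_section = 1; next }
in_torch_section == 1 && /^    / { next }
in_torch_section == 1 && /^$/ { next }
in_torch_section == 1 && !/^    / && !/^$/ { in_torch_section = 0 }
in_torch_section == 0 { print }
' "$workdir/reqs.txt")
printf '%s\n' "$filtered"
rm -rf "$workdir"
```

Without the `/^$/` rules, the blank line would end the torch section and the `sha256:cccc` continuation line would survive as an orphan.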
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
scripts/remove_torch_deps.sh (1)
33-42: Apply the empty-line fix to both torch and faiss-cpu removal sections.

The awk state machine uses `!/^    /` to exit each section (lines 39 and 67), but this pattern also matches empty lines, causing the section to exit prematurely when blank lines appear between package declarations and their hash continuation lines. Subsequent hash lines are then left unremoved.

This issue was flagged in a prior review but not resolved. The new faiss-cpu removal logic (lines 50-76) replicates the same pattern.

Apply this fix to the torch section (around line 39):

```diff
 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next # Skip this line
 }
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }
```

Apply the same fix to the faiss-cpu section (around line 67).
🧹 Nitpick comments (1)
Makefile (1)
9-10: Align the faiss-cpu segregation strategy with the documented dependency-control approach.

Based on learnings from prior PRs, `faiss-cpu` was intentionally pinned as a top-level dependency to maintain control over the dependency graph and avoid version conflicts when bundling ML/LLM packages. This change segregates faiss-cpu into `requirements.binary.txt`, which may affect dependency resolution and conflict management. Ensure that this segregation is intentional and documented in ADRs or migration notes, and verify that the dependency graph remains stable after removal from the standard requirements files.
Would you like me to generate a verification script to analyze the dependency graph impact, or help document the rationale for this segregation in an ADR?
Also applies to: 111-112
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (8)
- .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
- .tekton/lightspeed-stack-push.yaml (1 hunks)
- Containerfile (1 hunks)
- Makefile (2 hunks)
- pyproject.toml (1 hunks)
- requirements.aarch64.txt (0 hunks)
- requirements.x86_64.txt (0 hunks)
- scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
- requirements.aarch64.txt
- requirements.x86_64.txt
🚧 Files skipped from review as they are similar to previous changes (4)
- Containerfile
- pyproject.toml
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
Makefile
🔇 Additional comments (4)
scripts/remove_torch_deps.sh (2)
50-76: Verify consistency with learnings on faiss-cpu pinning.

The script now removes faiss-cpu as a top-level dependency. According to learnings from prior work in this repo, faiss-cpu is intentionally pinned as a top-level dependency to maintain control over the dependency graph and avoid version conflicts in ML/LLM tooling. Ensure this removal aligns with broader changes in the build pipeline that handle faiss-cpu separately (as mentioned in the PR objectives).

79-85: Validate temporary file cleanup and backup retention.

The script chains two awk passes through temporary files (.tmp and .tmp2) and then replaces the original. Cleanup and backup handling appear correct: the script removes .tmp after chaining, preserves the backup, and runs a final diff for verification. No concerns here.
Makefile (2)
112-112: Verify pybuild-deps is available and correctly configured.

The new `pybuild-deps compile` step assumes the tool is installed and available in the `uv` environment. Ensure that this tool is:
- Added to the project dependencies or dev dependencies
- Compatible with the compiled requirements files from lines 106–111
- Properly documented so developers understand the build-time dependency workflow
9-10: No action needed. `FAISS_VERSION` 1.13.0 is confirmed as the latest stable release as of December 2025, making the version choice correct and appropriate.
Makefile (outdated)

```makefile
	./scripts/remove_torch_deps.sh requirements.x86_64.txt
	./scripts/remove_torch_deps.sh requirements.aarch64.txt
	echo "torch==${TORCH_VERSION}" | uv pip compile - -o requirements.torch.txt --generate-hashes --python-version 3.12 --torch-backend cpu --emit-index-url --no-deps --index-url https://download.pytorch.org/whl/cpu
	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal
```
🧩 Analysis chain
🌐 Web query:
uv pip compile --universal flag documentation behavior
💡 Result:
Short answer — uv pip compile --universal tells uv to perform a universal (platform‑independent) resolution and emit a requirements-style output with environment markers (and possibly multiple entries per package for different platforms). Universal resolution is more constrained than platform-specific resolution, may include markers on lines, and can produce multiple versions/URLs for the same project to cover different platforms. Known caveats: it can be stricter (fail when a package’s python-requires is incompatible across targets) and there are open/closed issues about marker handling and python-version interactions. (glama.ai)
If you want docs/links to read: uv universal resolution docs and related GitHub issues above. (glama.ai)
Citations:
- 1: https://glama.ai/mcp/servers/%40KemingHe/python-dependency-manager-companion-mcp-server/blob/3fd4939760278d56c4913802df692bb104bbe59e/src/assets/uv/concepts/resolution.md?utm_source=openai
- 2: https://glama.ai/mcp/servers/%40KemingHe/python-dependency-manager-companion-mcp-server/blob/3fd4939760278d56c4913802df692bb104bbe59e/src/assets/uv/concepts/resolution.md?utm_source=openai
🌐 Web query:
faiss-cpu wheel distribution universal wheels support
💡 Result:
Short answer: faiss-cpu does NOT provide "universal" (pure‑Python none‑any / py2.py3) wheels. The project publishes platform‑specific binary wheels (manylinux/musllinux macOS Windows; CPython ABI tags or abi3 where applicable) built with cibuildwheel, so you must install the wheel that matches your Python version and platform (or build from source). (pypi.org)
Citations:
🌐 Web query:
pip compile --universal vs --python-version conflict
💡 Result:
Short answer
--universal tells pip-compile to produce a requirements file that is valid across Python versions by preserving PEP 508 markers instead of resolving/installing for the current interpreter. --python-version (or using a different Python to run pip-compile) affects which Python environment the resolver actually uses when picking compatible package versions. That mismatch can make the resolver either (a) emit markers that will never be installable for the specified python version, or (b) fail because a dependency's python_requires makes no compatible release available for the environment used to resolve. [1][2][3]
Why the conflict happens (concise)
- pip-compile resolves dependencies using the Python interpreter/environment it runs in (and pip-tools documents that results can differ by OS/Python). If you ask for a “universal” output but you run resolution with a different Python version, pip-compile will either (i) keep markers (so the produced lines are conditional and might not actually install under the target interpreter), or (ii) fail if the resolver finds no compatible package versions for the interpreter it used. [1][2][3]
What to do
- Run pip-compile with the actual target Python version (e.g., run it inside that interpreter/venv). This avoids the mismatch. [1][3]
- If you need one file for many Pythons, use a tool that compiles per-Python and merges results (e.g., pip-compile-universal) or generate separate requirements files per Python and merge carefully. [4][2]
- Use explicit PEP 508 environment markers in your inputs so pip-compile can keep correct conditional requirements rather than trying to force-resolve incompatible releases. [1]
Sources
- pip-tools documentation (behavior and cross-environment note). [1]
- pip-tools GitHub issues describing lack of support for pip "--python-version" and cross-compile problems. [2][3]
- pip-compile-universal (tool that compiles per-Python and merges). [4]
Remove --universal flag; faiss-cpu requires platform-specific wheels tied to Python version.
The --universal flag is incompatible with faiss-cpu, which distributes only platform-specific binary wheels (manylinux/musllinux/macOS/Windows) built for specific Python versions and platforms. Using --universal alongside --python-version 3.12 creates a semantic conflict: universal resolution produces cross-Python environment markers, but faiss-cpu binary wheels are strictly version-locked and platform-specific. Use only --python-version 3.12 without --universal to ensure the resolver generates the correct binary wheel specification.
🤖 Prompt for AI Agents
In Makefile around line 111, the command uses the incompatible --universal flag
with faiss-cpu; remove the --universal flag from the uv pip compile invocation
and keep --python-version 3.12 (and other flags) so the resolver produces
platform-specific binary wheel spec for faiss-cpu.
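Concretely, the suggested change amounts to dropping the one flag from the faiss-cpu line of the Makefile excerpt quoted earlier (line content taken from that excerpt):

```diff
-echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal
+echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps
```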
120c2c6 to b65f9d5
Actionable comments posted: 2
🧹 Nitpick comments (1)
scripts/remove_torch_deps.sh (1)
50-85: Refactor duplicated removal logic into a reusable function.

The torch removal (lines 22-48) and faiss removal (lines 50-76) follow nearly identical patterns. This violates DRY and makes future maintenance harder. Consider refactoring to a helper function (note the package name is passed without `==`, since the awk pattern appends it):

```sh
remove_section() {
    local file="$1"
    local package="$2"
    awk -v pkg="$package" '
        BEGIN { section_active = 0 }
        $0 ~ "^" pkg "==" { section_active = 1; next }
        section_active == 1 && /^$/ { next }
        section_active == 1 && /^    / { next }
        section_active == 1 && !/^    / && !/^$/ { section_active = 0 }
        section_active == 0 { print }
    ' "$file"
}
```

Then call:

```sh
remove_section "$INPUT_FILE" "torch" > "${INPUT_FILE}.tmp"
remove_section "${INPUT_FILE}.tmp" "faiss-cpu" > "${INPUT_FILE}.tmp2"
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lockis excluded by!**/*.lock
📒 Files selected for processing (10)
- .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
- .tekton/lightspeed-stack-push.yaml (1 hunks)
- Containerfile (1 hunks)
- Makefile (2 hunks)
- pyproject.toml (1 hunks)
- requirements-build.txt (1 hunks)
- requirements.aarch64.txt (0 hunks)
- requirements.binary.txt (1 hunks)
- requirements.x86_64.txt (0 hunks)
- scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
- requirements.x86_64.txt
- requirements.aarch64.txt
✅ Files skipped from review due to trivial changes (1)
- requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (4)
- Containerfile
- .tekton/lightspeed-stack-pull-request.yaml
- Makefile
- pyproject.toml
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
🔇 Additional comments (1)
.tekton/lightspeed-stack-push.yaml (1)
34-34: Verify referenced requirement files exist and are maintained.

Line 34 references several requirement files in the prefetch configuration:

- requirements.hermetic.txt
- requirements.torch.txt
- requirements-binary.txt

These files are not provided in the PR context. Ensure that:

- All referenced files are actually generated/maintained by your build process
- The purpose of each file is clear (e.g., is requirements.torch.txt still needed if torch is in the binary block?)
- There are no redundant requirement files being referenced

The prefetch configuration should only reference files that are actively maintained and necessary for the build.
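A hedged sketch of a local sanity check for this (file names taken from the comment above; the demo runs against a temporary directory with one file deliberately missing to show the failure path):

```shell
#!/bin/sh
# Check that every requirements file named in the prefetch config exists.
# Demo setup: a temp "repo root" where requirements-binary.txt is missing.
workdir=$(mktemp -d)
touch "$workdir/requirements.hermetic.txt" "$workdir/requirements.torch.txt"

missing=""
for f in requirements.hermetic.txt requirements.torch.txt requirements-binary.txt; do
    [ -f "$workdir/$f" ] || missing="$missing $f"
done
echo "missing:$missing"
rm -rf "$workdir"
```

In CI the loop would run against the real repository root instead of a temp directory, and a non-empty `$missing` would fail the job.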
requirements-build.txt:

```text
beniget==0.4.2.post1
    # via pythran
calver==2025.10.20
    # via trove-classifiers
cffi==2.0.0
    # via cryptography
coherent-licensed==0.5.2
    # via zipp
cppy==1.3.1
    # via kiwisolver
cython==3.1.7
    # via
    #   scikit-learn
    #   scipy
cython==3.2.2
    # via
    #   asyncpg
    #   lxml
    #   numpy
    #   pandas
    #   pyarrow
    #   pyyaml
    #   sqlalchemy
dunamai==1.25.0
    # via uv-dynamic-versioning
expandvars==1.1.2
    # via
    #   frozenlist
    #   propcache
    #   yarl
flit-core==3.12.0
    # via
    #   aiosqlite
    #   click
    #   coherent-licensed
    #   idna
    #   jinja2
    #   markdown-it-py
    #   mdurl
    #   packaging
    #   pathspec
    #   pyparsing
    #   pyproject-metadata
    #   threadpoolctl
    #   typing-extensions
    #   wheel
gast==0.6.0
    # via
    #   beniget
    #   pythran
hatch-fancy-pypi-readme==25.1.0
    # via
    #   attrs
    #   httpcore
    #   httpx
    #   jsonschema
    #   llama-stack-client
    #   openai
    #   pydantic
hatch-vcs==0.4.0
    # via urllib3
hatch-vcs==0.5.0
    # via
    #   attrs
    #   filelock
    #   fsspec
    #   jsonschema
    #   jsonschema-specifications
    #   referencing
    #   scikit-build-core
    #   termcolor
hatchling==1.26.3
    # via
    #   hatch-fancy-pypi-readme
    #   llama-stack-client
    #   openai
hatchling==1.28.0
    # via
    #   annotated-types
    #   attrs
    #   dnspython
    #   expandvars
    #   filelock
    #   fsspec
    #   hatch-fancy-pypi-readme
    #   hatch-vcs
    #   httpcore
    #   httpx
    #   jsonschema
    #   jsonschema-specifications
    #   mcp
    #   opentelemetry-api
    #   opentelemetry-exporter-otlp
    #   opentelemetry-exporter-otlp-proto-common
    #   opentelemetry-exporter-otlp-proto-grpc
    #   opentelemetry-exporter-otlp-proto-http
    #   opentelemetry-instrumentation
    #   opentelemetry-proto
    #   opentelemetry-sdk
    #   opentelemetry-semantic-conventions
    #   pydantic
    #   pydantic-settings
    #   pygments
    #   python-multipart
    #   referencing
    #   scikit-build-core
    #   starlette
    #   termcolor
    #   typing-inspection
    #   urllib3
    #   uv-dynamic-versioning
    #   uvicorn
jinja2==3.1.6
    # via uv-dynamic-versioning
markupsafe==3.0.3
    # via jinja2
maturin==1.10.2
    # via
    #   cryptography
    #   fastuuid
    #   hf-xet
    #   jiter
    #   pydantic-core
    #   rpds-py
    #   safetensors
    #   tokenizers
meson-python==0.16.0
    # via matplotlib
meson-python==0.18.0
    # via
    #   contourpy
    #   numpy
    #   pandas
    #   scikit-learn
    #   scipy
meson==1.9.2
    # via
    #   contourpy
    #   meson-python
    #   pandas
numpy==2.2.6
    # via
    #   pandas
    #   pyarrow
    #   scikit-learn
    #   scipy
packaging==25.0
    # via
    #   hatchling
    #   meson-python
    #   scikit-build-core
    #   setuptools-scm
pathspec==0.12.1
    # via
    #   hatchling
    #   scikit-build-core
pdm-backend==2.4.6
    # via
    #   annotated-doc
    #   fastapi
pkgconfig==1.5.5
    # via aiohttp
pluggy==1.6.0
    # via hatchling
ply==3.11
    # via pythran
poetry-core==2.2.1
    # via
    #   aiohappyeyeballs
    #   dunamai
    #   litellm
    #   pkgconfig
    #   rich
    #   rsa
    #   tomlkit
pybind11==3.0.1
    # via
    #   contourpy
    #   matplotlib
    #   pillow
    #   scipy
pycparser==2.23
    # via cffi
pyproject-metadata==0.10.0
    # via meson-python
pythran==0.18.1
    # via scipy
scikit-build-core==0.11.6
    # via pybind11
scipy==1.16.3
    # via scikit-learn
semantic-version==2.10.0
    # via setuptools-rust
setuptools-rust==1.12.0
    # via
    #   maturin
    #   tiktoken
setuptools-scm==7.1.0
    # via python-dateutil
setuptools-scm==9.2.2
    # via
    #   anyio
    #   cppy
    #   hatch-vcs
    #   httpx-sse
    #   importlib-metadata
    #   kiwisolver
    #   matplotlib
    #   pluggy
    #   pyarrow
    #   setuptools-rust
    #   sniffio
    #   tqdm
    #   zipp
tomlkit==0.13.3
    # via uv-dynamic-versioning
trove-classifiers==2025.12.1.14
    # via hatchling
typing-extensions==4.15.0
    # via setuptools-scm
uv-dynamic-versioning==0.11.2
    # via mcp
versioneer==0.29
    # via pandas
wheel==0.45.1
    # via
    #   asyncpg
    #   authlib
    #   cachetools
    #   cppy
    #   fire
    #   httpx-sse
    #   litellm
    #   meson
    #   pandas
    #   python-dateutil
    #   python-jose
    #   sentence-transformers
    #   tiktoken
    #   tqdm
    #   tzdata
```
🧩 Analysis chain
🏁 Script executed:
# First, find and read the requirements-build.txt file
find . -name "requirements-build.txt" -type f

Repository: lightspeed-core/lightspeed-stack
Length of output: 208
🌐 Web query:
pybuild-deps tool multiple versions same package requirements
💡 Result:
Do you mean pybuild-deps (the PEP‑517 build-deps helper) and how it handles duplicate/contradictory version specifiers for the same package in requirements? If so, tell me which behavior you want to see (how it currently resolves duplicates, how to produce a single pinned requirement, or an example input/output) and I’ll fetch the exact pybuild-deps docs/behavior and show a concrete example with citations.
🌐 Web query:
pybuild-deps pip-compile multiple versions output format
💡 Result:
Short answer: pybuild-deps does not itself emit separate pip-compile outputs per Python version — its compile command writes a single requirements-style file (build-requirements.txt) with options to add annotations and pip-style hashes. To get pip-compile outputs for multiple Python versions run pip-compile under each interpreter (pythonX.Y -m piptools compile) or use a helper like pip-compile-multi to produce/versioned outputs for several Python versions.
Sources:
- pybuild-deps CLI compile options (annotate, annotation-style, --generate-hashes, --output-file). [pybuild-deps docs]. [1]
- pip-tools / pip-compile recommendation to run under each Python version. [pip-tools docs]. [2]
- pip-compile-multi can manage compiling requirement files for multiple Python versions/environments. [pip-compile-multi features]. [3]
References:
[1] https://pybuild-deps.readthedocs.io/en/latest/usage.html
[2] https://pip-tools.readthedocs.io/en/stable/
[3] https://pip-compile-multi.readthedocs.io/en/latest/features.html
🌐 Web query:
pybuild-deps merge multiple architecture requirement files conflict resolution
💡 Result:
Do you want (A) concrete commands/steps to merge pybuild-deps-generated per-architecture requirements into a single requirements file, or (B) guidance/strategies for resolving version conflicts that show up when merging?
🌐 Web query:
pip-compile multiple versions same package in output requirements file
💡 Result:
Short answer: pip-compile (pip-tools) will not produce two different pinned versions of the same package in a single output requirements file for one installation environment. You must represent multiple versions by using separate requirement sets (files) or environment-conditional pins/constraints. Suggested approaches:
- Create separate outputs (one pinned file per variant) and compile each from its own .in (or from .in + different constraints). Example: requirements-3.9.in -> requirements-3.9.txt and requirements-3.11.in -> requirements-3.11.txt. (Common solution for testing a library against multiple dependency versions.) [1][2]
- Use constraint files (-c) to force one compiled file to constrain another. Compile the “base” file first, then compile the variant with -c base.txt so the variant’s requirements respect the base pins. This avoids duplicate/conflicting pins in one file. [1][4]
- Use tooling that manages multiple compiled outputs/variants (e.g. pip-compile-multi) to automate cross-file compilation and resolving cross-file conflicts. It can compile multiple .in → .txt and auto-resolve cross-file constraints. [3]
Notes:
- You cannot have two conflicting versions of the same package in one single requirements.txt for one environment; pip (and pip-tools) installs one version per environment. Use separate environments/files to install different versions. [1][2]
If you want, tell me how you want the variants expressed (different Python versions, test vs prod, two specific versions of a dependency) and I’ll give exact .in/.txt examples and the pip-compile commands.
Sources: pip-tools docs and PyPI, pip-compile-multi docs. [1] [2] [3]
Duplicate package versions in requirements-build.txt will cause pip install failures.
The file contains multiple pinned versions of the same packages, which is invalid for pip requirements files:
- cython: 3.1.7 and 3.2.2
- hatch-vcs: 0.4.0 and 0.5.0
- hatchling: 1.26.3 and 1.28.0
- meson-python: 0.16.0 and 0.18.0
- setuptools-scm: 7.1.0 and 9.2.2
A single requirements file cannot specify conflicting versions. The pybuild-deps compilation process needs to resolve these conflicts by either selecting one version per package or using separate requirement files for each architecture variant rather than merging them into a single output.
🤖 Prompt for AI Agents
In requirements-build.txt (lines 7-248) there are duplicate pinned versions for
the same packages (cython: 3.1.7 & 3.2.2; hatch-vcs: 0.4.0 & 0.5.0; hatchling:
1.26.3 & 1.28.0; meson-python: 0.16.0 & 0.18.0; setuptools-scm: 7.1.0 & 9.2.2)
which will break pip; resolve by keeping a single pinned version per package
(prefer the newest compatible release), remove the duplicates from this file, or
split architecture/variant-specific pins into separate requirement files instead
of merging them; update the dependency export/generator that produced this file
to dedupe by package name (selecting the chosen version) so future outputs do
not reintroduce conflicting entries.
| awk ' | ||
| BEGIN { | ||
| in_faiss_section = 0 | ||
| } | ||
| # If we find a line starting with faiss-cpu== | ||
| /^faiss-cpu==/ { | ||
| in_faiss_section = 1 | ||
| next # Skip this line | ||
| } | ||
| # If we are in faiss section and line starts with 4 spaces, skip it | ||
| in_faiss_section == 1 && /^ / { | ||
| next # Skip this line | ||
| } | ||
| # If we are in faiss section and line does NOT start with 4 spaces, exit faiss section | ||
| in_faiss_section == 1 && !/^ / { | ||
| in_faiss_section = 0 | ||
| # Fall through to print this line | ||
| } | ||
| # Print all lines that are not part of faiss section | ||
| in_faiss_section == 0 { | ||
| print | ||
| } | ||
| ' "${INPUT_FILE}.tmp" > "${INPUT_FILE}.tmp2" |
Replicated unfixed empty-line handling in faiss-cpu removal logic.
The newly added faiss-cpu removal (lines 50-76) uses identical awk logic to the torch removal but reproduces the same issue flagged in the previous review (lines 33-42). Line 67 exits the faiss section when encountering any line that doesn't start with 4 spaces, including empty lines:
in_faiss_section == 1 && !/^    / {
    in_faiss_section = 0
}

If empty lines appear between faiss-cpu==<version> and its hash lines, the script will incorrectly exit the section and leave orphaned hash lines behind. This creates a maintenance issue: you now have two nearly-identical blocks with the same latent bug.
Suggested fix: Apply the correction to both torch and faiss sections by checking for empty lines explicitly:
-# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^ / {
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ {
+ next # Skip empty line
+}
+
+# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
+in_faiss_section == 1 && !/^ / && !/^$/ {
in_faiss_section = 0
# Fall through to print this line
}And apply the same pattern to the torch section (lines 33-42) to fix the previously flagged issue.
🤖 Prompt for AI Agents
In scripts/remove_torch_deps.sh around lines 50 to 76 (and similarly update the
torch block at lines 33 to 42), the awk logic exits the faiss-cpu section on any
non-indented line which incorrectly treats empty lines as section terminators
and leaves orphaned hash lines; change the exit condition so the section only
ends when a line is neither indented (doesn't start with four spaces) nor empty
(non-blank), i.e., explicitly treat blank lines as part of the section and only
reset in_faiss_section when encountering a non-empty, non-indented line; apply
the same exact fix to the torch removal block to remove the duplicated bug.
e762f1c to cde758a
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
pyproject.toml (1)
59-71: Correct pyright exclusion path to match coding guidelines
The coding guidelines specify excluding src/auth/k8s.py from pyright type checking, but the current config excludes src/authentication/k8s.py. Update the path in pyproject.toml:

 [tool.pyright]
 exclude = [
-    "src/authentication/k8s.py",
+    "src/auth/k8s.py",
     # Agent API v1 endpoints - deprecated API but still supported
♻️ Duplicate comments (1)
scripts/remove_torch_deps.sh (1)
33-42: Blank-line handling in torch/faiss awk blocks still breaks hash removal
Both torch and faiss sections still exit on any non-indented line, including blanks, so an empty line between the package line and its hash lines will prematurely end the section and leave orphaned hashes. This is the same issue noted in the earlier review and now affects faiss-cpu as well.
Consider updating both blocks to treat empty lines as part of the section and only exit on non-empty, non-indented lines:
@@
-# If we are in torch section and line starts with 4 spaces, skip it
-in_torch_section == 1 && /^    / {
-    next # Skip this line
-}
-
-# If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
-    in_torch_section = 0
-    # Fall through to print this line
-}
+# If we are in torch section and line starts with 4 spaces, skip it
+in_torch_section == 1 && /^    / {
+    next # Skip this line
+}
+
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next # Skip empty line
+}
+
+# If we are in torch section and line is non-empty and does NOT start with 4 spaces, exit torch section
+in_torch_section == 1 && !/^    / && !/^$/ {
+    in_torch_section = 0
+    # Fall through to print this line
+}
@@
-# If we are in faiss section and line starts with 4 spaces, skip it
-in_faiss_section == 1 && /^    / {
-    next # Skip this line
-}
-
-# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
-    in_faiss_section = 0
-    # Fall through to print this line
-}
+# If we are in faiss section and line starts with 4 spaces, skip it
+in_faiss_section == 1 && /^    / {
+    next # Skip this line
+}
+
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ {
+    next # Skip empty line
+}
+
+# If we are in faiss section and line is non-empty and does NOT start with 4 spaces, exit faiss section
+in_faiss_section == 1 && !/^    / && !/^$/ {
+    in_faiss_section = 0
+    # Fall through to print this line
+}
🧹 Nitpick comments (1)
scripts/remove_torch_deps.sh (1)
3-4: Script description and success message now under-report behavior (faiss-cpu also removed)
The header comment and final echo only mention torch, but the script now also strips faiss-cpu blocks. This can confuse users reading logs or the script header.
Consider updating both to reflect current behavior, e.g.:
-# Script to remove torch dependencies from requirements.txt
-# Detects torch==<version> and removes it along with all subsequent lines starting with 4 spaces
+# Script to remove torch and faiss-cpu dependencies from requirements.txt
+# Detects torch==<version> and faiss-cpu==<version> and removes them along with all subsequent lines starting with 4 spaces
@@
-echo "Successfully removed torch dependencies from $INPUT_FILE"
-echo "Original file backed up to $BACKUP_FILE"
+echo "Successfully removed torch and faiss-cpu dependencies from $INPUT_FILE"
+echo "Original file backed up to $BACKUP_FILE"
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (11)
- .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
- .tekton/lightspeed-stack-push.yaml (1 hunks)
- Containerfile (1 hunks)
- Makefile (2 hunks)
- pyproject.toml (1 hunks)
- requirements-build.txt (1 hunks)
- requirements.aarch64.txt (6 hunks)
- requirements.binary.txt (1 hunks)
- requirements.hermetic.txt (1 hunks)
- requirements.x86_64.txt (6 hunks)
- scripts/remove_torch_deps.sh (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (4)
- Makefile
- requirements-build.txt
- requirements.binary.txt
- requirements.aarch64.txt
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
pyproject.toml: Configure pylint with source-roots = "src"
Exclude src/auth/k8s.py from pyright type checking
Files:
pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones
Applied to files:
pyproject.toml
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.x86_64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / azure
🔇 Additional comments (10)
.tekton/lightspeed-stack-pull-request.yaml (1)
37-37: PR pipeline prefetch JSON mirrors push pipeline; verify shared config stays consistent
The prefetch-input value here matches the push pipeline's Pip prefetch config (same requirements files and binary metadata), which is good for behavior parity between PR and push runs.

As with the push pipeline, please:
- Confirm requirements.binary.txt exists and is the intended file name.
- Keep this JSON in sync with the push pipeline when adding/removing requirements or binary packages.
requirements.hermetic.txt (1)
1-6: Explicit hermetic tool pins look good; ensure versions stay aligned with your toolchain
Commented header plus exact versions for uv, pip, patchelf, and cmake match the hermetic intent and are easy to audit.

Keep these versions in sync with:
- The uv version you install in the builder image
- Any expectations from your build/prefetch tasks
- Other version pins in your configuration
.tekton/lightspeed-stack-push.yaml (1)
34-34: Ensure requirements.binary.txt and requirements-build.txt exist and are used consistently
The updated prefetch-input JSON structure is valid, but verify that:
- requirements.binary.txt exists at the repository root
- requirements-build.txt exists at the repository root
- Both filenames match their references in the Containerfile pip install paths
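A pre-merge check along these lines is easy to script. The sketch below runs against a mock layout, since the real file names and Containerfile references have to be confirmed in the repository:

```shell
# Mock repo layout standing in for the real checkout.
mkdir -p /tmp/prefetch_demo
cd /tmp/prefetch_demo
printf 'torch==2.9.1\nfaiss-cpu==1.8.0\n' > requirements.binary.txt
printf 'maturin==1.10.2\n' > requirements-build.txt
printf 'RUN pip install -r requirements.binary.txt\n' > Containerfile

# Verify each prefetch-referenced file exists, and note any the
# Containerfile never mentions.
status=0
for f in requirements.binary.txt requirements-build.txt; do
  [ -f "$f" ] || { echo "MISSING: $f"; status=1; }
  grep -qF "$f" Containerfile || echo "note: $f not referenced in Containerfile"
done
echo "status=$status"
```

Here the mock reports that requirements-build.txt is never referenced by the Containerfile — the kind of mismatch this check is meant to surface before a pipeline run fails.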
pyproject.toml (1)
103-127: Align pip/pybuild-deps tooling versions across dev and hermetic environments
Adding pybuild-deps==0.5.0 and pinned pip==24.3.1 to the dev group should be verified for consistency with requirements.hermetic.txt. If the hermetic environment uses a different pip version, this divergence could lead to different resolver behavior between CI and local dev.

Consider either:
- Aligning pip versions between pyproject.toml dev and requirements.hermetic.txt, or
- Documenting why they intentionally differ.
Also verify that the pyright exclude configuration in [tool.pyright] correctly references the path to src/auth/k8s.py as specified in the coding guidelines.

requirements.x86_64.txt (6)
1931-1941: faiss-cpu hermetic split — reconcile "via faiss-cpu" residue.
If faiss-cpu is intentionally moved to requirements.binary.txt, regenerate this non-binary lock so "# via faiss-cpu" no longer appears under numpy, or confirm it's still a resolver input for this group. Also ensure faiss-cpu remains top-level pinned in the binary manifest to keep the dep graph stable across arches. Based on project design, faiss-cpu is intentionally pinned top-level to maintain control over the dependency graph.

1866-1868: networkx patch to 3.6.1.
Patch-level bump released Dec 08, 2025 with new spectral and greedy bipartition community detection methods, enhancements to node list handling and drawing, and bug fixes for graph subclasses. Low risk; sanity-run any graph code paths in CI.

3298-3345: SQLAlchemy → 2.0.45 (greenlet 3.3.0 compatible).
Bugfix bump with no reported incompatibilities—SQLAlchemy 2.0.45 changelog contains no greenlet-related breaking changes. Consider running smoke tests on ORM/async engine code paths as general QA practice.

3166-3203: scikit-learn 1.8.0 dependencies are compatible.
All specified dependency versions satisfy scikit-learn 1.8.0 minimum requirements: numpy 2.2.6 (≥ 1.22.0), scipy 1.16.3 (≥ 1.8.0), joblib 1.5.2 (≥ 1.2.0), threadpoolctl 3.6.0 (≥ 3.1.0).

3515-3516: TRL 0.26.0 is compatible with the specified dependency versions.
TRL 0.26.0 (released Dec 9, 2025) requires transformers >= 4.46.0 as a baseline. The version stack here — transformers 4.57.3, accelerate 1.12.0, and peft 0.18.0 — aligns with TRL's ecosystem requirements and should work without compatibility issues.

1375-1377: litellm 1.80.9 — compatibility confirmed.
No breaking changes are documented in the v1.80.9 release notes. OpenAI provider support is standard; jiter is a transitive dependency (via openai/pydantic ecosystem) with no direct version constraints for litellm. Ensure Python >= 3.9 for jiter compatibility.
requirements.x86_64.txt
Outdated
| mcp==1.23.3 \ | ||
| --hash=sha256:32768af4b46a1b4f7df34e2bfdf5c6011e7b63d7f1b0e321d0fdef4cd6082031 \ | ||
| --hash=sha256:b3b0da2cc949950ce1259c7bfc1b081905a51916fcd7c8182125b85e70825201 |
🧩 Analysis chain
🌐 Web query:
mcp 1.23.3 changelog release notes
💡 Result:
Do you mean the Minecraft Coder Pack (MCP) release 1.23.3, or a different project named “MCP”? If it’s Minecraft Coder Pack, I can search for the specific 1.23.3 changelog — confirm which MCP you mean.
🌐 Web query:
httpx-sse 0.4.3 compatibility requirements
💡 Result:
httpx-sse 0.4.3 requirements/compatibility summary:
- Python: requires Python >= 3.9 (classifiers list support for 3.9–3.13). [1][2]
- Package dependencies: httpx-sse does not declare runtime install dependencies (i.e., no pinned httpx requirement in the wheel/metadata); it expects you to install httpx yourself. [3][1]
Sources:
- PyPI httpx-sse release page (0.4.3). [1]
- Package METADATA showing Requires-Python >=3.9 and supported Python classifiers. [2]
- piwheels / release metadata (shows no runtime dependencies). [3]
🌐 Web query:
starlette 0.50.0 httpx-sse compatibility
💡 Result:
Short answer: Yes — httpx-sse can work with Starlette 0.50.0, but be aware of historical caveats around HTTPX's ASGI transport and streaming SSE tests.
Details:
- httpx-sse's README shows direct usage against a Starlette ASGI app (via httpx.AsyncClient + httpx.ASGITransport). [1]
- Starlette 0.50.0 is a normal minor release (drops Python 3.9) and does not introduce documented breaking changes to ASGI streaming/SSE. [2][3]
- However, HTTPX's ASGITransport/streaming has had issues causing tests or in-process SSE streaming to hang / not stream correctly (tracked in the httpx issue/ discussion). If you plan to call an SSE endpoint in-process (tests or ASGITransport), check those httpx issues and use recent httpx versions or run the app server separately. [4][5]
References:
[1] httpx-sse README (example with Starlette).
[2] Starlette 0.50.0 release notes.
[3] Starlette GitHub release tag 0.50.0.
[4] encode/httpx issue: ASGITransport does not stream responses.
[5] encode/httpx discussion: Test hangs forever when using AsyncClient.stream() on an SSE endpoint.
If you want, I can check your specific httpx and httpx-sse versions and test code for a reproducible fix.
🌐 Web query:
mcp package pypi python 1.23.3
💡 Result:
mcp on PyPI — version 1.23.3 is available (released Dec 9, 2025). Requires Python >= 3.10. Install with:
pip install "mcp[cli]"
Source: PyPI project page for mcp. [1]
mcp 1.23.3 compatible with httpx-sse==0.4.3 and starlette==0.50.0, but verify SSE streaming behavior.
All packages are version-compatible. However, be aware that HTTPX's ASGITransport has known issues with SSE streaming (responses may hang or not stream correctly in-process). If testing SSE endpoints locally or using ASGITransport, verify streaming works as expected or run the app server separately.
🤖 Prompt for AI Agents
In requirements.x86_64.txt around lines 1683 to 1685, note that although
mcp==1.23.3 is compatible with httpx-sse==0.4.3 and starlette==0.50.0, HTTPX's
ASGITransport can exhibit broken SSE streaming behavior; when validating SSE
endpoints, run the app server as a separate process (not using ASGITransport
in-process) or switch to a real network transport (e.g., HTTP client against
localhost) to confirm streaming works end-to-end, and if needed add a short
integration test that exercises SSE with the app running separately or change
the test harness to use a non-ASGI transport.
cde758a to cbf76b4
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
scripts/remove_torch_deps.sh (1)
33-42: Don’t exit sections on blank lines; skip them explicitly (torch + faiss).
Empty lines between the pkg==... header and its hash lines will terminate the section early and leave orphaned hashes. Apply the same fix to both blocks.

 /^torch==/ {
     in_torch_section = 1
     next
 }
-# If we are in torch section and line starts with 4 spaces, skip it
-in_torch_section == 1 && /^    / { next }
-
-# If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ { next }
+# If we are in torch section and line does NOT start with 4 spaces, exit torch section
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
 }

 /^faiss-cpu==/ {
     in_faiss_section = 1
     next
 }
-# If we are in faiss section and line starts with 4 spaces, skip it
-in_faiss_section == 1 && /^    / { next }
-
-# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ { next }
+# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
+in_faiss_section == 1 && !/^    / && !/^$/ {
     in_faiss_section = 0
 }
♻️ Duplicate comments (1)
requirements.x86_64.txt (1)
1683-1685: mcp 1.23.3: watch SSE tests with HTTPX ASGITransport.
Version bump looks fine; ensure SSE streaming tests don’t hang when using ASGITransport; prefer real network transport if needed.
httpx ASGITransport SSE streaming known issues and recommended workarounds for tests
🧹 Nitpick comments (3)
scripts/remove_torch_deps.sh (3)
6-6: Hardening: add pipefail and cleanup trap.
Ensure failures in pipelines are caught and tmp files are removed on exit.

-set -e
+set -euo pipefail
+trap 'rm -f "${INPUT_FILE}.tmp" "${INPUT_FILE}.tmp2"' EXIT
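The behavior of the suggested EXIT trap can be demonstrated in isolation; this sketch simulates a mid-script failure and confirms the temp files are removed anyway:

```shell
# Run the strict-mode body in a subshell so its failure doesn't abort the demo.
demo() (
  set -euo pipefail
  tmp1=$(mktemp)
  tmp2=$(mktemp)
  trap 'rm -f "$tmp1" "$tmp2"' EXIT
  echo "$tmp1 $tmp2"   # report the paths before failing
  false                # simulated failure; the EXIT trap still fires
)

files=$(demo) || true  # demo exits non-zero but has already printed the paths
for f in $files; do
  if [ -e "$f" ]; then echo "leaked: $f"; else echo "cleaned: $f"; fi
done
```

Applied to remove_torch_deps.sh, the same trap replaces the manual cleanup of the .tmp files and covers early failures as well.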
79-81: Atomic replace and clearer cleanup.
Use mktemp for temp paths to avoid collisions and move with a temp-to-target pattern. Current rm "${INPUT_FILE}.tmp" is fine but a trap (above) is safer. If you keep manual cleanup, at least guard against absent files.

-mv "${INPUT_FILE}.tmp2" "$INPUT_FILE"
-rm "${INPUT_FILE}.tmp"
+mv -f "${INPUT_FILE}.tmp2" "$INPUT_FILE"
+rm -f "${INPUT_FILE}.tmp"
83-85: Message mentions only torch; include FAISS.
Update the success message so logs reflect both removals.

-echo "Successfully removed torch dependencies from $INPUT_FILE"
+echo "Successfully removed torch and faiss-cpu dependencies from $INPUT_FILE"
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (11)
- .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
- .tekton/lightspeed-stack-push.yaml (1 hunks)
- Containerfile (1 hunks)
- Makefile (2 hunks)
- pyproject.toml (1 hunks)
- requirements-build.txt (1 hunks)
- requirements.aarch64.txt (6 hunks)
- requirements.binary.txt (1 hunks)
- requirements.hermetic.txt (1 hunks)
- requirements.x86_64.txt (6 hunks)
- scripts/remove_torch_deps.sh (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (5)
- pyproject.toml
- requirements.hermetic.txt
- .tekton/lightspeed-stack-pull-request.yaml
- Containerfile
- Makefile
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.aarch64.txt
requirements.x86_64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
requirements.aarch64.txt
requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / azure
🔇 Additional comments (12)
requirements.x86_64.txt (4)
1866-1868: networkx 3.6.1 bump: LGTM.
No known tight pins here; safe minor bump.

3298-3345: Verify SQLAlchemy 2.0.45 compatibility before deployment.
After upgrading SQLAlchemy to 2.0.45, scan Python source files for deprecated 1.x patterns: engine.execute(), connection.execute(), autocommit=True, Session(bind=...), and legacy declarative imports. These APIs will cause runtime breakage in 2.x.

3515-3516: TRL 0.26.0 is compatible with transformers==4.57.3 and peft==0.18.0.
No compatibility issues or breaking API changes identified. PEFT 0.18.0 maintains backward compatibility with Transformers v4.x, and TRL has explicit PEFT integration support.

3167-3203: NumPy/SciPy compatibility confirmed for scikit-learn 1.8.0.
Pinned versions numpy==2.2.6 and scipy==1.16.3 satisfy scikit-learn 1.8.0's requirements (numpy >= 1.24.1, scipy >= 1.10.0). Wheels are published for linux-x86_64 manylinux2_28 with Python 3.12 support.
requirements.aarch64.txt (6)
3298-3345: Remove this review—it targets the wrong artifact.
The comment concerns deprecated sqlalchemy 1.x patterns in Python code, but the file under review (requirements.aarch64.txt) contains only package hashes and version pins. Version verification and migration safety should be reviewed in:
- Actual Python source files using sqlalchemy (check for engine.execute(), legacy session patterns)
- Database migration files
- Test suites exercising DB operations

Not in the requirements metadata itself.
1866-1868: networkx 3.6.1 does not introduce breaking API changes — no action needed.
Version 3.6.1 includes only enhancements and bugfixes. Breaking API changes occurred in earlier 3.x releases (3.0–3.6rc); if those versions were previously used in the project, those compatibility issues would already be present. No additional verification is required for this update.
1683-1686: mcp 1.23.3 — verify client/server compatibility and dependency constraints.
Confirm MCP version 1.23.3 is compatible with your client/server implementations. Review pyproject.toml to ensure declared versions of mcp, typing-inspection, and pydantic satisfy runtime imports and capability negotiation.
3167-3203: Scikit-learn 1.8.0 — source build and NumPy/SciPy ABI compatibility.
- Building from source on aarch64 requires Python with C/C++ compiler + OpenMP, pre-installed NumPy and SciPy (NumPy 2.2.x + SciPy 1.16.x are compatible, but verify at runtime).
- Risk: scikit-learn wheels built against older NumPy C ABI will fail with NumPy 2.2.x imports. Use wheels built against NumPy 2.2 or rebuild from source in the target environment.
3515-3516: trl 0.26.0 — requires verification of Trainer/PEFT integration compatibility.
The pinned version was published to PyPI on Dec 9, 2025. Verify that SFTTrainer and PPOTrainer APIs remain compatible with your current usage patterns and that dataset collation signatures work as expected under your training configs.

1375-1377: litellm 1.80.9 bump — run provider smoke tests.
litellm 1.80.9 contains feature and documentation updates with no known breaking changes. However, given litellm's active provider adapter development and recent additions like Responses API support, running quick sanity checks (OpenAI, Anthropic, Azure) is prudent to ensure mappings, retry/backoff, and streaming paths remain functional. Note that the Anthropic adapter has a known limitation with the newer OpenAI Responses API format—verify your use case does not rely on Responses-only semantics.
.tekton/lightspeed-stack-push.yaml (1)
34-34: Verify the prefetch-input schema for the binary block structure against the Cachi2 / Konflux task definition. The JSON structure on line 34 includes a `binary` block with `arch: "x86_64,aarch64"`, `py_version: "312"`, and `packages: "torch,faiss-cpu"`. Confirm with your Konflux/Cachi2 task bundle (or documentation) whether:
- `arch` should be a CSV string or a JSON array
- `py_version` "312" (Python 3.12) is supported and matches the builder image
- `binary.packages` overlaps with the `requirements_*` files, and whether this creates duplicate resolution or conflicts

requirements-build.txt (1)
1-248: Duplicate pinned package versions in autogenerated requirements file require resolution. This file contains multiple conflicting versions of the same packages:
- `cython`: 3.1.7 (line 17) and 3.2.2 (line 19)
- `hatchling`: 1.26.3 (line 76) and 1.28.0 (line 81)
- `meson-python`: 0.16.0 (line 131) and 0.18.0 (line 133)
- `setuptools-scm`: 7.1.0 (line 202) and 9.2.2 (line 204)

While the file is autogenerated by `pybuild-deps` from architecture-specific sources (`requirements.aarch64.txt` and `requirements.x86_64.txt`), a single consolidated requirements file cannot contain multiple versions of the same package. This configuration needs to be reconciled by either:
- Keeping architecture-specific requirement files separate (e.g., `requirements-build.aarch64.txt` and `requirements-build.x86_64.txt`)
- Resolving all version conflicts to a single pin per package across both architectures
- Updating the `pybuild-deps` compilation step to avoid merging conflicting pins

Verify the intended usage pattern and ensure the build system is configured to handle the generated output correctly.
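As a quick way to surface such conflicts before a build, a standalone sketch (the sample file below is illustrative, not the real requirements-build.txt):

```shell
#!/bin/sh
# Detect packages pinned at more than one version in a merged
# requirements file. Sample data only; point this at the real
# requirements-build.txt in practice.
cat > /tmp/reqs_build_demo.txt <<'EOF'
cython==3.1.7
cython==3.2.2
hatchling==1.26.3
meson-python==0.16.0
EOF

# Take the package name before '==', then report names pinned more than once.
grep -oE '^[A-Za-z0-9._-]+==' /tmp/reqs_build_demo.txt \
  | sed 's/==$//' \
  | sort | uniq -d
```

Running a check like this against the generated file in CI would fail fast on merged-arch conflicts before the image build does.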
cbf76b4 to 42479cd (force-pushed)
Actionable comments posted: 0
♻️ Duplicate comments (4)
scripts/remove_torch_deps.sh (1)
22-47: Fix section termination logic for torch/faiss blocks and align log message. The awk logic for both torch and faiss-cpu sections still exits the section on any line that doesn't start with four spaces:
```awk
in_torch_section == 1 && !/^    / { ... }
...
in_faiss_section == 1 && !/^    / { ... }
```

This incorrectly treats blank lines as section terminators and will leave orphaned hash lines if an empty line appears between `torch==...`/`faiss-cpu==...` and their hash lines. The same issue was already flagged in earlier review comments and is now duplicated for the faiss block. Also, the final echo still only mentions "torch dependencies" even though faiss-cpu is also removed.

Consider:
- For both torch and faiss blocks:
- Explicitly skip empty lines inside the section, and
- Only end the section when the line is non‑empty and does not start with four spaces.
For example (torch block; apply the same pattern to faiss):
```diff
-# If we are in torch section and line starts with 4 spaces, skip it
-in_torch_section == 1 && /^    / {
-    next # Skip this line
-}
-
-# If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
-    in_torch_section = 0
-    # Fall through to print this line
-}
+# If we are in torch section and line starts with 4 spaces, skip it
+in_torch_section == 1 && /^    / {
+    next # Skip this line
+}
+
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next # Skip empty line
+}
+
+# If we are in torch section and line is non-empty and does NOT start with 4 spaces, exit torch section
+in_torch_section == 1 && !/^    / && !/^$/ {
+    in_torch_section = 0
+    # Fall through to print this line
+}
```

And similarly for
in_faiss_section.You may also want to update the success message to reflect that both torch and faiss-cpu dependencies are being removed (or phrase it generically as “selected binary deps”).
Also applies to: 50-81
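A minimal, runnable reproduction of the blank-line hazard (the sample requirements content below is made up for illustration):

```shell
#!/bin/sh
# A torch pin whose hash lines are separated by a blank line.
cat > /tmp/req_demo.txt <<'EOF'
torch==2.1.0 \
    --hash=sha256:aaaa

    --hash=sha256:bbbb
click==8.1.7
EOF

# Fixed section logic: skip indented hash lines AND blank lines while
# inside the section; only a non-empty, non-indented line ends it.
awk '
/^torch==/        { in_sec = 1; next }   # enter torch section
in_sec && /^    / { next }               # hash continuation line
in_sec && /^$/    { next }               # blank line inside section
in_sec            { in_sec = 0 }         # first real line ends section
{ print }
' /tmp/req_demo.txt
```

With the original `!/^    /` termination, the blank line would end the section and the stray `--hash` line would leak into the output; with the fix, only `click==8.1.7` survives.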
requirements.x86_64.txt (2)
1375-1377: Pin looks fine; verify litellm 1.80.9 with openai==2.9.0. Recent openai releases sometimes break litellm. Run a quick import + chat/stream smoke in staging; pin/adjust if issues.
1683-1686: mcp 1.23.3 + httpx-sse/starlette: validate SSE streaming. ASGITransport streaming has known quirks; run an end‑to‑end SSE test against a real server, not in‑process.
Makefile (1)
118-118: Remove `--universal`; faiss-cpu ships platform-specific wheels. Universal resolution can emit unusable markers for binary wheels. Compile per‑Python without
--universal.Apply:
```diff
-echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal
+echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps
```
🧹 Nitpick comments (1)
Containerfile (1)
16-18: Update comment to reflect cmake addition. The comment still talks only about installing gcc, but the command now installs both `gcc` and `cmake`. Consider updating the comment so future readers understand why cmake is needed here.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (13)
- `.tekton/lightspeed-stack-pull-request.yaml` (1 hunks)
- `.tekton/lightspeed-stack-push.yaml` (1 hunks)
- `Containerfile` (2 hunks)
- `Makefile` (2 hunks)
- `pyproject.toml` (1 hunks)
- `requirements-build.txt` (1 hunks)
- `requirements.aarch64.txt` (6 hunks)
- `requirements.binary.txt` (1 hunks)
- `requirements.hermetic.txt` (1 hunks)
- `requirements.x86_64.txt` (6 hunks)
- `rpms.in.yaml` (1 hunks)
- `rpms.lock.yaml` (10 hunks)
- `scripts/remove_torch_deps.sh` (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- requirements-build.txt
🚧 Files skipped from review as they are similar to previous changes (5)
- .tekton/lightspeed-stack-push.yaml
- requirements.binary.txt
- requirements.hermetic.txt
- pyproject.toml
- .tekton/lightspeed-stack-pull-request.yaml
🧰 Additional context used
🧠 Learnings (3)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
- `Makefile`
- `requirements.x86_64.txt`
- `requirements.aarch64.txt`
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
- `requirements.x86_64.txt`
- `requirements.aarch64.txt`
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / ci
🔇 Additional comments (11)
Containerfile (1)
33-37: Verify `requirements-binary.txt` naming and presence in the image build context. The hermetic branch now does:
```shell
pip install ... --no-binary :all: -r requirements.$(uname -m).txt \
    && pip install ... -r requirements-binary.txt -r requirements.torch.txt
```

Earlier analysis on this PR series found no `requirements-binary.txt` file in the repo and instead referenced a new binary manifest named `requirements.binary.txt` in other tooling. It's not visible in the snippets provided here, so it's unclear which name is actually being generated/checked in.

Please double‑check that:
- The filename used here matches the one produced by your Makefile/pybuild tooling and Tekton prefetch configs, and
- That this file is actually present under
`requirements.*.txt` in the build context so the pip install doesn't fail at image build time.

If the canonical name is `requirements.binary.txt`, adjust the Containerfile accordingly; if you intentionally created `requirements-binary.txt`, ensure all other references (Makefile, Tekton, scripts) use the same spelling.

requirements.aarch64.txt (1)
1375-1377: Version bumps look consistent; ensure they're regenerated and exercised end‑to‑end. The litellm, mcp, networkx, scikit‑learn, sqlalchemy, and trl entries here look syntactically correct and consistent with an updated `uv pip compile` run for aarch64. No issues from a requirements-file perspective.

Given these are core runtime libraries, please just confirm that:
- The same compile flow has been run for the x86_64 requirements and any new
`requirements.*binary*.txt`/build manifests so versions remain aligned across arches, and
- Key flows using these libs (LLM routing via litellm, mcp client/server, graph handling via networkx, ML bits via scikit‑learn/trl, DB flows via sqlalchemy) have been exercised in tests or manual verification.
Also applies to: 1683-1685, 1866-1868, 3166-3203, 3298-3345, 3515-3516
requirements.x86_64.txt (4)
1866-1868: networkx 3.6.1 bump LGTM. Minor patch; unlikely to break. Keep.
3167-3203: scikit‑learn 1.8.0: confirm NumPy/SciPy ABI and wheel availability on Python 3.12. Pinned numpy==2.2.6 and scipy==1.16.3 look aligned, but verify manylinux wheels on CI for x86_64 and aarch64 builds.
3298-3345: SQLAlchemy 2.0.45 bump: run a quick DB init/migration smoke. Greenlet==3.3.0 is pinned; verify ORM session creation + simple query in CI.
3515-3516: trl 0.26.0 with transformers==4.57.3: smoke test training loop. Run a tiny Trainer init to catch param API mismatches early.
rpms.in.yaml (1)
1-1: Adding cmake is sensible for native builds. Approve. Please confirm the UBI repo provides a sufficient cmake version for both x86_64 and aarch64 in CI images.
Makefile (2)
10-10: Explicit FAISS_VERSION variable — good. Keeps the binary manifest controlled.
119-119: New pybuild-deps compile step — sensible; verify determinism. Ensure `requirements-build.txt` is committed or reproducibly generated in CI (same uv/pyproject inputs across arches).

rpms.lock.yaml (2)
1-10: Lockfile structure and format look valid. The YAML structure is well-formed with proper nesting and all package entries contain the required fields (url, repoid, size, checksum, name, evr, sourcerpm). URLs follow the expected cdn-ubi pattern, and checksums are valid SHA256 hashes.
Also applies to: 413-415, 823-824
7-48: Verify lockfile generation and checksum integrity. This appears to be an auto-generated lockfile with coordinated updates across architectures (aarch64/x86_64) and package sets (appstream/baseos). The changes include new packages (emacs-filesystem, vim-filesystem, libuv, python-unversioned-command) and version bumps (cmake, python3, systemd, kernel-headers).
Since manual checksum verification is impractical for 800+ lines, ensure the lockfile was generated through a trusted, auditable process (e.g., a repeatable build tool or CI pipeline). Verify that:
- The generation process correctly fetched RPMs from authenticated CDN sources
- Checksums match the actual downloaded artifacts
- All transitive dependencies are captured
If this lockfile was generated by a tool, consider documenting the exact command/process in a comment or build script so future maintainers can reproduce and verify changes.
Also applies to: 63-97, 119-125, 343-370, 371-391, 406-412, 417-486, 753-822
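For spot checks, lockfile checksums can be re-verified mechanically. A hedged sketch, with a dummy file and lockfile snippet standing in for a downloaded RPM and the real `rpms.lock.yaml` entry:

```shell
#!/bin/sh
# Recompute a sha256 and compare it with the value recorded in a
# lockfile snippet. The artifact and snippet here are stand-ins.
printf 'dummy rpm payload\n' > /tmp/pkg.rpm
printf '  checksum: sha256:%s\n' "$(sha256sum /tmp/pkg.rpm | awk '{print $1}')" > /tmp/lock_snippet.yaml

expected=$(sed -n 's/.*sha256:\([0-9a-f]*\).*/\1/p' /tmp/lock_snippet.yaml)
actual=$(sha256sum /tmp/pkg.rpm | awk '{print $1}')

if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH: $actual != $expected" >&2
  exit 1
fi
```

Looping a check like this over every `url`/`checksum` pair in the lockfile (after downloading each RPM) would make the verification auditable in CI.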
42479cd to d602717 (force-pushed)
Actionable comments posted: 2
♻️ Duplicate comments (3)
requirements.x86_64.txt (2)
1375-1377: litellm 1.80.9 + openai 2.9.0: run a quick compat smoke test. Recent openai releases have broken litellm before; keep this pair pinned only after verifying imports and basic calls.
Are there known compatibility issues between litellm==1.80.9 and openai==2.9.0?
1683-1685: mcp 1.23.3: verify SSE streaming in tests (avoid ASGITransport). httpx-sse with Starlette can hang in-process; prefer a real network transport or server subprocess for SSE tests.
httpx-sse 0.4.3 with Starlette 0.50.0 — any known issues with ASGITransport and streaming SSE?

Makefile (1)
118-120: Do not use `--universal` for `faiss-cpu`; compile per platform instead. `faiss-cpu` ships platform/ABI-specific wheels; `--universal` can yield unusable/spec-mismatched entries. Generate separate binary manifests per arch using `--python-platform`.

Apply this minimal change:
```diff
-echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal
+echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.x86_64.txt --generate-hashes --python-version 3.12 --python-platform x86_64-unknown-linux-gnu --no-deps
+echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.aarch64.txt --generate-hashes --python-version 3.12 --python-platform aarch64-unknown-linux-gnu --no-deps
```

If downstream requires a single file, merge these two with explicit PEP 508 markers during prefetch, not via
--universal.
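One way to do that merge, sketched with sed. The file names and single-line pin are simplifications: real hash-pinned entries span continuation lines, and the marker must land on the requirement line only, not the hash lines.

```shell
#!/bin/sh
# Merge two per-arch manifests into one file with explicit
# PEP 508 platform_machine markers appended to each pin.
printf 'faiss-cpu==1.13.0\n' > /tmp/req.x86_64.txt
printf 'faiss-cpu==1.13.0\n' > /tmp/req.aarch64.txt

{
  sed 's/$/ ; platform_machine == "x86_64"/'  /tmp/req.x86_64.txt
  sed 's/$/ ; platform_machine == "aarch64"/' /tmp/req.aarch64.txt
} > /tmp/req.merged.txt
cat /tmp/req.merged.txt
```

Pip then installs only the line whose marker matches the running platform, so the merged file stays usable on both architectures.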
🧹 Nitpick comments (2)
pyproject.toml (1)
125-126: Avoid pinning pip in the dev group; keep build tooling hermetic.
`pybuild-deps` in dev looks good. Pinning `pip==24.3.1` inside the project env is unnecessary with `uv` and can conflict with tooling. Recommend removing it (use container/base image pip where needed).

```diff
     "bandit>=1.8.6",
-    "pybuild-deps==0.5.0",
-    "pip==24.3.1",
+    "pybuild-deps==0.5.0",
```

Makefile (1)
9-10: Version pin added; ensure alignment with resolver inputs. You're pinning `FAISS_VERSION := 1.13.0` while `pyproject.toml` keeps `faiss-cpu>=1.11.0`. Consider pinning in `pyproject.toml` (llslibdev) or documenting the override to prevent resolver drift across manifests.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (13)
- `.tekton/lightspeed-stack-pull-request.yaml` (1 hunks)
- `.tekton/lightspeed-stack-push.yaml` (1 hunks)
- `Containerfile` (2 hunks)
- `Makefile` (2 hunks)
- `pyproject.toml` (1 hunks)
- `requirements-build.txt` (1 hunks)
- `requirements.aarch64.txt` (6 hunks)
- `requirements.binary.txt` (1 hunks)
- `requirements.hermetic.txt` (1 hunks)
- `requirements.x86_64.txt` (6 hunks)
- `rpms.in.yaml` (1 hunks)
- `rpms.lock.yaml` (10 hunks)
- `scripts/remove_torch_deps.sh` (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (8)
- .tekton/lightspeed-stack-pull-request.yaml
- requirements.binary.txt
- scripts/remove_torch_deps.sh
- requirements-build.txt
- .tekton/lightspeed-stack-push.yaml
- requirements.hermetic.txt
- Containerfile
- rpms.in.yaml
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
pyproject.toml: Configure pylint with `source-roots = "src"`
Exclude `src/auth/k8s.py` from pyright type checking
Files:
pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
- `Makefile`
- `requirements.aarch64.txt`
- `requirements.x86_64.txt`
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones
Applied to files:
pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
pyproject.toml
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
- `requirements.aarch64.txt`
- `requirements.x86_64.txt`
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / ci
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
🔇 Additional comments (5)
requirements.x86_64.txt (3)
1866-1868: networkx patch bump (3.6.1): low risk. Looks fine; no action beyond routine CI.
3298-3345: SQLAlchemy 2.0.45: validate greenlet/alembic paths. Patch bump is OK; still, run a minimal engine connect and alembic env import to catch regressions.
```bash
#!/bin/bash
python - <<'PY'
import sqlalchemy, greenlet
from sqlalchemy import create_engine, text

print("sqlalchemy", sqlalchemy.__version__, "greenlet", greenlet.__version__)
# dry connect string; replace with sqlite for CI
e = create_engine("sqlite:///:memory:")
with e.connect() as c:
    c.execute(text("select 1")).scalar()
print("engine OK")
PY
```

Any known issues between SQLAlchemy 2.0.45 and greenlet 3.3.0?
3515-3516: TRL 0.26.0 is compatible with transformers 4.57.3 and accelerate 1.12.0. Transformers 4.57.3 satisfies the minimum requirement of transformers >= 4.46.0, and accelerate 1.12.0 exceeds the minimum requirement of >= 0.26.0. Both versions support SFT/ORPO trainer imports and training loop execution.
Makefile (1)
119-120: LGTM: build-time dependency manifest generation.
`pybuild-deps compile` to emit `requirements-build.txt` from both arch files is a good addition for hermetic builds.

requirements.aarch64.txt (1)
1375-1377: Version bumps look consistent for py312/aarch64. Upgrades for litellm, mcp, networkx, scikit-learn, SQLAlchemy, and trl align with pinned NumPy/SciPy. No red flags spotted.
Please verify on aarch64 with a quick smoke build:
- uv pip install -r requirements.aarch64.txt (in a py312 aarch64 container)
- import checks: python -c "import numpy, scipy, sklearn, sqlalchemy; print('ok')"
Also applies to: 1683-1685, 1867-1868, 3167-3203, 3298-3345, 3515-3516
```text
scikit-learn==1.8.0 \
    --hash=sha256:00d6f1d66fbcf4eba6e356e1420d33cc06c70a45bb1363cd6f6a8e4ebbbdece2 \
    --hash=sha256:0d6ae97234d5d7079dc0040990a6f7aeb97cb7fa7e8945f1999a429b23569e0a \
    --hash=sha256:146b4d36f800c013d267b29168813f7a03a43ecd2895d04861f1240b564421da \
    --hash=sha256:15fc3b5d19cc2be65404786857f2e13c70c83dd4782676dd6814e3b89dc8f5b9 \
    --hash=sha256:2838551e011a64e3053ad7618dda9310175f7515f1742fa2d756f7c874c05961 \
    --hash=sha256:29ffc74089f3d5e87dfca4c2c8450f88bdc61b0fc6ed5d267f3988f19a1309f6 \
    --hash=sha256:2de443b9373b3b615aec1bb57f9baa6bb3a9bd093f1269ba95c17d870422b271 \
    --hash=sha256:35c007dedb2ffe38fe3ee7d201ebac4a2deccd2408e8621d53067733e3c74809 \
    --hash=sha256:3bad7565bc9cf37ce19a7c0d107742b320c1285df7aab1a6e2d28780df167242 \
    --hash=sha256:4496bb2cf7a43ce1a2d7524a79e40bc5da45cf598dbf9545b7e8316ccba47bb4 \
    --hash=sha256:4511be56637e46c25721e83d1a9cea9614e7badc7040c4d573d75fbe257d6fd7 \
    --hash=sha256:5025ce924beccb28298246e589c691fe1b8c1c96507e6d27d12c5fadd85bfd76 \
    --hash=sha256:56079a99c20d230e873ea40753102102734c5953366972a71d5cb39a32bc40c6 \
    --hash=sha256:5e30adb87f0cc81c7690a84f7932dd66be5bac57cfe16b91cb9151683a4a2d3b \
    --hash=sha256:5fb63362b5a7ddab88e52b6dbb47dac3fd7dafeee740dc6c8d8a446ddedade8e \
    --hash=sha256:6b595b07a03069a2b1740dc08c2299993850ea81cce4fe19b2421e0c970de6b7 \
    --hash=sha256:72358cce49465d140cc4e7792015bb1f0296a9742d5622c67e31399b75468b9e \
    --hash=sha256:74b66d8689d52ed04c271e1329f0c61635bcaf5b926db9b12d58914cdc01fe57 \
    --hash=sha256:7cc267b6108f0a1499a734167282c00c4ebf61328566b55ef262d48e9849c735 \
    --hash=sha256:80832434a6cc114f5219211eec13dcbc16c2bac0e31ef64c6d346cde3cf054cb \
    --hash=sha256:8c497fff237d7b4e07e9ef1a640887fa4fb765647f86fbe00f969ff6280ce2bb \
    --hash=sha256:8fdf95767f989b0cfedb85f7ed8ca215d4be728031f56ff5a519ee1e3276dc2e \
    --hash=sha256:9bccbb3b40e3de10351f8f5068e105d0f4083b1a65fa07b6634fbc401a6287fd \
    --hash=sha256:a0bcfe4d0d14aec44921545fd2af2338c7471de9cb701f1da4c9d85906ab847a \
    --hash=sha256:a69525355a641bf8ef136a7fa447672fb54fe8d60cab5538d9eb7c6438543fb9 \
    --hash=sha256:ada8121bcb4dac28d930febc791a69f7cb1673c8495e5eee274190b73a4559c1 \
    --hash=sha256:bf97c10a3f5a7543f9b88cbf488d33d175e9146115a451ae34568597ba33dcde \
    --hash=sha256:c22a2da7a198c28dd1a6e1136f19c830beab7fdca5b3e5c8bba8394f8a5c45b3 \
    --hash=sha256:c2656924ec73e5939c76ac4c8b026fc203b83d8900362eb2599d8aee80e4880f \
    --hash=sha256:c57b1b610bd1f40ba43970e11ce62821c2e6569e4d74023db19c6b26f246cb3b \
    --hash=sha256:eddde82a035681427cbedded4e6eff5e57fa59216c2e3e90b10b19ab1d0a65c3 \
    --hash=sha256:edec98c5e7c128328124a029bceb09eda2d526997780fef8d65e9a69eead963e \
    --hash=sha256:ee787491dbfe082d9c3013f01f5991658b0f38aa8177e4cd4bf434c58f551702 \
    --hash=sha256:f28dd15c6bb0b66ba09728cf09fd8736c304be29409bd8445a080c1280619e8c \
    --hash=sha256:f984ca4b14914e6b4094c5d52a32ea16b49832c03bd17a110f004db3c223e8e1 \
    --hash=sha256:fb65db5d7531bccf3a4f6bec3462223bea71384e2cda41da0f10b7c292b9e7c4 \
    --hash=sha256:fe1c011a640a9f0791146011dfd3c7d9669785f9fed2b2a5f9e207536cf5c2fd
```
🧩 Analysis chain
🌐 Web query:
scikit-learn 1.8.0 numpy scipy version requirements compatibility
💡 Result:
Short answer:
- scikit-learn 1.8.x requires NumPy >= 1.22.0 and SciPy >= 1.8.0. [1]
- SciPy 1.8.x (the minimum SciPy required) supports NumPy >= 1.17.3 and < 1.24.0, so in practice a compatible NumPy range for scikit-learn 1.8 + SciPy 1.8 is: 1.22.0 <= numpy < 1.24.0. [2][1]
Sources:
[1] scikit-learn installation / minimum dependencies.
[2] SciPy 1.8 toolchain / NumPy compatibility table.
🌐 Web query:
scikit-learn 1.8.0 manylinux x86_64 wheel numpy 2.2 scipy 1.16 compatibility
💡 Result:
Short answer: Not guaranteed — SciPy 1.16 supports NumPy 2.2, but scikit-learn 1.8.0 wheels were built expecting NumPy <2 and may be binary-incompatible with NumPy 2.x.
Evidence:
- SciPy 1.16 requires/declares support for NumPy 1.25.2+ (i.e. the 2.x series is within SciPy 1.16's supported range). [1][2]
- scikit-learn 1.8 lists a minimum NumPy of 1.22 but historically did not pin an upper bound and the project tracked NumPy‑2.0 compatibility separately (there are open/closed issues and discussion about binary incompatibility with NumPy 2.x). That means manylinux wheels for 1.8.0 can fail at import if the installed NumPy 2.x ABI differs. [3][4]
Recommendation:
- If you need a reliable binary install, use a scikit-learn wheel built against NumPy 2.x (i.e., a scikit‑learn release that explicitly supports NumPy 2), or install NumPy <2 (e.g. numpy<2) to match scikit‑learn 1.8.0 wheels. Run pip install in a clean venv and test import sklearn to verify. (See scikit‑learn install notes and the NumPy‑2 compatibility discussion.) [3][4][1]
Sources:
- SciPy 1.16 release notes / toolchain (NumPy support). [SciPy docs]. [2]
- SciPy 1.16.0 release notes (requirements). [SciPy docs]. [1]
- scikit-learn install / minimum-dependency info (scikit-learn 1.8). [scikit-learn docs]. [3]
- scikit-learn GitHub discussion re: NumPy 2 compatibility / binary incompatibility issues. [scikit-learn issue]. [4]
scikit-learn 1.8.0 incompatible with numpy 2.2.6; wheels expect NumPy <2.
scikit-learn 1.8.0 wheels were built targeting NumPy < 2 and have ABI incompatibility with NumPy 2.x. While SciPy 1.16.3 declares support for NumPy 2.2.6, using scikit-learn 1.8.0 with NumPy 2.2.6 risks import-time failures or silent data corruption. Either use NumPy <2 (e.g., 1.26.x) with scikit-learn 1.8.0, or upgrade to scikit-learn 1.9+ which explicitly supports NumPy 2.x.
🤖 Prompt for AI Agents
In requirements.x86_64.txt around lines 3166 to 3203, scikit-learn==1.8.0 is
pinned but is ABI-incompatible with NumPy 2.2.6; either pin NumPy to a <2
release (e.g., numpy==1.26.x) consistent with scikit-learn 1.8.0 or upgrade
scikit-learn to 1.9+ which supports NumPy 2.x; update the requirements line
accordingly and regenerate/replace the corresponding hash entries for any
changed package versions to keep the hashlock valid.
```yaml
- url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/x86_64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.x86_64.rpm
  repoid: ubi-9-for-x86_64-appstream-rpms
  size: 93189
  checksum: sha256:2bd6c288e1970a001d3a1ae69166c0d926d9c87ce892edcb2110f4e142c12a7a
  name: libxcrypt-compat
  evr: 4.4.18-3.el9
  sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
```
🧩 Analysis chain
🏁 Script executed:
```bash
#!/bin/bash
# Verify libxcrypt-compat presence across architectures
echo "=== All libxcrypt-compat occurrences ==="
rg -n "libxcrypt-compat" rpms.lock.yaml
echo ""
echo "=== All libxcrypt-devel occurrences ==="
rg -n "libxcrypt-devel" rpms.lock.yaml
echo ""
echo "=== File structure to understand architecture sections ==="
rg -n "^\s*- architecture:" rpms.lock.yaml | head -10
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 736
Ensure libxcrypt-compat is added to aarch64 appstream for architectural parity.
The libxcrypt-compat-4.4.18-3.el9 package is added only to the x86_64 appstream section (lines 501–507), but the aarch64 appstream section lacks a corresponding entry. This creates an architectural inconsistency: libxcrypt-devel exists in both aarch64 and x86_64 appstream sections, yet the compat variant is x86_64-only. For consistency and to avoid potential runtime compatibility issues across architectures, add libxcrypt-compat-4.4.18-3.el9.aarch64.rpm to the aarch64 appstream section.
🤖 Prompt for AI Agents
In rpms.lock.yaml around lines 501–507, libxcrypt-compat is only listed for
x86_64; add a matching aarch64 appstream entry to ensure parity by duplicating
the x86_64 block but changing the URL to the aarch64 package filename
(libxcrypt-compat-4.4.18-3.el9.aarch64.rpm), set the repoid to the aarch64
appstream repoid, and keep the same size, checksum, name, evr and sourcerpm
values so the aarch64 appstream section contains the corresponding
libxcrypt-compat record.
d602717 to b31fd6f (force-pushed)
Actionable comments posted: 1
♻️ Duplicate comments (1)
rpms.lock.yaml (1)
501-507: ⚠️ Duplicate issue: Ensure libxcrypt-compat is added to aarch64 appstream for architectural parity. This architectural inconsistency was flagged in a previous review and remains unresolved. The `libxcrypt-compat-4.4.18-3.el9` package appears only in the x86_64 appstream section (lines 501–507), but the corresponding aarch64 appstream section lacks an entry. Since `libxcrypt-devel` exists in both architecture sections, `libxcrypt-compat` should also be present in both for consistency and to avoid potential runtime compatibility issues.

Add the following entry to the aarch64 appstream packages section (after line 104, maintaining alphabetical order):

```yaml
- url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
  repoid: ubi-9-for-aarch64-appstream-rpms
  size: 85840
  checksum: sha256:8a7e6f2c9d8c5e1b3a2f4e5d6c7b8a9f0e1d2c3b4a5f6e7d8c9b0a1f2e3d4c
  name: libxcrypt-compat
  evr: 4.4.18-3.el9
  sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
```

Note: Update the checksum and size values to match the actual aarch64 package; the values above are placeholders pending the correct artifact metadata.
🧹 Nitpick comments (1)
Containerfile (1)
28-35: Consider cleanup and version flexibility. The googletest build block works correctly for hermetic builds. Two minor observations:

1. The `/tmp/googletest-release-1.12.1` directory isn't cleaned up after installation, leaving build artifacts in the image.
2. The version `1.12.1` is hardcoded; if `artifacts.lock.yaml` is updated, this will need manual synchronization.

```diff
 RUN if [ -f /cachi2/cachi2.env ]; then \
         tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
         cd /tmp/googletest-release-1.12.1 && \
         cmake . && \
         make && \
-        make install; \
+        make install && \
+        rm -rf /tmp/googletest-release-1.12.1; \
     fi
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- `.tekton/lightspeed-stack-pull-request.yaml` (1 hunks)
- `.tekton/lightspeed-stack-push.yaml` (1 hunks)
- `Containerfile` (2 hunks)
- `Makefile` (2 hunks)
- `artifacts.lock.yaml` (1 hunks)
- `pyproject.toml` (1 hunks)
- `requirements-build.txt` (1 hunks)
- `requirements.aarch64.txt` (0 hunks)
- `requirements.binary.txt` (1 hunks)
- `requirements.hermetic.txt` (1 hunks)
- `requirements.x86_64.txt` (0 hunks)
- `rpms.in.yaml` (1 hunks)
- `rpms.lock.yaml` (10 hunks)
- `scripts/remove_torch_deps.sh` (1 hunks)
💤 Files with no reviewable changes (2)
- requirements.aarch64.txt
- requirements.x86_64.txt
✅ Files skipped from review due to trivial changes (1)
- artifacts.lock.yaml
🚧 Files skipped from review as they are similar to previous changes (8)
- .tekton/lightspeed-stack-pull-request.yaml
- Makefile
- requirements.hermetic.txt
- .tekton/lightspeed-stack-push.yaml
- pyproject.toml
- requirements-build.txt
- scripts/remove_torch_deps.sh
- rpms.in.yaml
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.binary.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: build-pr
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / ci
🔇 Additional comments (3)
Containerfile (2)
18-18: LGTM! cmake needed for googletest build. The addition of `cmake` to the system dependencies is required for building googletest from source in the hermetic build flow below.
43-46: Two-stage install approach is sound. The split installation strategy correctly separates:

- Source builds (`--no-binary :all:`) for architecture-specific requirements
- Binary wheel installation for pre-built packages (faiss-cpu, torch)

This aligns with the hermetic build goals. Once the filename issue above is resolved, this should work correctly.
requirements.binary.txt (1)
1-19: LGTM! Hash-pinned binary dependency file. The autogenerated requirements file correctly pins `faiss-cpu==1.13.0` with SHA256 hashes for reproducible hermetic builds. Version 1.13.0 is one patch behind the current 1.13.1, but no known vulnerabilities exist for either version. This intentional pinning of transitive dependencies like faiss-cpu as top-level dependencies aligns with the project's practice for ML/LLM tooling control.
Force-pushed b31fd6f to c9b0a5f (Compare)
Actionable comments posted: 1
♻️ Duplicate comments (2)
rpms.lock.yaml (1)
98-104: **Missing libxcrypt-compat in aarch64 appstream for architectural parity.** This is the same issue flagged in a previous review. `libxcrypt-compat-4.4.18-3.el9` exists in x86_64 appstream (lines 501–507) but is still absent from the aarch64 appstream section. For consistency with `libxcrypt-devel` (which is present in both architectures), add the aarch64 variant of `libxcrypt-compat`.

Add the following entry to the aarch64 appstream packages (after line 104, following `libxcrypt-devel`):

```yaml
- url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
  repoid: ubi-9-for-aarch64-appstream-rpms
  size: 83521
  checksum: sha256:1f6adc387e9b4267a8cbb71e4283b6eb9b0e3f4c8d7f8e9a0b1c2d3e4f5a6b7c
  name: libxcrypt-compat
  evr: 4.4.18-3.el9
  sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
```
Also applies to: 501-507
scripts/remove_torch_deps.sh (1)
50-76: Replicated unresolved empty-line handling bug in faiss-cpu removal. This section reproduces the same logic error flagged in the previous review for the torch removal block (lines 33-42). Line 67 exits the faiss section on any line that doesn't start with 4 spaces, treating empty lines as section terminators. If empty lines appear between `faiss-cpu==<version>` and its hash lines, the script will exit prematurely, leaving orphaned hashes in the file.

Additionally, this creates maintenance burden: two nearly-identical blocks now share the same unfixed bug.

Apply this fix to both the faiss section (lines 50-76) and the torch section (lines 33-42) to explicitly handle empty lines:

```diff
 # If we are in faiss section and line starts with 4 spaces, skip it
 in_faiss_section == 1 && /^    / {
     next  # Skip this line
 }

+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
 # If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
+in_faiss_section == 1 && !/^    / && !/^$/ {
     in_faiss_section = 0
     # Fall through to print this line
 }
```

Apply the same pattern to lines 33-42 (torch section) to resolve the previously flagged issue.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- `.tekton/lightspeed-stack-pull-request.yaml` (1 hunks)
- `.tekton/lightspeed-stack-push.yaml` (1 hunks)
- `Containerfile` (2 hunks)
- `Makefile` (2 hunks)
- `artifacts.lock.yaml` (1 hunks)
- `pyproject.toml` (1 hunks)
- `requirements-build.txt` (1 hunks)
- `requirements.aarch64.txt` (0 hunks)
- `requirements.binary.txt` (1 hunks)
- `requirements.hermetic.txt` (1 hunks)
- `requirements.x86_64.txt` (0 hunks)
- `rpms.in.yaml` (1 hunks)
- `rpms.lock.yaml` (10 hunks)
- `scripts/remove_torch_deps.sh` (1 hunks)
💤 Files with no reviewable changes (2)
- requirements.x86_64.txt
- requirements.aarch64.txt
🚧 Files skipped from review as they are similar to previous changes (5)
- rpms.in.yaml
- Makefile
- artifacts.lock.yaml
- requirements.hermetic.txt
- Containerfile
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
`pyproject.toml`: Configure pylint with `source-roots = "src"`
Exclude `src/auth/k8s.py` from pyright type checking
Files:
pyproject.toml
🧠 Learnings (3)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones
Applied to files:
pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
pyproject.toml
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.binary.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: build-pr
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: server mode / ci
🔇 Additional comments (12)
rpms.lock.yaml (7)
7-34: cmake family packages added consistently across architectures. The cmake, cmake-data, cmake-filesystem, and cmake-rpm-macros packages have been added to both aarch64 and x86_64 appstream sections. The noarch packages (cmake-data, cmake-rpm-macros) have matching checksums across architectures, and EVR values are consistent.
Also applies to: 417-444
63-69: kernel-headers updated consistently across architectures to 5.14.0-611.13.1.el9_7. The kernel-headers package has been updated to the same EVR on both aarch64 and x86_64. Architecture-specific differences in size and checksums are expected.
Also applies to: 480-486
371-391: systemd family updated consistently to 252-55.el9_7.7 across architectures. All three systemd packages (systemd, systemd-pam, systemd-rpm-macros) have been updated to the same EVR on both aarch64 and x86_64. The noarch systemd-rpm-macros package has matching checksums across architectures.
Also applies to: 781-801
343-370: python3 stack packages added consistently across architectures. The python3, python3-libs, python3-pip-wheel, and python3-setuptools-wheel packages have been added to both aarch64 and x86_64 baseos sections with consistent EVR values. The noarch packages (pip-wheel, setuptools-wheel) have matching checksums across architectures.
Also applies to: 753-780
91-97: libuv package added consistently across architectures (1:1.42.0-2.el9_4). The libuv package has been added to both aarch64 and x86_64 appstream sections with consistent EVR (including epoch). Architecture-specific differences in size and checksum are expected.
Also applies to: 494-500
119-125: python-unversioned-command noarch package added consistently (identical checksums). The python-unversioned-command package has been added to both aarch64 and x86_64 appstream sections. As a noarch package, the checksums and sizes correctly match across architectures.
Also applies to: 529-535
42-48: vim-filesystem and emacs-filesystem noarch packages added consistently. Both vim-filesystem (baseos) and emacs-filesystem (appstream) have been added to aarch64 and x86_64 sections with consistent EVR values and matching checksums across architectures, as expected for noarch packages.
Also applies to: 452-458, 406-412, 816-822
scripts/remove_torch_deps.sh (1)
79-85: Two-stage file replacement and verification logic is sound. The cleanup and verification approach (lines 80-81, 85) properly manages the temporary files and provides useful visibility into applied changes. Once the faiss/torch section logic is corrected, this segment will be in good shape.
requirements.binary.txt (1)
1-19: Autogenerated file structure is valid; verify faiss-cpu version transition. The file is correctly formatted as a uv pip-compile output with proper hash metadata for multi-architecture binary wheels. However, the AI summary indicates that requirements.aarch64.txt and requirements.x86_64.txt previously contained faiss-cpu==1.13.1, and this file pins 1.13.0. Confirm whether this version downgrade is intentional or if 1.13.0 should be updated to 1.13.1 for consistency.
Also verify that version 1.13.0 satisfies the constraint in pyproject.toml line 148: `faiss-cpu>=1.11.0`.

.tekton/lightspeed-stack-pull-request.yaml (1)
36-68: Verify prefetch-input JSON structure and all referenced files are present. The transition from a single-line string to a structured JSON array is a significant change to the Tekton pipeline configuration. The structure includes three entries (rpm, pip, generic) with new fields for binary requirements and build files. Verify:

- The JSON structure is valid and compatible with the `prefetch-dependencies-oci-ta:0.2` task (line 228).
- All referenced files exist:
  - ✓ requirements.binary.txt (present)
  - ✓ requirements-build.txt (present, but has critical issues per separate review)
  - ? artifacts.lock.yaml (not provided; confirm it exists and is valid YAML)
- The `binary` object fields (os, arch, py_version, packages) align with task expectations.

Note: Once requirements-build.txt duplicate versions are resolved, this configuration should be re-validated.
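The first checklist item (well-formed JSON) can be verified locally before pushing the pipeline change. The sketch below uses a hypothetical minimal prefetch-input value, not the exact parameter from the pipeline, and relies only on `python3 -m json.tool` being available:

```shell
# Validate a prefetch-input-style JSON array locally.
# The sample value is illustrative; substitute the real parameter value.
prefetch_input='[
  {"type": "rpm"},
  {"type": "pip",
   "requirements_files": ["requirements.binary.txt"],
   "requirements_build_files": ["requirements-build.txt"]},
  {"type": "generic"}
]'

# json.tool exits non-zero on malformed JSON, so this doubles as a CI gate.
if printf '%s' "$prefetch_input" | python3 -m json.tool > /dev/null; then
  echo "prefetch-input: valid JSON"
else
  echo "prefetch-input: INVALID JSON" >&2
fi
```

Whether the structure is also semantically acceptable to the prefetch task still has to be checked against that task's own schema; this only catches syntax breakage.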
.tekton/lightspeed-stack-push.yaml (1)
34-65: Identical prefetch-input structure as pull-request pipeline; same verification requirements apply. This file contains the same prefetch-input restructure as lightspeed-stack-pull-request.yaml (lines 36-68). Consistency across both pipeline definitions is good, but the same verification steps apply: confirm JSON validity, file presence, and Tekton task compatibility. See the detailed comment on the pull-request file for the full verification checklist.
pyproject.toml (1)
125-126: Dependencies are appropriate for the project. Both `pybuild-deps==0.5.0` (released March 2025) and `pip==24.3.1` (released October 2024) are current and compatible with the Python 3.12+ requirement. The pybuild-deps addition is justified by its use in the Makefile's `konflux-requirements` target, and both versions are appropriately pinned in the dev group for reproducible builds without conflicts.
Force-pushed c9b0a5f to 96c07b7 (Compare)
Actionable comments posted: 1
♻️ Duplicate comments (2)
rpms.lock.yaml (1)
501-507: ⚠️ Architectural parity gap: libxcrypt-compat still missing from aarch64 appstream (duplicate of previous comment). This issue was flagged in a prior review: `libxcrypt-compat-4.4.18-3.el9` is present only in x86_64 appstream (lines 501–507) but is still absent from the aarch64 appstream section. To ensure consistency and avoid potential cross-architecture compatibility issues, add a corresponding aarch64 entry for `libxcrypt-compat-4.4.18-3.el9.aarch64.rpm` to the aarch64 appstream package list (e.g., after the `libxcrypt-devel` entry at line 104).

Apply this diff to add libxcrypt-compat to aarch64 appstream:

```diff
   - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-devel-4.4.18-3.el9.aarch64.rpm
     repoid: ubi-9-for-aarch64-appstream-rpms
     size: 33051
     checksum: sha256:9d621f33df35b9c274b8d65457d6c67fc1522b6c62cf7b2341a4a99f39a93507
     name: libxcrypt-devel
     evr: 4.4.18-3.el9
     sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
+  - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
+    repoid: ubi-9-for-aarch64-appstream-rpms
+    size: 93189
+    checksum: sha256:2bd6c288e1970a001d3a1ae69166c0d926d9c87ce892edcb2110f4e142c12a7a
+    name: libxcrypt-compat
+    evr: 4.4.18-3.el9
+    sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
   - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/o/oniguruma-6.9.6-1.el9.5.aarch64.rpm
```

requirements-build.txt (1)
7-245: CRITICAL: Duplicate package versions persist and will cause pip install failures. This file still contains the critical duplicate package versions that were flagged in previous reviews:
- cython: 3.1.7 (line 17) and 3.2.2 (line 19)
- hatchling: 1.26.3 (line 76) and 1.28.0 (line 81)
- meson-python: 0.16.0 (line 131) and 0.18.0 (line 133)
- setuptools-scm: 7.1.0 (line 202) and 9.2.2 (line 204)

pip cannot resolve a single requirements file containing conflicting versions of the same package. This will fail during dependency installation.
This must be resolved before merging. The pybuild-deps compilation (line 5) merged architecture-specific outputs without deduplicating conflicting versions. You must either:
- Restore separate architecture-specific build requirement files instead of merging
- Update the pybuild-deps merge process to select a single compatible version per package across architectures
- Use constraint files to manage architecture-specific pins without creating conflicts in a single file
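Whichever option is chosen, this class of conflict can be caught mechanically before pip ever runs. A sketch (the sample data is synthetic; in CI you would feed it requirements-build.txt instead):

```shell
# Report any package pinned at more than one version in a requirements file.
# Sample input is synthetic; replace with the real file's contents in CI.
reqs=$(printf 'cython==3.1.7\ncython==3.2.2\nhatchling==1.28.0\n')

dupes=$(printf '%s\n' "$reqs" \
  | grep -E '^[A-Za-z0-9._-]+==' \
  | cut -d'=' -f1 \
  | sort | uniq -d)

if [ -n "$dupes" ]; then
  echo "duplicate pins found: $dupes"
else
  echo "no duplicate pins"
fi
```

Running a check like this as part of the `konflux-requirements` regeneration step would fail fast instead of surfacing the conflict deep inside the hermetic build.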
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- `.tekton/lightspeed-stack-pull-request.yaml` (1 hunks)
- `.tekton/lightspeed-stack-push.yaml` (1 hunks)
- `Containerfile` (2 hunks)
- `Makefile` (2 hunks)
- `artifacts.lock.yaml` (1 hunks)
- `pyproject.toml` (1 hunks)
- `requirements-build.txt` (1 hunks)
- `requirements.aarch64.txt` (0 hunks)
- `requirements.binary.txt` (1 hunks)
- `requirements.hermetic.txt` (1 hunks)
- `requirements.x86_64.txt` (0 hunks)
- `rpms.in.yaml` (1 hunks)
- `rpms.lock.yaml` (10 hunks)
- `scripts/remove_torch_deps.sh` (1 hunks)
💤 Files with no reviewable changes (2)
- requirements.x86_64.txt
- requirements.aarch64.txt
✅ Files skipped from review due to trivial changes (1)
- requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (7)
- rpms.in.yaml
- artifacts.lock.yaml
- scripts/remove_torch_deps.sh
- Makefile
- .tekton/lightspeed-stack-pull-request.yaml
- pyproject.toml
- Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: server mode / azure
🔇 Additional comments (1)
.tekton/lightspeed-stack-push.yaml (1)
33-64: Prefetch-input restructuring improves readability. The conversion from single-line JSON to a multi-line formatted block with three entries (rpm, pip, generic) enhances maintainability. The pip entry properly extends the configuration to include:

- `requirements.binary.txt` in the requirements_files list
- A new `requirements_build_files: ["requirements-build.txt"]` entry
- A `binary` block specifying torch and faiss-cpu for x86_64 and aarch64 architectures

However, this depends on the validity of `requirements-build.txt`, which currently contains critical duplicate package versions (see separate review comment on requirements-build.txt). Those duplicates must be resolved before this pipeline configuration can succeed.

Please verify that the prefetch-dependencies task (lines 209-233) can successfully process the new multi-entry prefetch-input structure and that it properly handles the binary block configuration.
requirements.hermetic.txt
Outdated
```text
# List the build tools required for hermetic builds.
# Manually edited, hashes are not required.
uv==0.9.16
pip==25.3
patchelf==0.17.2.4
cmake==4.2.0
ninja==1.13.0
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
🧩 Analysis chain
🌐 Web query:
latest stable release uv python package manager December 2025
💡 Result:
The latest stable release of the uv Python package is v0.9.17, published on Dec 9, 2025. [1][2]
Sources:
- PyPI release page for uv (uv‑0.9.17 uploaded Dec 9, 2025). [1]
- astral-sh/uv GitHub repository (project homepage). [2]
🌐 Web query:
latest stable release patchelf cmake ninja December 2025
💡 Result:
- CMake — latest stable: 4.2.1 (released Dec 10, 2025). [1]
- patchelf — latest upstream stable: 0.18.0 (tagged 0.18.0). [2]
- Ninja — latest stable: v1.13.2 (released Nov 20, 2025). [3]
Sources:
[1] CMake announcement / download page.
[2] NixOS / patchelf releases (GitHub).
[3] ninja-build releases (GitHub).
🌐 Web query:
pip latest stable version December 2025
💡 Result:
The latest stable pip release as of December 11, 2025 is pip 25.3 (released Oct 25, 2025). [1][2]
Sources:
[1] PyPI pip project — release files (pip 25.3 uploaded Oct 25, 2025).
[2] pip release process / cadence (quarterly releases).
Update build tool versions to current stable releases.
The pinned versions are partially outdated:
- uv==0.9.16 → update to 0.9.17 (released Dec 9, 2025)
- pip==25.3 is current ✓
- patchelf==0.17.2.4 → update to 0.18.0 (significant version gap)
- cmake==4.2.0 → update to 4.2.1 (released Dec 10, 2025)
- ninja==1.13.0 → update to 1.13.2 (released Nov 20, 2025)
Updating to the latest stable releases ensures you benefit from recent fixes and improvements without sacrificing reproducibility.
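The bumps listed above can be applied mechanically. A sketch using GNU sed on a throwaway copy of the pin file (the file contents are reproduced here for illustration; in practice you would edit requirements.hermetic.txt directly):

```shell
# Bump pinned build-tool versions in a requirements-style pin file.
# Operates on a temporary copy; GNU sed -i syntax (macOS needs sed -i '').
pins=$(mktemp)
printf 'uv==0.9.16\npip==25.3\npatchelf==0.17.2.4\ncmake==4.2.0\nninja==1.13.0\n' > "$pins"

# Each expression anchors on the package name, so unrelated pins are untouched.
sed -i \
  -e 's/^uv==.*/uv==0.9.17/' \
  -e 's/^patchelf==.*/patchelf==0.18.0/' \
  -e 's/^cmake==.*/cmake==4.2.1/' \
  -e 's/^ninja==.*/ninja==1.13.2/' \
  "$pins"

cat "$pins"   # pip==25.3 is left as-is, since it is already current
```

After editing the real file, re-run the hermetic build flow and regenerate any dependent lockfiles to confirm the new tool versions are compatible.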
🤖 Prompt for AI Agents
In requirements.hermetic.txt around lines 1 to 7, the pinned build-tool versions
are outdated; update the pins to the requested stable releases: set uv==0.9.17,
keep pip==25.3, set patchelf==0.18.0, set cmake==4.2.1, and set ninja==1.13.2;
after updating, run your hermetic build/test flow (and regenerate any dependent
lockfiles or build artifacts) to confirm compatibility.
Force-pushed 96c07b7 to 918386f (Compare)
/retest
Force-pushed 016632f to e775f64 (Compare)
Actionable comments posted: 0
♻️ Duplicate comments (3)
rpms.lock.yaml (1)
501-507: Architectural inconsistency: libxcrypt-compat missing for aarch64. This issue was already flagged in a previous review. The `libxcrypt-compat` package is present only for x86_64 but not for aarch64, creating architectural asymmetry. The aarch64 appstream section should include a corresponding `libxcrypt-compat-4.4.18-3.el9.aarch64.rpm` entry.

requirements-build.txt (1)
19-30: Duplicate package versions still present in generated requirements file. This critical issue was already flagged in previous reviews. The file contains conflicting pinned versions:

- cython: 3.1.7 and 3.2.3
- hatchling: 1.26.3 and 1.28.0
- meson-python: 0.16.0 and 0.18.0
- setuptools-scm: 7.1.0 and 9.2.2
Also applies to: 81-122, 138-146, 213-230
Containerfile (1)
42-49: Filename mismatch: `requirements-binary.txt` should be `requirements.binary.txt`. This issue was flagged in a previous review. Line 46 references `requirements-binary.txt` (with hyphen), but based on the repository's naming convention with other requirements files (requirements.*.txt), the actual file is likely `requirements.binary.txt` (with dot). This will cause pip install to fail.
🧹 Nitpick comments (2)
.tekton/lightspeed-stack-push.yaml (1)
67-68: Missing `hermeto-mode` parameter declaration in pipelineSpec. The `hermeto-mode` parameter is defined at the PipelineRun level (lines 67-68) and passed to the `prefetch-dependencies` task (lines 221-222), but it's not declared in the `pipelineSpec.params` section (lines 77-152). While this may work due to Tekton's parameter passthrough behavior, explicitly declaring it improves clarity and enables proper validation.

🔎 Suggested addition to pipelineSpec.params

Add after line 112 (after the `hermetic` parameter):

```yaml
- default: ""
  description: Mode for hermeto dependency prefetching (e.g., permissive, strict)
  name: hermeto-mode
  type: string
```

Also applies to: 221-222
.tekton/lightspeed-stack-pull-request.yaml (1)
70-71: Missing `hermeto-mode` parameter declaration in pipelineSpec. Same issue as in `lightspeed-stack-push.yaml`: the `hermeto-mode` parameter is used at the PipelineRun level and passed to `prefetch-dependencies`, but not declared in `pipelineSpec.params` (lines 80-157). Add the parameter declaration for consistency and proper validation.

🔎 Suggested addition to pipelineSpec.params

Add after line 111 (after the `hermetic` parameter):

```yaml
- default: ""
  description: Mode for hermeto dependency prefetching (e.g., permissive, strict)
  name: hermeto-mode
  type: string
```

Also applies to: 230-231
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- `.tekton/lightspeed-stack-pull-request.yaml`
- `.tekton/lightspeed-stack-push.yaml`
- `Containerfile`
- `Makefile`
- `artifacts.lock.yaml`
- `pyproject.toml`
- `requirements-build.txt`
- `requirements.aarch64.txt`
- `requirements.binary.txt`
- `requirements.hermetic.txt`
- `requirements.x86_64.txt`
- `rpms.in.yaml`
- `rpms.lock.yaml`
- `scripts/remove_torch_deps.sh`
💤 Files with no reviewable changes (2)
- requirements.x86_64.txt
- requirements.aarch64.txt
✅ Files skipped from review due to trivial changes (1)
- requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (5)
- artifacts.lock.yaml
- pyproject.toml
- scripts/remove_torch_deps.sh
- requirements.hermetic.txt
- Makefile
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / azure
🔇 Additional comments (6)
rpms.in.yaml (1)
1-1: LGTM - cmake addition aligns with build requirements. The addition of `cmake` to the package list is consistent with the Containerfile changes that use cmake to build googletest from source during hermetic builds.

rpms.lock.yaml (1)
7-34: LGTM - cmake packages properly added for both architectures. The cmake-related packages (cmake, cmake-data, cmake-filesystem, cmake-rpm-macros) are consistently added for both aarch64 (lines 7-34) and x86_64 (lines 417-444), ensuring architectural parity for build tooling.
Also applies to: 417-444
Containerfile (2)
18-18: LGTM - cmake installation for build tooling. Adding `cmake` to the dnf install is necessary for building googletest from source in the hermetic build flow.
28-35: The error handling concern is incorrect; proper error detection is already in place. The current code uses `&&` operators between all commands within the if block (`tar && cd && cmake && make && make install`), which ensures early failure detection if any step fails. The final semicolon before `fi` is correct shell syntax for terminating the compound command.

However, the suggested cleanup step is valuable: adding `rm -rf /tmp/googletest-release-1.12.1 &&` before the final semicolon would reduce the image layer size by removing the temporary build directory after installation.

The googletest artifact (`googletest-release-1.12.1.tar.gz`) is correctly defined in `artifacts.lock.yaml`.

Likely an incorrect or invalid review comment.
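The extract/build/install/cleanup pattern discussed above can be exercised in miniature. The sketch below uses a toy tarball and a `cp` standing in for the real cmake/make steps, so it only demonstrates the `&&` chaining and cleanup behavior, not the actual googletest build:

```shell
# Sketch: extract, "install", then clean up, all chained with && so the
# first failure aborts the remaining steps. Toy data, not googletest.
workdir=$(mktemp -d)
prefix=$(mktemp -d)

# Prepare a toy source tarball.
mkdir -p "$workdir/src-1.0"
echo 'artifact' > "$workdir/src-1.0/lib.txt"
tar -C "$workdir" -czf "$workdir/src-1.0.tar.gz" src-1.0

# The chain mirrors the Containerfile: any failed step stops the rest,
# and the trailing rm -rf keeps the (image layer's) footprint small.
tar -C "$workdir" -xzf "$workdir/src-1.0.tar.gz" \
  && cp "$workdir/src-1.0/lib.txt" "$prefix/lib.txt" \
  && rm -rf "$workdir/src-1.0" \
  && echo "installed to $prefix"
```

If, say, the extraction failed, neither the install nor the cleanup would run, and the non-zero exit status would fail the enclosing RUN instruction.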
.tekton/lightspeed-stack-push.yaml (1)
33-64: Well-structured prefetch-input configuration. The multi-entry JSON array properly separates rpm, pip, and generic prefetch types. The pip entry correctly includes the new requirements files and binary package configuration for hermetic builds.
.tekton/lightspeed-stack-pull-request.yaml (1)
36-67: Prefetch configuration consistent with push pipeline. The prefetch-input structure correctly mirrors the push.yaml configuration, ensuring consistent behavior between pull request and push pipelines.
Actionable comments posted: 2
♻️ Duplicate comments (5)
Containerfile (1)
43-47: Critical: Verify the requirements file naming convention. Line 47 references `requirements-binary.txt` (with hyphen). Past review comments indicate this file doesn't exist and should be `requirements.binary.txt` (with dot). Although a previous review marked this as "Addressed in commit e775f64", the reference still uses the hyphen format. Verify which filename is correct to avoid build failures.

```shell
#!/bin/bash
# Description: Check which requirements files actually exist in the repository

echo "=== Find all requirements files with 'binary' in the name ==="
fd -t f "requirements.*binary.*\.txt" || echo "No files found with fd"

echo -e "\n=== List all requirements*.txt files ==="
fd -t f "^requirements.*\.txt$"

echo -e "\n=== Check Containerfile references ==="
rg -n "requirements.*binary" Containerfile
```

scripts/remove_torch_deps.sh (1)
33-42: Past review issues remain unaddressed: empty-line handling bug in both sections. Previous reviews identified that the AWK logic in both the torch removal section (lines 33-42) and the newly added faiss-cpu removal section (lines 50-76) incorrectly exits the section when encountering empty lines. Lines 39 and 67 use the pattern `!/^    /`, which treats any non-indented line (including blank lines) as a section terminator, potentially leaving orphaned hash lines.
Apply this fix to the torch section:
+# If we are in torch section and line is empty, skip it +in_torch_section == 1 && /^$/ { + next # Skip empty line +} + # If we are in torch section and line does NOT start with 4 spaces, exit torch section -in_torch_section == 1 && !/^ / { +in_torch_section == 1 && !/^ / && !/^$/ { in_torch_section = 0 # Fall through to print this line }Apply the same fix to the faiss section:
+# If we are in faiss section and line is empty, skip it +in_faiss_section == 1 && /^$/ { + next # Skip empty line +} + # If we are in faiss section and line does NOT start with 4 spaces, exit faiss section -in_faiss_section == 1 && !/^ / { +in_faiss_section == 1 && !/^ / && !/^$/ { in_faiss_section = 0 # Fall through to print this line }Also applies to: 50-76
rpms.lock.yaml (1)
550-556: Architectural inconsistency: libxcrypt-compat missing for aarch64. The `libxcrypt-compat-4.4.18-3.el9` package is added only to the x86_64 appstream section (lines 550-556), while the aarch64 appstream section lacks a corresponding entry. Since `libxcrypt-devel` exists in both architectures (aarch64 at lines 105-111, x86_64 at lines 557-563), the absence of the compat variant for aarch64 creates an architectural inconsistency that may cause compatibility issues.

🔎 Recommended fix

Add the aarch64 variant of libxcrypt-compat to the aarch64 appstream section (after line 111):

```yaml
- url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
  repoid: ubi-9-for-aarch64-appstream-rpms
  size: [appropriate_size]
  checksum: sha256:[appropriate_checksum]
  name: libxcrypt-compat
  evr: 4.4.18-3.el9
  sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
```

Note: You'll need to obtain the correct size and checksum values for the aarch64 package.
Makefile (1)
117-119: Drop `--universal` when compiling faiss-cpu requirements (same issue as prior review).

`faiss-cpu` ships platform-specific binary wheels, so combining it with `uv pip compile ... --universal` is semantically inconsistent and was already flagged earlier; keep the per-Python resolution and remove `--universal` while continuing to use 3.12 as the target. Torch line and `pybuild-deps` usage look fine.

Suggested Makefile adjustment:

```diff
-	echo "torch==${TORCH_VERSION}" | uv pip compile - -o requirements.torch.txt --generate-hashes --python-version 3.12 --torch-backend cpu --emit-index-url --no-deps --index-url https://download.pytorch.org/whl/cpu --refresh
-	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal
+	echo "torch==${TORCH_VERSION}" | uv pip compile - -o requirements.torch.txt --generate-hashes --python-version 3.12 --torch-backend cpu --emit-index-url --no-deps --index-url https://download.pytorch.org/whl/cpu --refresh
+	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --refresh
```

requirements-build.txt (1)
7-256: Duplicate package versions remain unresolved. The critical issue previously identified persists: multiple conflicting versions of the same packages (cython, hatchling, meson-python, setuptools-scm) are still present in this autogenerated file. This indicates the pybuild-deps merge process has not been corrected to deduplicate architecture-specific requirements.
🧹 Nitpick comments (2)
Containerfile (1)
29-36: Minor typo in comment. Line 29 comment says "gookletest" but the actual path correctly references "googletest". Consider fixing the typo for clarity.
🔎 Proposed fix
-# Extract generic dependencies from artifacts.lock.yaml and install them +# Extract and build googletest from generic dependenciesrequirements.aarch64.txt (1)
1935-1937: Faiss-cpu no longer pinned top‑level here; confirm this matches the new hermetic/binary strategy
`numpy` is still recorded as being pulled in "via faiss-cpu", but there is no explicit `faiss-cpu==…` block in this compiled requirements file anymore. Given the prior practice in this repo of pinning transitive deps like `faiss-cpu` as top-level entries to avoid drift, please double-check that:
- The new
requirements.binary.txt/FAISS_VERSIONworkflow is now the single source of truth forfaiss-cpuversions across arches, and- You’re comfortable that dev/CI environments that install from
requirements.aarch64.txtwon’t silently pick up a different faiss-cpu version than what the binary pipeline expects.If those guarantees hold, a short comment/ADR reference in the build tooling noting the deliberate shift away from pinning
faiss-cpuhere would prevent future “clean‑up” PRs from trying to re-add it.Based on learnings, ...
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- `.tekton/lightspeed-stack-pull-request.yaml`
- `.tekton/lightspeed-stack-push.yaml`
- `Containerfile`
- `Makefile`
- `artifacts.lock.yaml`
- `pyproject.toml`
- `requirements-build.txt`
- `requirements.aarch64.txt`
- `requirements.binary.txt`
- `requirements.hermetic.txt`
- `requirements.x86_64.txt`
- `rpms.in.yaml`
- `rpms.lock.yaml`
- `scripts/remove_torch_deps.sh`
🚧 Files skipped from review as they are similar to previous changes (5)
- requirements.hermetic.txt
- artifacts.lock.yaml
- requirements.binary.txt
- rpms.in.yaml
- .tekton/lightspeed-stack-pull-request.yaml
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
`pyproject.toml`: Configure pylint with `source-roots = "src"`
Exclude `src/auth/k8s.py` from pyright type checking
Files:
pyproject.toml
🧠 Learnings (3)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
- Makefile
- requirements.x86_64.txt
- requirements.aarch64.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
- Containerfile
- pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones
Applied to files:
pyproject.toml
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
🔇 Additional comments (5)
requirements.aarch64.txt (1)
2087-2178: Pillow bump to 12.1.0 with refreshed hashes looks good; validate cross-arch/tooling alignment

The version+hash set is consistent with a fresh uv compile and is fine from this file's perspective. Please just verify that:

- requirements.x86_64.txt and any image/venv build paths use the same Pillow version, and
- Nothing in your image/build pipeline assumes the previous Pillow minor (e.g., prebuilt wheels, cached layers).
requirements.x86_64.txt (1)
2087-2183: Pillow bump and hashes look consistent.

The upgrade to pillow==12.1.0 with regenerated hashes is self-contained and matches the existing usage paths; no issues from this file's perspective.

Makefile (1)
9-10: Confirm FAISS_VERSION change is intentional.

You're now pinning faiss-cpu via FAISS_VERSION := 1.13.0 and generating it separately; given this project's habit of tightly pinning faiss-cpu, please double-check the version choice (especially if it was previously 1.13.2) and that all consumers now pull from requirements.binary.txt as intended.

Based on learnings, this repo deliberately pins faiss-cpu to control the dependency graph, so version shifts are worth an explicit check.
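That consistency check is easy to script. The sketch below uses demo files in /tmp standing in for the real Makefile and requirements.binary.txt, and the sed patterns assume the exact `FAISS_VERSION := x.y.z` / `faiss-cpu==x.y.z` line formats described in this PR — both are assumptions to adjust against the actual repo layout:

```shell
#!/bin/sh
# Sketch: assert that the Makefile's FAISS_VERSION matches the faiss-cpu
# pin in requirements.binary.txt. Demo files stand in for the real ones
# so this runs anywhere; point the paths at the repo to use it for real.
set -eu

printf 'FAISS_VERSION := 1.13.0\n' > /tmp/Makefile.demo
printf 'faiss-cpu==1.13.0\n' > /tmp/requirements.binary.demo

mk_ver=$(sed -n 's/^FAISS_VERSION := //p' /tmp/Makefile.demo)
req_ver=$(sed -n 's/^faiss-cpu==//p' /tmp/requirements.binary.demo)

if [ "$mk_ver" = "$req_ver" ]; then
  echo "faiss-cpu pin consistent: $mk_ver"
else
  echo "MISMATCH: Makefile=$mk_ver requirements=$req_ver" >&2
  exit 1
fi
```

Wired into CI, a check like this would catch a Makefile bump that forgets to regenerate the binary requirements file (or vice versa).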
.tekton/lightspeed-stack-push.yaml (2)
221-222: Parameter passing is syntactically correct.

The mode parameter correctly references $(params.hermeto-mode). This will pass the value permissive to the prefetch-dependencies task, assuming the hermeto-mode parameter is valid (see comment on lines 67-68 for verification).
67-68: Verify that the prefetch-dependencies-oci-ta:0.2 task accepts the "mode" parameter.

The hermeto-mode parameter is being passed to the prefetch-dependencies-oci-ta task as a mode parameter (lines 221-222), but this parameter does not appear in the documented parameters for task-prefetch-dependencies-oci-ta v0.2. Confirm:
- Whether the task actually accepts this parameter
- If this is intended or a misconfiguration
- Whether the value permissive is valid for this context
| "pybuild-deps==0.5.0", | ||
| "pip==24.3.1", |
🧩 Analysis chain
🏁 Script executed:
# First, locate and read the pyproject.toml file to understand the context
find . -name "pyproject.toml" -type f | head -1

Repository: lightspeed-core/lightspeed-stack
Length of output: 92
🏁 Script executed:
# Read the pyproject.toml file around lines 125-126 to see the context
wc -l pyproject.toml && sed -n '115,135p' pyproject.toml

Repository: lightspeed-core/lightspeed-stack
Length of output: 612
🌐 Web query:
pybuild-deps 0.5.0 PyPI package
💡 Result:
pybuild-deps 0.5.0 — summary
- Package: pybuild-deps. Purpose: CLI tools to detect PEP-517 build dependencies / generate pinned build requirements. Install: pip install pybuild-deps. [1][2]
- Latest release: 0.5.0 (released Mar 15, 2025). [1]
- Files uploaded: sdist pybuild_deps-0.5.0.tar.gz (26.4 kB) and wheel pybuild_deps-0.5.0-py3-none-any.whl (29.3 kB). Uploaded via twine; Trusted Publishing: Yes. [1]
- Hashes (SHA256):
- sdist: fa488db42cc53f93926ccb55ef56fb300fbd7769d31a56ebc7f83f11e28aeac8
- wheel: 4cc5b8634b5aac371755a7ff33da1f47cf528938e419c1fb943cc95a8c3337e7. [1]
- License: GPL-3.0. [1][2]
- Docs: command-line reference and usage on ReadTheDocs. [2]
- Security/status: listed as v0.5.0 with no known vulnerabilities on Safety DB (summary). [3]
Sources:
[1] PyPI project page (pybuild-deps) — release history, files, hashes, upload date.
[2] pybuild-deps documentation (ReadTheDocs) — description and usage.
[3] Safety DB package summary for pybuild-deps.
🌐 Web query:
pip 24.3.1 PyPI release history
💡 Result:
pip 24.3.1 — released 2024-10-27. It’s documented in the pip changelog (24.3.1: “Allow multiple nested inclusions of the same requirements file again”) and listed on the PyPI project page. [1][2]
Sources:
- pip changelog / news (24.3.1 entry). [1]
- pip on PyPI (release history). [2]
🌐 Web query:
pybuild-deps 0.5.0 security vulnerability CVE
💡 Result:
I found no CVE or public vulnerability record for pybuild-deps 0.5.0. Sources: Safety DB shows "No known vulnerabilities" for pybuild-deps 0.5.0 [1], and the PyPI release page for 0.5.0 (Mar 15, 2025) has no security advisory listed [2].
References:
- Safety DB — pybuild-deps package page. [1]
- PyPI — pybuild-deps 0.5.0 release page. [2]
If you want, I can run a deeper scan (OSV/NVD/GitHub Advisory searches) and report back with any related findings.
🌐 Web query:
pip 24.3.1 security vulnerability
💡 Result:
Short answer: Yes — pip 24.3.1 is known to have security issues. Specifically:
- Arbitrary code execution risk affecting pip versions from 24.1b1 up to (but not including) 25.0; 24.3.1 falls in that range. Upgrade to ≥25.0 to remediate this. [1][2]
- A symlink / arbitrary file-write vulnerability in pip’s fallback sdist extraction affects versions <25.2 (including 24.3.1) when running on Python releases that don’t implement PEP 706; upgrade to ≥25.2 and/or use a Python version that implements PEP 706 to mitigate. [1][3]
Recommendation: upgrade pip to the latest fixed release (at least 25.2, or 25.0+ to address the code-execution issue) and audit any environments where pip 24.x was used. [1][3]
Sources:
[1] Snyk — pip vulnerabilities (lists Arbitrary Code Execution and Symlink issues).
[2] Snyk advisory — Arbitrary Code Execution in pip (disclosed Feb 2025).
[3] CVE-2025-8869 / vendor advisories describing symlink/arbitrary file-write (affects pip <25.2).
Upgrade pip to version 25.2 or later due to critical security vulnerabilities.
pybuild-deps==0.5.0 is valid and has no known vulnerabilities. However, pip==24.3.1 has two critical security issues:
- Arbitrary code execution vulnerability affecting versions 24.1b1 through <25.0
- Symlink/arbitrary file-write vulnerability affecting versions <25.2
Upgrade pip to at least 25.2 to address both vulnerabilities.
🤖 Prompt for AI Agents
In pyproject.toml around lines 125-126, the pinned dependency pip==24.3.1 must
be updated to a secure release; change the pip entry to at least pip==25.2 (or a
later stable 25.x/26.x version) in the dependencies list, then regenerate your
lockfile/virtual env (e.g., run poetry lock/install or pip-tools sync) and run
your test suite/CI to ensure no breakage from the pip upgrade.
Actionable comments posted: 1
♻️ Duplicate comments (3)
requirements-build.txt (1)
19-31: Resolve duplicate/conflicting pins for core build tools before relying on this file.

This file still contains conflicting versions of the same packages, which makes it an unsafe single source of truth for build deps:

- cython: 3.1.7 and 3.2.3
- hatchling: 1.26.3 and 1.28.0
- meson-python: 0.16.0 and 0.18.0
- setuptools-scm: 7.1.0 and 9.2.2

Pip will only install one version per env, so this reflects a broken merge of the arch-specific inputs rather than a coherent build lock. The generator/merge step for requirements-build.txt should dedupe by package name and select a single version (or keep separate per-arch build requirement files / use -c constraints instead of merging).

Until that's fixed and this file is regenerated, you can't reliably treat it as a hermetic build manifest.
#!/bin/bash
# Quick check for remaining duplicate pins in requirements-build.txt
awk '{print $1}' requirements-build.txt \
  | grep -E '^[a-zA-Z0-9_.-]+==[0-9]' \
  | sed 's/==.*$//' \
  | sort | uniq -d

Also applies to: 81-100, 138-147, 213-230
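For the generator side, here is a minimal sketch of the missing dedupe step. The keep-the-highest-version policy and the one-pin-per-line input are assumptions: a real uv-compiled file carries --hash continuation lines that would need to be merged along with the pins, and picking the highest version is not always the compatible choice.

```shell
#!/bin/sh
# Sketch: collapse duplicate "pkg==version" pins, keeping the highest
# version of each package. The demo input mimics the duplicated entries
# flagged above.
set -eu

cat > /tmp/reqs-demo.txt <<'EOF'
cython==3.1.7
cython==3.2.3
hatchling==1.26.3
hatchling==1.28.0
meson-python==0.18.0
EOF

# Version-sort so the highest version of each package comes last, then
# let awk keep only the last line seen per package name.
sort -t= -k1,1 -k3,3V /tmp/reqs-demo.txt \
  | awk -F'==' '{latest[$1] = $0} END {for (p in latest) print latest[p]}' \
  | sort > /tmp/reqs-deduped.txt

cat /tmp/reqs-deduped.txt  # one pin per package survives
```

Running this kind of pass after merging the per-arch pybuild-deps outputs would prevent the conflicting pins from reaching requirements-build.txt in the first place.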
.tekton/lightspeed-stack-push.yaml (1)
34-64: Filename inconsistency and unresolved dependency issues.

Two concerns:

- Filename mismatch: Line 48 references requirements.binary.txt (dot), but Containerfile line 47 uses requirements-binary.txt (hyphen). These must match for the build to succeed.
- Previous review flagged duplicate versions: Past review comments indicate requirements-build.txt (line 51) contains duplicate package versions that will cause build failures.

Verify both issues are resolved:

#!/bin/bash
# Verify filename consistency and check for duplicate versions
echo "=== Check which binary requirements file exists ==="
fd "requirements.*binary.*\.txt" -t f

echo -e "\n=== Check requirements-build.txt for duplicate versions ==="
if [ -f "requirements-build.txt" ]; then
  echo "Content of requirements-build.txt:"
  cat requirements-build.txt | grep -v '^#' | grep -v '^$' | sort | uniq -d | head -20
  echo -e "\nChecking for duplicate package names (ignoring versions):"
  cat requirements-build.txt | grep -v '^#' | grep -v '^$' | sed 's/[<>=].*//' | sort | uniq -c | awk '$1 > 1'
else
  echo "requirements-build.txt not found"
fi

rpms.lock.yaml (1)
550-556: Architectural parity: libxcrypt-compat missing for aarch64.
libxcrypt-compat (4.4.18-3.el9) is present only for x86_64, while libxcrypt-devel exists for both architectures. This creates an architectural inconsistency that may cause runtime compatibility issues on aarch64.

Verify whether libxcrypt-compat is needed for aarch64:
#!/bin/bash
# Check if libxcrypt-compat is available for aarch64 and if it's referenced elsewhere
echo "=== Search for libxcrypt-compat in repository ==="
rg -n "libxcrypt-compat" --type yaml

echo -e "\n=== Check if aarch64 variant exists in CDN ==="
cat << 'EOF'
Expected aarch64 package URL (if it exists):
https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
This needs to be verified by checking the UBI repository or testing the build on aarch64.
EOF

echo -e "\n=== Check Containerfile and requirements for libxcrypt usage ==="
rg -n "libxcrypt" Containerfile requirements*.txt 2>/dev/null || echo "No references found"
🧹 Nitpick comments (2)
requirements.aarch64.txt (1)
2087-2183: Pillow bump looks fine; please confirm faiss-cpu pinning strategy.

The pillow==12.1.0 pin with regenerated hashes looks consistent for a uv-compiled lock. Given earlier practice in this repo to pin transitive deps like faiss-cpu at the top level for better graph control, and that this lock now appears to rely on it only transitively, it would be good to confirm whether:
- faiss-cpu is still explicitly pinned somewhere appropriate (e.g., pyproject / another requirements file), or
Also worth double‑checking that the x86_64 requirements file has the same
pillow==12.1.0 pin to keep multi-arch behavior aligned.

Containerfile (1)
29-36: Consider cleanup and error handling for googletest build.

The googletest extraction and build logic is sound for hermetic builds. However, consider these optional improvements:
- The /tmp/googletest-release-1.12.1 directory is never cleaned up
- No error handling if cmake or make fails
- Consider using && chaining with set -e to fail fast on errors

🔎 Optional improvement with cleanup and error handling
 # Extract generic dependencies from artifacts.lock.yaml and install them
 RUN if [ -f /cachi2/cachi2.env ]; then \
+    set -e && \
     tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
     cd /tmp/googletest-release-1.12.1 && \
     cmake . && \
     make && \
-    make install; \
+    make install && \
+    cd / && \
+    rm -rf /tmp/googletest-release-1.12.1; \
     fi
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (7)
- pyproject.toml
- requirements.x86_64.txt
- .tekton/lightspeed-stack-pull-request.yaml
- rpms.in.yaml
- scripts/remove_torch_deps.sh
- Makefile
- requirements.binary.txt
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.aarch64.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: library mode / vertexai
🔇 Additional comments (8)
requirements.hermetic.txt (1)
1-9: Hermetic tool pinning looks consistent; keep aligned with build tooling.

List is clear and self-documenting; just ensure any future changes to build tooling (e.g., pybuild-deps / container image) update this file in lockstep so hermetic builds remain reproducible.
artifacts.lock.yaml (1)
1-7: Artifact lock format looks good; verify checksum/URL stay in sync with build.

YAML structure is fine for a single googletest artifact. Just ensure the download_url, checksum, and filename stay synchronized with whatever the Containerfile / Tekton prefetch actually fetches, so hermetic artifact verification doesn't silently drift.

Containerfile (1)
18-19: LGTM: Build tooling expansion aligns with hermetic build requirements.

Adding cmake and cargo to support building dependencies (particularly Rust-based Python packages via maturin) is appropriate for the hermetic build workflow.
.tekton/lightspeed-stack-push.yaml (1)
67-68: LGTM: hermeto-mode parameter properly declared and propagated.

The new hermeto-mode parameter with value 'permissive' is correctly declared and passed to the prefetch-dependencies task. This aligns with the PR objective to support different hermetic build modes.

Also applies to: 221-222
rpms.lock.yaml (4)
7-41: LGTM: Cargo and CMake additions support hermetic build requirements.

The addition of cargo (1.88.0) and cmake (3.26.5) packages for aarch64 aligns with the Containerfile changes and supports building Rust-based dependencies and C/C++ projects in the hermetic build environment.

70-76: LGTM: Kernel headers version bump.

Updated kernel-headers to 5.14.0-611.16.1.el9_7 for aarch64, which is a standard package version increment.

112-125: LGTM: LLVM packages support expanded build toolchain.

Added llvm-filesystem and llvm-libs (20.1.8) for aarch64, which supports the hermetic build toolchain expansion.

147-160: LGTM: Rust packages match cargo version.

Added rust and rust-std-static (1.88.0) for aarch64, matching the cargo 1.88.0 version. This completes the Rust toolchain for hermetic builds.
Actionable comments posted: 2
♻️ Duplicate comments (4)
requirements-build.txt (1)
7-329: CRITICAL: Duplicate package versions remain unresolved (previously reported).

This file still contains duplicate pinned versions of packages that will cause pip installation failures:

- cython: lines 19 and 21
- hatchling: lines 81 and 86
- meson-python: lines 138 and 140
- setuptools-scm: lines 213 and 215

This issue was previously reported and must be resolved before merging. The pybuild-deps compilation needs to deduplicate these conflicting versions.
Containerfile (1)
43-47: CRITICAL: Incorrect filename will cause build failure (previously reported).

Line 47 references requirements-binary.txt (with hyphen), but the correct filename in the repository is requirements.binary.txt (with dot). This mismatch will cause the pip install command to fail.

Note: This issue was previously reported and marked as "Addressed in commit e775f64", but the incorrect filename still appears in the current code.
🔎 Required fix
-    pip install --no-index --find-links ${PIP_FIND_LINKS} --no-binary :all: -r requirements.$(uname -m).txt && pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements-binary.txt -r requirements.torch.txt; \
+    pip install --no-index --find-links ${PIP_FIND_LINKS} --no-binary :all: -r requirements.$(uname -m).txt && pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements.binary.txt -r requirements.torch.txt; \

.tekton/lightspeed-stack-push.yaml (1)
50-51: Dependency on requirements-build.txt with unresolved critical issues (previously reported).

Line 51 references requirements-build.txt which contains duplicate package versions that will cause build failures. This was extensively documented in previous reviews and remains unresolved.

rpms.lock.yaml (1)
550-556: Architectural parity issue: libxcrypt-compat still missing for aarch64.

This is a repeat of a previous review comment. The libxcrypt-compat-4.4.18-3.el9 package remains x86_64-only (lines 550-556), while libxcrypt-devel exists for both architectures. For consistency and to avoid potential runtime compatibility issues, add the aarch64 variant to the aarch64 appstream section (after line 111).
🧹 Nitpick comments (1)
Containerfile (1)
30-36: Potential issues with the googletest build process.

The googletest extraction and build have several concerns:
- No error handling: The commands are chained without checking intermediate failures
- Build artifacts remain in /tmp: No cleanup after installation
- Build directory pollutes /tmp: The extracted source remains
- No verification: The downloaded archive isn't verified before extraction
🔎 Proposed improvements
-RUN if [ -f /cachi2/cachi2.env ]; then \
-    tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
-    cd /tmp/googletest-release-1.12.1 && \
-    cmake . && \
-    make && \
-    make install; \
-    fi
+RUN if [ -f /cachi2/cachi2.env ]; then \
+    set -e && \
+    tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
+    cd /tmp/googletest-release-1.12.1 && \
+    cmake . && \
+    make && \
+    make install && \
+    cd / && \
+    rm -rf /tmp/googletest-release-1.12.1; \
+    fi
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (7)
- Makefile
- scripts/remove_torch_deps.sh
- requirements.x86_64.txt
- pyproject.toml
- requirements.hermetic.txt
- requirements.binary.txt
- requirements.aarch64.txt
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: server mode / ci
🔇 Additional comments (9)
rpms.in.yaml (1)
1-1: LGTM: Build tooling additions are appropriate.

The addition of cmake and cargo to the package list aligns with the broader hermetic build changes in this PR. These tools are referenced in the Containerfile and are necessary for building dependencies like googletest and maturin-based packages.

Containerfile (1)

18-19: LGTM: Build tooling installation is appropriate.

Adding cmake and cargo is necessary to support the new build requirements, including googletest and maturin-based Python packages.

.tekton/lightspeed-stack-push.yaml (1)

67-68: LGTM: New hermeto-mode parameter is properly configured.

The addition of the hermeto-mode parameter with value permissive is correctly defined and propagated to the prefetch-dependencies task (lines 221-222). The parameter follows Tekton conventions and integrates cleanly with the pipeline structure.

artifacts.lock.yaml (1)
5-7: Consider updating googletest to a more recent version.

The lock file pins googletest 1.12.1 from 2022. Newer versions are available:
- v1.16.0 (released February 2025)
- v1.17.0 (released April 2025)
Evaluate whether updating to a recent stable version is appropriate for your project. If upgrading, regenerate the SHA256 checksum in the lock file.
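Regenerating and verifying the checksum are each a short pipeline. The sketch below creates a local stand-in for the tarball (so it runs offline) and a lock file whose field names mirror the artifacts.lock.yaml snippet reviewed above; the sed-based parsing is a convenience, not the Konflux tooling:

```shell
#!/bin/sh
# Sketch: record a sha256 checksum in an artifacts.lock.yaml-style file,
# then verify the artifact against it.
set -eu

printf 'demo tarball contents\n' > /tmp/googletest-demo.tar.gz

cat > /tmp/artifacts-demo.yaml <<EOF
artifacts:
  - filename: googletest-demo.tar.gz
    checksum: sha256:$(sha256sum /tmp/googletest-demo.tar.gz | awk '{print $1}')
EOF

recorded=$(sed -n 's/.*checksum: sha256:\(.*\)$/\1/p' /tmp/artifacts-demo.yaml)
actual=$(sha256sum /tmp/googletest-demo.tar.gz | awk '{print $1}')

if [ "$recorded" = "$actual" ]; then
  echo "checksum OK"
else
  echo "checksum mismatch" >&2
  exit 1
fi
```

For a real version bump, the same sha256sum invocation against the freshly downloaded release tarball produces the value to paste into the lock file.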
.tekton/lightspeed-stack-pull-request.yaml (2)
36-67: LGTM! Enhanced prefetch configuration structure.

The expansion from single-line JSON to a structured multi-entry array is well-formed and enables more granular dependency management across package types (rpm, pip, generic).

230-231: LGTM! Mode parameter correctly propagated.

The mode parameter is properly wired to the prefetch-dependencies task, passing the value from hermeto-mode. This is the correct pattern for parameter propagation in Tekton.

rpms.lock.yaml (3)
7-13: LGTM! Build tooling packages consistently added across architectures.

The new build tooling packages (cargo, cmake, rust, llvm-filesystem, llvm-libs) are correctly added to both aarch64 and x86_64 appstream sections with matching EVRs. This ensures consistent build capabilities across architectures.

Also applies to: 14-41, 112-125, 147-160, 459-465, 466-493, 564-577, 599-612

98-104: LGTM! Runtime libraries consistently added across architectures.

The new runtime libraries (libuv, libedit) are correctly added to both aarch64 and x86_64 with matching EVRs and appropriate repository placement (libuv in appstream, libedit in baseos).

Also applies to: 294-300, 543-549, 746-752

140-146: LGTM! Python and systemd packages consistently updated across architectures.

The updates to Python 3.9.25-2.el9_7 and systemd 252-55.el9_7.7, along with the addition of python-unversioned-command, are correctly synchronized across both aarch64 and x86_64 architectures with matching EVRs.
Also applies to: 385-412, 413-433, 592-598, 837-864, 865-885
Actionable comments posted: 2
♻️ Duplicate comments (7)
scripts/remove_torch_deps.sh (1)
50-76: Empty-line handling issue replicated in faiss-cpu removal logic.

The faiss-cpu removal block (lines 66-70) uses the same pattern as the torch removal that was flagged in previous reviews. The condition !/^    / matches empty lines, causing premature section exit if blank lines appear between faiss-cpu==<version> and its hash continuation lines.

Both the torch section (lines 38-42) and this faiss section need the fix to explicitly handle empty lines.
🔎 Proposed fix for both sections
 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next  # Skip this line
 }

+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }

Apply the same pattern to the faiss-cpu section (lines 61-70).
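The failure mode is easy to reproduce in isolation. This standalone sketch (demo input in /tmp, not the project's real lock file or script) runs the patched rules against input where a blank line splits the pin from its hash continuation line, and shows the section still being dropped cleanly:

```shell
#!/bin/sh
# Sketch: section-skip awk with an explicit empty-line rule, exercised
# against input containing a blank line inside the torch section.
set -eu

cat > /tmp/lock-demo.txt <<'EOF'
torch==2.5.1 \

    --hash=sha256:aaaa
requests==2.32.3
EOF

awk '
/^torch==/                   { in_section = 1; next }
in_section == 1 && /^    /   { next }  # hash continuation line
in_section == 1 && /^$/      { next }  # blank line inside the section
in_section == 1              { in_section = 0 }  # section over, fall through
                             { print }
' /tmp/lock-demo.txt > /tmp/lock-filtered.txt

cat /tmp/lock-filtered.txt
```

Without the /^$/ rule, the blank line would end the section early and the --hash line would leak into the output.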
requirements-build.txt (4)
19-31: Duplicate cython versions still present in autogenerated file.

Despite being flagged in previous reviews, the file still contains conflicting versions:

- cython==3.1.7 (line 19)
- cython==3.2.3 (line 21)

This indicates the pybuild-deps compilation process is not deduplicating when merging architecture-specific requirements.
81-122: Duplicate hatchling versions create unpredictable build behavior.

The file contains:

- hatchling==1.26.3 (line 81)
- hatchling==1.28.0 (line 86)

While pip will silently use the last version, this indicates a problem with the merge strategy that should be addressed in the generation script.
138-146: Duplicate meson-python versions.
- meson-python==0.16.0 (line 138)
- meson-python==0.18.0 (line 140)
213-230: Duplicate setuptools-scm versions.
- setuptools-scm==7.1.0 (line 213)
- setuptools-scm==9.2.2 (line 215)

The pybuild-deps merge process needs to be fixed to deduplicate by selecting one compatible version per package across architectures before this file can be reliably used.
.tekton/lightspeed-stack-pull-request.yaml (1)
70-71: Add hermeto-mode parameter definition to pipelineSpec.params.

The hermeto-mode parameter is defined in spec.params (lines 70-71) but missing from pipelineSpec.params (lines 80-157). This was flagged in a previous review and remains unaddressed. Tekton will fail to resolve $(params.hermeto-mode) at runtime.

🔎 Proposed fix
Add the following parameter definition to
pipelineSpec.params (after line 157, before the results section):

- name: hermeto-mode
  default: 'permissive'
  description: Hermetic mode configuration for prefetch-dependencies
  type: string
550-556: libxcrypt-compat still missing from aarch64 appstream.

As flagged in a previous review, libxcrypt-compat-4.4.18-3.el9 is present in x86_64 appstream (lines 550-556) but absent from the aarch64 appstream section. The aarch64 section only has libxcrypt-devel (lines 105-111). This architectural asymmetry may cause runtime compatibility issues on aarch64 builds.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
✅ Files skipped from review due to trivial changes (1)
- artifacts.lock.yaml
🚧 Files skipped from review as they are similar to previous changes (7)
- rpms.in.yaml
- requirements.binary.txt
- requirements.x86_64.txt
- Makefile
- Containerfile
- requirements.aarch64.txt
- pyproject.toml
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / vertexai
🔇 Additional comments (6)
scripts/remove_torch_deps.sh (1)
78-85: Good addition of diff verification step.

The two-pass approach with temporary files and the final diff verification improves visibility into what dependencies were removed.
.tekton/lightspeed-stack-push.yaml (2)
34-64: Expanded prefetch-input structure looks correct.

The multi-entry JSON array for prefetch-input properly structures rpm, pip, and generic dependency prefetching with appropriate configuration for binary packages.

221-222: Mode parameter wiring to prefetch-dependencies task is correct.

The mode parameter correctly references the hermeto-mode parameter value, assuming the pipelineSpec.params declaration is added.
Ensure the prefetch-dependencies-oci-ta task (bundle version 0.2) supports the mode parameter.

.tekton/lightspeed-stack-pull-request.yaml (1)
37-67: Prefetch-input structure mirrors push pipeline configuration.

The expanded multi-entry JSON array is consistent with the push pipeline, maintaining parity between PR and push workflows.
rpms.lock.yaml (2)
7-160: Comprehensive aarch64 package updates look good.

The additions of cargo, cmake suite, rust toolchain, llvm, and updated Python/system packages properly align aarch64 with the hermetic build requirements. Cross-arch parity is maintained for most packages.

459-612: x86_64 package updates are consistent with aarch64.

The matching package set (cargo, cmake, rust, llvm, Python) for x86_64 maintains architectural parity with the aarch64 updates.
# List the build tools required for hermetic builds.
# Manually edited, hashes are not required.
uv==0.9.21
pip==25.3
patchelf==0.17.2.4
cmake==4.2.0
ninja==1.13.0
puccinialin==0.1.8
platformdirs==4.5.1
🧩 Analysis chain
🌐 Web query:
uv Python package manager latest stable version 2025 2026
💡 Result:
The latest stable uv release (as of Jan 2, 2026) is 0.9.17 — released December 9, 2025. [1]
Sources:
- GitHub releases: astral-sh/uv (release 0.9.17, released 2025-12-09). [1]
🌐 Web query:
puccinialin Python package PyPI
💡 Result:
Yes — puccinialin is a real Python package on PyPI. It bootstraps a temporary Rust toolchain (rustup/cargo/rustc) for building Rust-based Python extensions and exposes a setup_rust() helper and a CLI. Latest release on PyPI is 0.1.8 (uploaded Nov 19, 2025). Install with:
pip install puccinialin
Sources: PyPI project page and package listing. [1][2]
References
[1] puccinialin — PyPI project page.
[2] puccinialin — AOSC package listing (references PyPI upstream).
🌐 Web query:
pip latest stable version January 2026
💡 Result:
The latest stable pip release as of January 2, 2026 is pip 25.3 (released Oct 25, 2025). [1][2]
Sources:
- PyPI pip release listing showing pip 25.3. [1]
- pip release cadence/documentation. [2]
🌐 Web query:
cmake latest stable version 2025 2026
💡 Result:
The latest stable CMake release is 4.2.1 (released December 9, 2025). [1][2]
Sources:
[1] CMake announcement for 4.2.0 and downloads.
[2] Kitware software releases listing (CMake 4.2.1 — Dec 9, 2025).
🌐 Web query:
ninja build latest stable version 2025 2026
💡 Result:
The latest stable Ninja (build system) release is v1.13.2 — released on Nov 20, 2025. [1]
Sources:
[1] ninja-build GitHub releases (v1.13.2).
🌐 Web query:
patchelf latest stable version 2025 2026
💡 Result:
The latest upstream stable release of patchelf is 0.18.0 (released Apr 23, 2023). [1][2][3]
Sources:
- GitHub releases (patchelf 0.18.0). [1]
- Debian/Ubuntu package records showing patchelf 0.18.0 in 2024–2025. [2][3]
🌐 Web query:
platformdirs Python package latest stable version 2025 2026
💡 Result:
The latest stable release is platformdirs 4.5.1 (uploaded to PyPI on Dec 5, 2025). [1][2]
Sources:
[1] PyPI project page (platformdirs 4.5.1 upload details).
[2] piwheels / release listing showing platformdirs-4.5.1.
Update patchelf, cmake, and ninja to their latest stable versions.
Three dependencies are outdated:
- `patchelf==0.17.2.4` → update to `0.18.0` (latest as of April 2023)
- `cmake==4.2.0` → update to `4.2.1` (released December 9, 2025)
- `ninja==1.13.0` → update to `1.13.2` (released November 20, 2025)

`puccinialin==0.1.8` is confirmed as a legitimate package and is already at the latest version. `pip==25.3` and `platformdirs==4.5.1` are also current as of January 2026.
Note: `uv==0.9.21` appears ahead of the tracked latest release (0.9.17). Verify this version is intentional or correct.
Actionable comments posted: 1
♻️ Duplicate comments (4)
rpms.lock.yaml (1)
550-556: libxcrypt-compat is only present for x86_64, missing from aarch64. This architectural asymmetry was flagged in a previous review. The `libxcrypt-compat-4.4.18-3.el9` package exists only in the x86_64 appstream section (lines 550-556), while the corresponding aarch64 section lacks this entry. For build parity across architectures, add the aarch64 variant.

requirements-build.txt (1)

19-31: Duplicate package versions remain in requirements-build.txt. The file still contains conflicting pinned versions despite previous review feedback:
- cython: 3.1.7 (line 19) and 3.2.3 (line 21)
- hatchling: 1.26.3 (line 81) and 1.28.0 (line 86)
- meson-python: 0.16.0 (line 138) and 0.18.0 (line 140)
- setuptools-scm: 7.1.0 (line 213) and 9.2.2 (line 215)

While pip will silently use the last version encountered, this indicates the pybuild-deps merge is not deduplicating properly. Resolve by selecting one version per package or using separate architecture-specific build requirement files.
Containerfile (1)
43-50: Filename mismatch: `requirements-binary.txt` should be `requirements.binary.txt`. Line 47 references `requirements-binary.txt` (with hyphen), but based on past review comments, the actual file in the repository is `requirements.binary.txt` (with dot). The COPY directive on line 27 uses the glob `requirements.*.txt`, which would match `requirements.binary.txt` but not `requirements-binary.txt`. This will cause the pip install to fail.

🔎 Proposed fix

    - uv pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements-binary.txt -r requirements.torch.txt; \
    + uv pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements.binary.txt -r requirements.torch.txt; \

.tekton/lightspeed-stack-push.yaml (1)
67-68: Missing `hermeto-mode` parameter definition in `pipelineSpec.params`. The `hermeto-mode` parameter is defined in `spec.params` (lines 67-68) and referenced as `$(params.hermeto-mode)` in the prefetch-dependencies task (lines 221-222), but it's missing from `pipelineSpec.params` (lines 77-150). Tekton requires all parameters referenced via `$(params.*)` to be declared in the pipeline spec.

🔎 Proposed fix

Add to `pipelineSpec.params` after line 150:

    - name: hermeto-mode
      default: 'permissive'
      description: Hermetic mode configuration for prefetch-dependencies
      type: string

Also applies to: 221-222
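Since the same missing-declaration gap shows up in both pipeline files, a lightweight pre-merge check might be worth adding. The sketch below is an illustration only: it scans the raw YAML text with regexes rather than a real Tekton parser, and it treats every `- name:` entry anywhere in the file as a declaration, so it can miss problems but should not invent them:

```python
import re

def undeclared_param_refs(pipeline_yaml):
    """Params referenced as $(params.X) but never declared via '- name: X'."""
    referenced = set(re.findall(r"\$\(params\.([A-Za-z0-9_-]+)\)", pipeline_yaml))
    declared = set(re.findall(r"-\s+name:\s+([A-Za-z0-9_-]+)", pipeline_yaml))
    return referenced - declared

doc = """
params:
  - name: git-url
tasks:
  - name: prefetch-dependencies
    params:
      - name: mode
        value: $(params.hermeto-mode)
      - name: input
        value: $(params.git-url)
"""
print(undeclared_param_refs(doc))  # {'hermeto-mode'}
```

Run against both .tekton files, a non-empty result would have flagged `hermeto-mode` before the PipelineRun ever executed.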
🧹 Nitpick comments (1)
Containerfile (1)
29-36: Consider cleaning up googletest build artifacts after installation. The googletest extraction and build leaves `/tmp/googletest-release-1.12.1` in the image. Since this is a multi-stage build and the builder stage is discarded, this is not critical, but cleaning up would reduce intermediate image size during builds.

🔎 Proposed improvement

    RUN if [ -f /cachi2/cachi2.env ]; then \
        tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
        cd /tmp/googletest-release-1.12.1 && \
        cmake . && \
        make && \
    -   make install; \
    +   make install && \
    +   rm -rf /tmp/googletest-release-1.12.1; \
        fi
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (6)
- artifacts.lock.yaml
- Makefile
- scripts/remove_torch_deps.sh
- requirements.binary.txt
- pyproject.toml
- requirements.x86_64.txt
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.aarch64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (7)
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: library mode / ci
🔇 Additional comments (7)
rpms.in.yaml (1)
1-1: Packages list expansion looks consistent with build tooling. Adding `cmake` and `cargo` alongside `gcc`/`jq`/`patch` matches the rest of the hermetic build changes and is reasonable for C/C++/Rust-heavy toolchains. No issues from this file alone.

requirements.aarch64.txt (1)
2087-2178: Pillow bump + faiss-cpu relocation: verify pins stay centralized
`pillow` → `12.1.0` with regenerated hashes looks fine, assuming this came from `uv pip compile` rather than manual edits, and that x86_64 requirements were updated in lockstep.
The AI summary says `faiss-cpu` was removed here and is now managed via `FAISS_VERSION` plus `requirements.binary.txt`/`requirements-build.txt`. Given the earlier decision in this repo to pin transitive deps like `faiss-cpu` explicitly at the top level, please just confirm that:
- `faiss-cpu` is still pinned in exactly one authoritative place (Makefile/requirements.binary/pyproject), and
- both arch-specific compiled requirement files are being regenerated from that single source of truth.
Based on learnings, this keeps the dependency graph predictable across hermetic builds.
requirements.hermetic.txt (1)
1-9: Clarify and verify the hermetic tooling version strategy. This file is the single manual source of truth for hermetic build tools, so the pins here matter more than in the auto-generated requirement files.
A few points to double-check rather than treat as bugs:
- `uv==0.9.21`: prior bot comments thought 0.9.17 was the latest tagged release; please verify 0.9.21 is a real, intended version (e.g., not a local/dev build) and that it's available wherever this runs.
- `patchelf==0.17.2.4`, `cmake==4.2.0`, `ninja==1.13.0`: these are slightly behind upstream suggestions from earlier reviews. That can be fine for hermetic builds if you intentionally track your base image/distro rather than "latest"; consider documenting that policy in a short comment so future PRs don't keep chasing version bumps without a reason.
- `puccinialin` and `platformdirs` pins look sane, but fall under the same policy question (track upstream stable vs. known-good combo).
- Decide on the policy (e.g., “match base image versions” vs. “follow upstream stable within N months”).
- Either bump the lagging tools to your chosen targets or add a brief comment explaining why these specific versions are pinned.
rpms.lock.yaml (1)
7-160: LGTM for aarch64 appstream package additions. The new packages (cargo, cmake, emacs-filesystem, libuv, llvm-filesystem, llvm-libs, python-unversioned-command, rust, rust-std-static) are properly structured with complete metadata including URLs, checksums, EVRs, and source RPM references. Cross-architecture parity with x86_64 is maintained.
Containerfile (1)
18-19: LGTM for adding cmake and cargo. The additions are well-justified: cmake is required for the googletest build (lines 31-35), and cargo is needed for maturin-based Rust Python packages as noted in the comment.
.tekton/lightspeed-stack-push.yaml (1)
34-64: LGTM for the expanded `prefetch-input` structure. The multi-entry JSON array properly configures:
- RPM prefetching with path "."
- pip dependencies with comprehensive requirements files and build files
- Binary package specifications for torch, faiss-cpu, and uv across both x86_64 and aarch64
- Generic dependencies support
The structure aligns with the hermetic build requirements.
.tekton/lightspeed-stack-pull-request.yaml (1)
37-67: LGTM for the expanded `prefetch-input` structure. The multi-entry JSON array is consistent with the push pipeline configuration and properly defines rpm, pip, and generic prefetch entries with appropriate requirements files and binary package specifications.
bd535d3 to c58afc8 (Compare)
Actionable comments posted: 1
♻️ Duplicate comments (4)
rpms.lock.yaml (1)
550-556: Duplicate: libxcrypt-compat still missing from aarch64 appstream. The architectural parity issue flagged in the previous review remains unaddressed. `libxcrypt-compat-4.4.18-3.el9` is present only in the x86_64 appstream section (lines 550–556), but the aarch64 appstream section still lacks the corresponding entry. While `libxcrypt-devel` exists for both architectures, the compat variant is x86_64-only, creating an inconsistency that may cause runtime compatibility issues across architectures.

.tekton/lightspeed-stack-pull-request.yaml (1)
70-71: Missing `hermeto-mode` declaration in `pipelineSpec.params` will cause pipeline failure. The parameter `hermeto-mode` is defined in `spec.params` (lines 70-71) and referenced as `$(params.hermeto-mode)` in the prefetch-dependencies task (lines 230-231), but it is still not declared in `pipelineSpec.params` (lines 80-157). Tekton requires parameters to be declared in the pipeline spec for `$(params.*)` references to resolve.

🔎 Proposed fix

Add to `pipelineSpec.params` after line 157 (before the `results` section):

    - name: hermeto-mode
      default: 'permissive'
      description: Hermetic mode configuration for prefetch-dependencies
      type: string

requirements-build.txt (1)
19-30: Duplicate package versions will cause unpredictable builds. The file contains conflicting pinned versions for the same packages:
- cython: 3.1.7 (line 19) and 3.2.3 (line 21)
- hatchling: 1.26.3 (line 81) and 1.28.0 (line 86)
- meson-python: 0.16.0 (line 138) and 0.18.0 (line 140)
- setuptools-scm: 7.1.0 (line 213) and 9.2.2 (line 215)

While pip will silently use the last occurrence, having duplicate entries indicates the pybuild-deps merge did not properly deduplicate. This makes builds non-deterministic if file ordering changes. Resolve by either:
- Using separate architecture-specific build requirement files, or
- Updating the pybuild-deps compilation to select one compatible version per package
.tekton/lightspeed-stack-push.yaml (1)
67-68: Missing `hermeto-mode` declaration in `pipelineSpec.params` will cause pipeline failure. The parameter `hermeto-mode` is defined in `spec.params` (lines 67-68) and referenced as `$(params.hermeto-mode)` in the prefetch-dependencies task (lines 221-222), but it is not declared in `pipelineSpec.params` (lines 77-150). This will cause Tekton to fail to resolve the parameter reference.

🔎 Proposed fix

Add to `pipelineSpec.params` after line 150 (before the `results` section):

    - name: hermeto-mode
      default: 'permissive'
      description: Hermetic mode configuration for prefetch-dependencies
      type: string
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (10)
- Containerfile
- artifacts.lock.yaml
- scripts/remove_torch_deps.sh
- requirements.hermetic.txt
- rpms.in.yaml
- requirements.aarch64.txt
- requirements.binary.txt
- pyproject.toml
- Makefile
- requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / vertexai
🔇 Additional comments (5)
rpms.lock.yaml (2)
7-160: Architectural parity maintained for new build tooling packages. The new packages (cargo, cmake set, emacs-filesystem, libuv, llvm-filesystem, llvm-libs, python-unversioned-command, rust, rust-std-static) have been added consistently across both aarch64 and x86_64 architectures with matching EVRs. This maintains cross-architecture consistency for the hermetic build tooling.
Also applies to: 459-612
168-454: Version updates are architecturally consistent. The updated packages (binutils, kernel-headers, systemd family, python3 family, vim-filesystem, libedit) reflect EL9.7-era revisions with matching EVRs across both aarch64 and x86_64 architectures. The URLs, checksums, and sizes are architecture-specific as expected.
Also applies to: 620-906
.tekton/lightspeed-stack-pull-request.yaml (1)
37-67: Prefetch-input structure is well-formed. The expanded multi-entry JSON array correctly defines rpm, pip (with binary configuration for multi-arch builds), and generic entries. The structure aligns with the build-platforms configuration.
requirements-build.txt (1)
257-259: Unsafe packages section is correctly structured. The `setuptools==80.9.0` pin in the unsafe section follows pip-compile conventions. This is expected and properly handled.

.tekton/lightspeed-stack-push.yaml (1)
34-64: Prefetch-input configuration is consistent with the pull-request pipeline. The expanded prefetch-input JSON structure matches the pull-request pipeline, ensuring consistent build behavior across PR and push workflows.
    - name: mode
      value: $(params.hermeto-mode)
🧩 Analysis chain
🏁 Script executed:
# Search for references to prefetch-dependencies-oci-ta in the codebase
rg "prefetch-dependencies-oci-ta" --type yaml
Repository: lightspeed-core/lightspeed-stack
Length of output: 58
🏁 Script executed:
# Look for any task definitions or bundle references
fd "task" . --type f | head -20
Repository: lightspeed-core/lightspeed-stack
Length of output: 58
🏁 Script executed:
# Search for documentation or references about this task
rg "prefetch-dependencies" . --type yaml -A 5 -B 5
Repository: lightspeed-core/lightspeed-stack
Length of output: 58
🌐 Web query:
Konflux prefetch-dependencies-oci-ta task parameters documentation
💡 Result:
Here are the documented parameters commonly used by the Konflux task prefetch-dependencies-oci-ta (example invocation shown in Konflux/Red Hat docs):
- ACTIVATION_KEY. [optional string] — activation key for private repos. [1]
- SOURCE_ARTIFACT. [OCI artifact reference] — input trusted-artifact (consumed). [1][2]
- caTrustConfigMapName / caTrustConfigMapKey. — ConfigMap name/key for CA bundle. [1]
- config-file-content. [string] — inline config content. [1]
- dev-package-managers. [bool/string] — allow developer package managers. [1]
- input. [JSON string] — packages/flags payload describing what to prefetch. [1]
- log-level. [string] — logging level (e.g. info). [1]
- ociStorage. [OCI image reference] — where to store the produced trusted-artifact (result). [1][2]
- ociArtifactExpiresAfter. [duration string] — expiry for stored OCI artifact. [1]
- sbom-type. [string] — SBOM format (e.g. spdx). [1]
References:
[1] Red Hat article showing a sample invocation of prefetch-dependencies (includes parameter names/values).
[2] Konflux documentation describing Trusted Artifacts and the -oci-ta task variants.
[3] konflux-ci build-definitions repository (task/pipeline definitions and TA variants).
🏁 Script executed:
# Check for any task definitions or references in the entire codebase
rg "mode" .tekton/lightspeed-stack-pull-request.yaml -B 3 -A 3
Repository: lightspeed-core/lightspeed-stack
Length of output: 679
🏁 Script executed:
# Look for the actual task being referenced in the pipeline
rg "taskRef\|taskSpec" .tekton/lightspeed-stack-pull-request.yaml -A 5
Repository: lightspeed-core/lightspeed-stack
Length of output: 58
🏁 Script executed:
# Get the full taskRef context around line 230-231
sed -n '220,240p' .tekton/lightspeed-stack-pull-request.yaml
Repository: lightspeed-core/lightspeed-stack
Length of output: 834
Remove the unsupported mode parameter from the prefetch-dependencies task.
The mode parameter (line 230) is not supported by prefetch-dependencies-oci-ta:0.2. According to Konflux task documentation, this task accepts: ACTIVATION_KEY, SOURCE_ARTIFACT, caTrustConfigMapName, caTrustConfigMapKey, config-file-content, dev-package-managers, input, log-level, ociStorage, ociArtifactExpiresAfter, and sbom-type. Passing an unsupported parameter will cause the pipeline to fail at runtime with an unknown parameter error.
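For reference, a corrected task invocation would pass only documented parameters. The fragment below is a sketch rather than this repository's actual pipeline: the surrounding values (`prefetch-input`, the clone-repository result, the ociStorage naming) are common Konflux conventions assumed for illustration, and whether hermeto's permissive mode can instead be requested through the `input` payload is a question for the task maintainers.

```yaml
- name: prefetch-dependencies
  params:
    # Only parameters documented for prefetch-dependencies-oci-ta:0.2;
    # the unsupported standalone `mode` entry is dropped.
    - name: input
      value: $(params.prefetch-input)
    - name: SOURCE_ARTIFACT
      value: $(tasks.clone-repository.results.SOURCE_ARTIFACT)
    - name: ociStorage
      value: $(params.output-image).prefetch
```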
🤖 Prompt for AI Agents
.tekton/lightspeed-stack-pull-request.yaml around lines 230-231: the
prefetch-dependencies task is passing an unsupported parameter "mode" (name:
mode / value: $(params.hermeto-mode)), which will cause the pipeline to fail;
remove the entire "name: mode / value: $(params.hermeto-mode)" mapping from the
task's params block so only supported parameters are passed.
Actionable comments posted: 2
♻️ Duplicate comments (3)
rpms.lock.yaml (1)
550-556: libxcrypt-compat still missing from aarch64 appstream. This architectural inconsistency was already flagged in a previous review. The `libxcrypt-compat-4.4.18-3.el9` package is present only in x86_64 appstream but absent from the aarch64 appstream section. Please add the corresponding aarch64 entry for parity.

requirements.hermetic.txt (1)
1-9: Outdated build tool versions flagged in previous review. As noted in prior feedback, `patchelf`, `cmake`, and `ninja` are behind their latest stable releases. The `uv==0.9.21` version also appears ahead of the tracked latest (0.9.17 as of December 2025). Consider verifying and updating these before merging.

Makefile (1)
117-119: The `--universal` flag is incompatible with faiss-cpu's platform-specific wheels. Line 118 uses `--universal` for faiss-cpu, but faiss-cpu only distributes platform-specific binary wheels (manylinux x86_64/aarch64). The `--universal` flag produces cross-platform resolution with environment markers, which conflicts with the platform-locked nature of these wheels. This was flagged in a prior review.
Consider removing `--universal` and generating separate platform-specific requirements, or verify the current output works correctly for both x86_64 and aarch64 builds.

🔎 Proposed fix (remove --universal)

    - echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal
    + echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps

Or generate per-platform files:

    echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.x86_64.txt --generate-hashes --python-version 3.12 --no-deps --python-platform x86_64-unknown-linux-gnu
    echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.aarch64.txt --generate-hashes --python-version 3.12 --no-deps --python-platform aarch64-unknown-linux-gnu
🧹 Nitpick comments (1)
Containerfile (1)
29-36: Add error handling for googletest build steps. If `tar`, `cmake`, or `make` fails, the build continues silently. Consider adding `set -e` or explicit exit-on-error checks to fail fast on extraction or compilation errors.

🔎 Proposed fix

    RUN if [ -f /cachi2/cachi2.env ]; then \
    +   set -e && \
        tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
        cd /tmp/googletest-release-1.12.1 && \
        cmake . && \
        make && \
        make install; \
        fi
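Background on why the exit-status handling inside a RUN list matters: with `;`, a failure earlier in the list is swallowed and the layer's status is that of the last command, while `&&` (or `set -e`) propagates the failure so the build stops. A minimal demonstration runnable in any POSIX shell:

```shell
# With ';' the inner failure is swallowed; the outer shell sees success.
sh -c 'false ; echo cleanup'; echo "exit=$?"      # prints cleanup, then exit=0

# With '&&' the failure short-circuits and propagates as the exit status.
if sh -c 'false && echo cleanup'; then status=0; else status=$?; fi
echo "exit=$status"                               # prints exit=1, no cleanup
```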
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (7)
- scripts/remove_torch_deps.sh
- artifacts.lock.yaml
- rpms.in.yaml
- requirements.aarch64.txt
- requirements.x86_64.txt
- pyproject.toml
- requirements-build.txt
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
- Makefile
- requirements.binary.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: library mode / ci
🔇 Additional comments (14)
rpms.lock.yaml (8)
14-41: CMake toolchain additions are consistent. CMake 3.26.5-2.el9 and related packages (cmake-data, cmake-filesystem, cmake-rpm-macros) are added with proper cross-architecture parity. The noarch packages have identical checksums across both architecture sections, which is correct.
Also applies to: 466-493
70-76: Kernel headers updated consistently across architectures. The kernel-headers package is updated to 5.14.0-611.16.1.el9_7 with matching EVR and sourcerpm references for both aarch64 and x86_64.
Also applies to: 529-535
112-125: LLVM additions are consistent and support the Rust toolchain. LLVM filesystem and libs (20.1.8-3.el9) are added consistently for both architectures. These are likely dependencies for the Rust 1.88.0 toolchain being added.
Also applies to: 564-577
413-433: systemd family updates are consistent. The systemd packages (systemd, systemd-pam, systemd-rpm-macros) are updated to 252-55.el9_7.7 with proper cross-architecture consistency. The noarch systemd-rpm-macros has identical checksums across both sections.
Also applies to: 865-885
385-412: Python package updates are consistent. Python 3.9.25-2.el9_7 and related packages are updated consistently across both architectures. The noarch wheel packages (pip-wheel, setuptools-wheel) have identical checksums, and the python-unversioned-command noarch package is also consistent.
Also applies to: 837-864
168-181: binutils updates are consistent. The binutils and binutils-gold packages are updated to 2.35.2-67.el9_7.1 with matching EVR and sourcerpm references for both architectures.
Also applies to: 620-633
49-55: Supporting package additions are consistent. The following packages are added/updated with proper cross-architecture parity:
- emacs-filesystem (noarch) - identical checksums
- libuv (cmake dependency)
- libedit (LLVM/clang dependency)
- vim-filesystem (noarch) - identical checksums
Also applies to: 98-104, 294-300, 448-454, 501-507, 543-549, 746-752, 900-906
7-13: Rust toolchain additions maintain proper cross-architecture parity. Cargo 1.88.0 and Rust 1.88.0 with matching EVR and sourcerpm references across aarch64 and x86_64 are consistent. The packages are available in UBI 9 repositories as confirmed by the cdn-ubi.redhat.com URLs.
Also applies to: 147-160
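The cross-architecture checksum parity that these comments verify by eye can be checked mechanically. The sketch below assumes a simplified, already-parsed structure (a list of per-arch dicts with `url`/`checksum` entries) rather than the real rpms.lock.yaml loader, so treat it as an illustration of the idea, not a drop-in tool:

```python
from collections import defaultdict

def noarch_checksum_mismatches(arches):
    """Return .noarch.rpm filenames whose checksum differs between arch sections."""
    seen = defaultdict(set)  # filename -> set of checksums observed
    for arch in arches:
        for pkg in arch["packages"]:
            filename = pkg["url"].rsplit("/", 1)[-1]
            if filename.endswith(".noarch.rpm"):
                seen[filename].add(pkg["checksum"])
    return sorted(name for name, sums in seen.items() if len(sums) > 1)

data = [
    {"arch": "aarch64", "packages": [
        {"url": "repo/cmake-data-3.26.5-2.el9.noarch.rpm", "checksum": "sha256:aaa"},
    ]},
    {"arch": "x86_64", "packages": [
        {"url": "repo/cmake-data-3.26.5-2.el9.noarch.rpm", "checksum": "sha256:bbb"},
    ]},
]
print(noarch_checksum_mismatches(data))  # ['cmake-data-3.26.5-2.el9.noarch.rpm']
```

An empty result confirms the "identical checksums for noarch packages" property across the lock file's arch sections.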
Containerfile (2)
43-52: LGTM! The two-phase install strategy correctly separates source-built architecture-specific packages (`--no-binary :all:`) from prebuilt binary wheels (`--no-build`). The filename mismatch from prior review has been addressed.

18-19: LGTM! Adding `cmake` (for googletest) and `cargo` (for maturin/Rust-based extensions) is appropriate for the hermetic build requirements.

.tekton/lightspeed-stack-pull-request.yaml (1)
37-67: LGTM! The multi-entry prefetch-input JSON structure correctly declares rpm, pip (with binary configuration for torch, faiss-cpu, uv), and generic entries. The requirements files align with the new hermetic build artifacts.
.tekton/lightspeed-stack-push.yaml (1)
34-64: LGTM! The prefetch-input structure is consistent with the pull-request pipeline configuration.
Makefile (1)
9-10: LGTM! Adding `FAISS_VERSION` as a variable provides consistent version management aligned with the project's practice of pinning transitive dependencies like faiss-cpu at the top level.

requirements.binary.txt (1)
1-19: Autogenerated file with verified platform-specific wheel hashes. The file correctly pins `faiss-cpu==1.13.0` with hashes that match PyPI and cover all platform-specific wheels for the required architectures: x86_64 (manylinux, musllinux, macOS, Windows) and aarch64 (manylinux, musllinux, macOS, Windows). All 16 hashes are valid and properly generated.
    - name: hermeto-mode
      value: 'permissive'
🧩 Analysis chain
🏁 Script executed:
# First, let's examine the Tekton file to confirm the current state
fd lightspeed-stack-pull-request.yaml --type f
Repository: lightspeed-core/lightspeed-stack
Length of output: 58
🏁 Script executed:
# Read the file to see lines around 70-71, 230-231, and the task definition at line 239
if [ -f .tekton/lightspeed-stack-pull-request.yaml ]; then
echo "=== Lines 65-75 (hermeto-mode param definition) ===" && \
sed -n '65,75p' .tekton/lightspeed-stack-pull-request.yaml && \
echo "" && \
echo "=== Lines 80-160 (pipelineSpec.params declaration) ===" && \
sed -n '80,160p' .tekton/lightspeed-stack-pull-request.yaml && \
echo "" && \
echo "=== Lines 225-245 (task invocation with mode param) ===" && \
sed -n '225,245p' .tekton/lightspeed-stack-pull-request.yaml
fiRepository: lightspeed-core/lightspeed-stack
Length of output: 3863
🌐 Web query:
prefetch-dependencies-oci-ta Konflux task documentation mode parameter
💡 Result:
Do you mean the Tekton task prefetch-dependencies-oci-ta used by Konflux? If so, can you paste the exact snippet (or link) where you saw a "mode" parameter — I can't find a documented "mode" parameter in the Konflux task docs or task bundle manifests. References I checked: Konflux Trusted Artifacts ADR and the prefetch-dependencies task usage/docs. [1][2]
If you share the YAML or where you saw "mode" I’ll look it up and explain what it does.
Sources: Konflux Trusted Artifacts ADR; Using Trusted Artifacts / build pipeline docs. [1][2]
🏁 Script executed:
# Get full pipelineSpec.params section to see all parameter declarations
if [ -f .tekton/lightspeed-stack-pull-request.yaml ]; then
echo "=== Full pipelineSpec.params section ===" && \
sed -n '/^ params:$/,/^ results:$/p' .tekton/lightspeed-stack-pull-request.yaml | head -100
fi
Repository: lightspeed-core/lightspeed-stack
Length of output: 2744
🌐 Web query:
prefetch-dependencies-oci-ta:0.2 task parameters Konflux Tekton catalog
💡 Result:
Here are the known Task parameters (observed for the quay.io/konflux-ci/tekton-catalog/task-prefetch-dependencies-oci-ta:0.2 bundle) and related notes:
- ACTIVATION_KEY. [used for RHEL subscription activation]. [1][2]
- SOURCE_ARTIFACT. [OCI trusted-artifact location to consume]. [1][2]
- caTrustConfigMapKey. [name of CA file key]. [1][2]
- caTrustConfigMapName. [ConfigMap name that holds CA bundle]. [1][2]
- config-file-content. [inline config content]. [1]
- dev-package-managers. [boolean flag enabling dev package managers]. [1]
- input. [JSON describing packages/flags to prefetch]. [1]
- log-level. [e.g. info]. [1]
- ociArtifactExpiresAfter. [expiry for produced OCI artifact]. [1]
- ociStorage. [OCI location where prefetch result is pushed]. [1]
- sbom-type. [SBOM format, e.g. spdx]. [1]
Notes:
- The Konflux docs explain this task is an “-oci-ta” (Trusted Artifacts via OCI) variant and that parameters/results using the _ARTIFACT suffix (e.g., SOURCE_ARTIFACT) are used to pass trusted artifacts between tasks. See ADR and docs for Trusted Artifacts and Konflux task bundles. [2][3]
Sources:
[1] Example invocation and parameter list for prefetch-dependencies (includes the parameter names/values).
[2] Konflux ADR and docs describing Trusted Artifacts and the -oci-ta task variants.
[3] Konflux documentation on building/packaging tasks and using bundles (context on task bundles and parameter usage).
If you want, I can fetch the task YAML from the exact bundle (quay image) and show the full parameter definitions and defaults.
🏁 Script executed:
# Check if there are any other references to hermeto-mode or mode in the file
if [ -f .tekton/lightspeed-stack-pull-request.yaml ]; then
echo "=== All references to 'hermeto' or 'mode' ===" && \
rg -n '(hermeto|mode)' .tekton/lightspeed-stack-pull-request.yaml
fi

Repository: lightspeed-core/lightspeed-stack
Length of output: 302
Add hermeto-mode to pipelineSpec.params and remove unsupported mode parameter from task invocation.
The hermeto-mode parameter is defined in spec.params (line 70) but is missing from the formal parameter declaration in pipelineSpec.params. Additionally, the prefetch-dependencies-oci-ta:0.2 task does not support a mode parameter. The task's documented parameters are: ACTIVATION_KEY, SOURCE_ARTIFACT, caTrustConfigMapKey, caTrustConfigMapName, config-file-content, dev-package-managers, input, log-level, ociArtifactExpiresAfter, ociStorage, and sbom-type. Remove the mode parameter from lines 230-231 or clarify the intended purpose if this value should be passed differently.
🤖 Prompt for AI Agents
In .tekton/lightspeed-stack-pull-request.yaml around lines 70 and 230-231, the
pipeline references a hermeto-mode value but the parameter is not declared in
pipelineSpec.params and the task prefetch-dependencies-oci-ta:0.2 does not
accept a mode parameter; add a new pipelineSpec.params entry named hermeto-mode
(with appropriate type and default) at/near line 70 so the pipeline formally
declares it, and remove the unsupported mode parameter from the task invocation
at lines 230-231 (or, if you intended to pass hermeto-mode into a supported task
param, map it to one of the documented task parameters instead and update the
invocation accordingly).
Actionable comments posted: 1
♻️ Duplicate comments (6)
rpms.lock.yaml (1)
550-556: Architectural inconsistency: libxcrypt-compat missing for aarch64.

This issue was previously flagged and remains unresolved. The `libxcrypt-compat-4.4.18-3.el9` package is added only for x86_64 but is missing the corresponding aarch64 entry. Since `libxcrypt-devel` exists in both architectures (lines 105-111 for aarch64, lines 557-563 for x86_64), the compat variant should also be present in both for consistency.

.tekton/lightspeed-stack-push.yaml (2)
67-68: Critical: Add `hermeto-mode` to `pipelineSpec.params`.

The `hermeto-mode` parameter is defined in `spec.params` but missing from `pipelineSpec.params` (lines 77-150). Tekton requires all parameters referenced via `$(params.*)` to be declared in the pipeline spec. The reference at line 222 will fail to resolve, causing the pipeline to fail at runtime.

🔎 Proposed fix

Add to `pipelineSpec.params` after line 150 (before the `results` section):

```diff
+    - name: hermeto-mode
+      default: 'permissive'
+      description: Hermetic mode configuration for prefetch-dependencies
+      type: string
   results:
```
221-222: Critical: Remove unsupported `mode` parameter.

The `prefetch-dependencies-oci-ta:0.2` task does not support a `mode` parameter. According to Konflux documentation, the supported parameters are: ACTIVATION_KEY, SOURCE_ARTIFACT, caTrustConfigMapName, caTrustConfigMapKey, config-file-content, dev-package-managers, input, log-level, ociStorage, ociArtifactExpiresAfter, and sbom-type. Passing an unsupported parameter will cause the pipeline to fail at runtime with an "unknown parameter" error.

🔎 Proposed fix

Remove the unsupported parameter:

```diff
   - name: ociArtifactExpiresAfter
     value: $(params.image-expires-after)
-  - name: mode
-    value: $(params.hermeto-mode)
   runAfter:
```

.tekton/lightspeed-stack-pull-request.yaml (2)
70-71: Critical: Add `hermeto-mode` to `pipelineSpec.params`.

The `hermeto-mode` parameter is defined in `spec.params` but missing from `pipelineSpec.params` (lines 80-157). Without this declaration, the reference to `$(params.hermeto-mode)` at line 231 will fail to resolve, causing the pipeline to fail at runtime.

🔎 Proposed fix

Add to `pipelineSpec.params` after line 157 (before the `results` section):

```diff
+    - name: hermeto-mode
+      default: 'permissive'
+      description: Hermetic mode configuration for prefetch-dependencies
+      type: string
   results:
```
230-231: Critical: Remove unsupported `mode` parameter.

The `prefetch-dependencies-oci-ta:0.2` task does not support a `mode` parameter. Passing this unsupported parameter will cause the pipeline to fail at runtime with an "unknown parameter" error.

🔎 Proposed fix

Remove the unsupported parameter:

```diff
   - name: ociArtifactExpiresAfter
     value: $(params.image-expires-after)
-  - name: mode
-    value: $(params.hermeto-mode)
   runAfter:
```

scripts/remove_torch_deps.sh (1)
39-59: Fix awk section exit on blank lines & de-duplicate section logic.

All three awk passes (torch, faiss-cpu, tiktoken) still treat any non-indented line as "end of section", which includes blank lines. If a future `uv pip compile` ever emits a blank line between a package header and its hash block, you'll prematurely end the section and leave orphaned hash lines. This is the same issue previously flagged for torch/faiss; it's now replicated for tiktoken as well.

You can fix this once and avoid duplication by (a) ignoring empty lines while inside a section and (b) only exiting when the line is non-empty and not indented. Example for one block:

```diff
 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next  # Skip this line
 }

+# If we are in torch section and line is empty, stay in section but skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip blank line inside section
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }
```

Apply the same pattern to `in_faiss_section` and `in_tiktoken_section`, or consider parameterising this logic in a single awk pass keyed by the three package names to avoid three nearly-identical blocks.

Also applies to: 62-88, 90-116
🧹 Nitpick comments (2)
scripts/remove_torch_deps.sh (2)
119-123: Three-stage temp pipeline and cleanup look fine.

The staged `mv`/`rm` flow is correct and keeps a backup plus intermediate artifacts minimal. Once the awk logic is fixed, this replacement and cleanup section is good as-is.
21-32: Use more robust pattern to capture full package versions, including suffixes.

The current grep pattern `[0-9\.]*` drops critical version information. For example, `torch==2.7.1+cpu` is truncated to `torch==2.7.1`, losing the `+cpu` suffix that specifies the CPU-only build. Use `[^[:space:]]*` instead to capture the full version string.

Additionally, add explicit validation to catch silently-failing empty versions:

Suggested fix

```diff
-TORCH_VERSION=$(grep -o 'torch==[0-9\.]*' "$INPUT_FILE")
-FAISS_VERSION=$(grep -o 'faiss-cpu==[0-9\.]*' "$INPUT_FILE")
-TIKTOKEN_VERSION=$(grep -o 'tiktoken==[0-9\.]*' "$INPUT_FILE")
+TORCH_VERSION=$(grep -o 'torch==[^[:space:]]*' "$INPUT_FILE" || true)
+FAISS_VERSION=$(grep -o 'faiss-cpu==[^[:space:]]*' "$INPUT_FILE" || true)
+TIKTOKEN_VERSION=$(grep -o 'tiktoken==[^[:space:]]*' "$INPUT_FILE" || true)
+
+if [[ -z "$TORCH_VERSION" || -z "$FAISS_VERSION" || -z "$TIKTOKEN_VERSION" ]]; then
+  echo "Error: torch/faiss-cpu/tiktoken not found in $INPUT_FILE; aborting."
+  exit 1
+fi
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (7)
- artifacts.lock.yaml
- pyproject.toml
- requirements.hermetic.txt
- Makefile
- requirements.binary.txt
- requirements.aarch64.txt
- requirements-build.txt
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / ci
🔇 Additional comments (8)
rpms.in.yaml (1)
1-1: LGTM! Build tooling packages added correctly.

The addition of `cmake` and `cargo` to the package list aligns with the PR's introduction of new build dependencies. The alphabetical ordering is maintained, which aids readability and maintainability.

rpms.lock.yaml (2)
7-41: LGTM! Package additions show good architectural parity.

The additions of build tooling packages (cargo, cmake family, rust family) and supporting dependencies (emacs-filesystem, libedit, libuv, llvm family, python-unversioned-command, vim-filesystem) are present in both aarch64 and x86_64 architectures with consistent EVRs. The noarch packages (cmake-data, cmake-rpm-macros, emacs-filesystem, python-unversioned-command, vim-filesystem) correctly share identical checksums across architectures.
Also applies to: 49-55, 98-125, 140-160, 294-300, 385-412, 448-454, 459-493, 501-507, 543-549, 564-577, 592-612, 746-752, 837-864, 900-906
70-76: Package version updates look correct.

The EVR updates for kernel-headers (5.14.0-611.16.1.el9_7), binutils family (2.35.2-67.el9_7.1), systemd family (252-55.el9_7.7), and python3 family (3.9.25-2.el9_7) are applied consistently across both architectures with updated checksums and source RPM references.
Also applies to: 168-181, 413-433, 529-535, 620-633, 865-885
.tekton/lightspeed-stack-push.yaml (1)
34-64: LGTM! Well-structured prefetch configuration.

The expanded multi-entry JSON array provides clear separation between rpm, pip, and generic dependencies. The pip entry's binary configuration for Python 3.12 with torch, faiss-cpu, and uv packages aligns well with the hermetic build objectives.
.tekton/lightspeed-stack-pull-request.yaml (1)
37-67: LGTM! Consistent prefetch configuration.

The multi-entry JSON structure matches the push pipeline and provides proper separation of dependency types. The binary configuration is appropriate for the hermetic build requirements.
Containerfile (2)
18-19: LGTM: Added build tools for Rust and C++ dependencies.

The addition of `cmake` (for the googletest build) and `cargo` (for maturin-based Rust extensions) is appropriate for the hermetic build requirements.
43-49: > Likely an incorrect or invalid review comment.requirements.x86_64.txt (1)
2087-2178: Pillow bump block looks consistent with uv-generated hashes.

The `pillow==12.1.0` entry and its hash set are structurally correct and match the surrounding uv-generated style; no issues from this change itself.
```dockerfile
# Extract generic dependencies from artifacts.lock.yaml and install them
RUN if [ -f /cachi2/cachi2.env ]; then \
    tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
    cd /tmp/googletest-release-1.12.1 && \
    cmake . && \
    make && \
    make install; \
    fi
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
echo "=== Searching for requirements files ==="
find . -type f -name "requirements*.txt" -o -name "pyproject.toml" -o -name "setup.py" -o -name "setup.cfg" 2>/dev/null | head -20
echo -e "\n=== Checking requirements files for googletest/gtest references ==="
fd "requirements.*\.txt$" -t f 2>/dev/null | while read f; do
echo "File: $f"
cat "$f" | head -50
done
echo -e "\n=== Checking pyproject.toml for googletest ==="
if [ -f "pyproject.toml" ]; then
echo "Found pyproject.toml:"
cat pyproject.toml
fi
echo -e "\n=== Checking setup.py for googletest ==="
if [ -f "setup.py" ]; then
echo "Found setup.py:"
cat setup.py | head -100
fi
echo -e "\n=== Searching entire codebase for 'gtest' or 'googletest' ==="
rg -i "gtest|googletest" --type-list 2>/dev/null | head -5
rg -i "gtest|googletest" -g "*.txt" -g "*.py" -g "*.toml" -g "*.cfg" 2>/dev/null || echo "No matches found"
echo -e "\n=== Checking Containerfile context ==="
cat Containerfile 2>/dev/null || echo "Containerfile not found"

Repository: lightspeed-core/lightspeed-stack
Length of output: 25105
🏁 Script executed:
# Verify the Containerfile final image to confirm googletest is not propagated
grep -A 20 "^FROM registry.access.redhat.com/ubi9/python-312-minimal" Containerfile | head -25

Repository: lightspeed-core/lightspeed-stack
Length of output: 822
Remove the googletest build block—it's not required by any dependencies and won't be included in the final image.
Googletest is not referenced by any Python package in the project and is never used during the build. Additionally, since the final image only copies /app-root from the builder stage, the system-installed googletest is not propagated to the application container anyway. Remove lines 29-36 entirely.
🤖 Prompt for AI Agents
In Containerfile around lines 29 to 36, remove the entire googletest build block
(the RUN conditional that extracts, builds, and installs googletest) because
googletest is unused by any project dependencies and system-installed binaries
from the builder stage are not copied into the final image; delete those lines
so the image build doesn't perform unnecessary work or install unused system
packages.
Signed-off-by: Haoyu Sun <[email protected]>
Actionable comments posted: 2
Fix all issues with AI Agents 🤖
In @.tekton/lightspeed-stack-push.yaml:
- Around line 221-222: Remove the unsupported "mode" parameter from the
prefetch-dependencies task call (the parameter named mode with value
$(params.hermeto-mode)) because the task prefetch-dependencies-oci-ta:0.2 does
not accept it; delete those two lines and, if you still need to pass
hermeto-mode, encode $(params.hermeto-mode) into a supported parameter such as
input (as part of the JSON payload) or config-file-content so the task receives
the value via one of the documented parameters (e.g., input or
config-file-content).
In @Containerfile:
- Around line 18-19: The Dockerfile RUN line installs both cargo and cmake but
cargo doesn't appear referenced in pyproject.toml, requirements.hermetic.txt, or
anywhere mentioning maturin/Rust; confirm whether Rust toolchain/maturin is
required for any build step (search for "maturin", "rust", "cargo", or native
Rust extensions) and if not, remove "cargo" from the RUN instruction to shrink
the builder image; if cargo is required, keep it and add a brief comment above
the RUN line documenting why cargo is needed and update project docs (or add a
reference in requirements.hermetic.txt) so future reviewers can see the
dependency rationale.
♻️ Duplicate comments (6)
scripts/remove_torch_deps.sh (1)
36-61: Blank lines still terminate removal sections for all four packages (torch/faiss-cpu/tiktoken/jiter).

The awk logic still treats any non-indented line as "end of section". That includes blank lines, so an empty line between `torch==...`/`faiss-cpu==...`/`tiktoken==...`/`jiter==...` and their hash lines will prematurely end the section and leave orphaned hash lines in the output. This is the same issue previously flagged for torch/faiss and it's now replicated for tiktoken/jiter as well.

You can fix this once for all four blocks by explicitly ignoring empty lines while in a section:

Suggested awk fix for all four sections

```diff
 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next  # Skip this line
 }

+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }
```

Apply the same pattern to `in_faiss_section`, `in_tiktoken_section`, and `in_jiter_section`.

Also applies to: 64-91, 93-120, 122-149
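A minimal, self-contained sketch of the blank-line-tolerant section removal on synthetic input (the package names and hash line are invented for the demo, not taken from the real requirements file):

```shell
# Tiny requirements-style input: a torch header, a blank line inside the
# section, an indented hash line, then the next top-level package.
input='torch==2.7.1

    --hash=sha256:aaa
numpy==2.0.0'

out=$(printf '%s\n' "$input" | awk '
/^torch==/                  { in_sec = 1; next }  # enter section at the header
in_sec && /^    /           { next }              # drop indented hash lines
in_sec && /^$/              { next }              # blank line stays inside the section
in_sec && !/^    / && !/^$/ { in_sec = 0 }        # first real top-level line ends it
{ print }
')

echo "$out"   # numpy==2.0.0
```

With the `/^$/` rule in place, the blank line no longer ends the section, so the hash line after it is removed along with the header instead of surviving as an orphan.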
Containerfile (1)
29-36: Reconsider the googletest build block—it may be unnecessary.

The googletest build installs to system locations in the builder stage, but the final image only copies `/app-root` (line 76), so the installed googletest binaries won't be present in the final container. Additionally, googletest is a C++ testing framework that doesn't appear to be required by any Python dependencies in this project.

If googletest is genuinely needed (e.g., for native extension tests or build-time validation), please document the requirement. Otherwise, consider removing lines 29-36 to reduce build complexity and time.
🔎 Alternative: If googletest is needed only for build-time tests
If you need googletest for building/testing native extensions but don't need it in the final image, the current approach is acceptable. However, add a comment explaining why:
```diff
 # Extract generic dependencies from artifacts.lock.yaml and install them
+# googletest is used for building/testing native C++ extensions during the build
 RUN if [ -f /cachi2/cachi2.env ]; then \
```

.tekton/lightspeed-stack-push.yaml (1)
67-68: Critical: `hermeto-mode` must be declared in `pipelineSpec.params` to resolve properly.

The `hermeto-mode` parameter is defined in `spec.params` (lines 67-68) and referenced at line 222 as `$(params.hermeto-mode)`, but it is not declared in `pipelineSpec.params` (lines 77-153). Tekton requires parameters to be declared in the pipeline spec for `$(params.*)` references to work correctly. Without this declaration, the pipeline will fail at runtime with an undefined parameter error.

🔎 Proposed fix

Add the following after line 153 (after the `build-platforms` parameter, before the `results` section):

```diff
+    - name: hermeto-mode
+      default: 'permissive'
+      description: Hermetic mode configuration for prefetch-dependencies
+      type: string
   results:
```

.tekton/lightspeed-stack-pull-request.yaml (2)
70-71: Critical: `hermeto-mode` must be declared in `pipelineSpec.params`.

The `hermeto-mode` parameter is defined in `spec.params` (lines 70-71) and referenced at line 231 as `$(params.hermeto-mode)`, but it is not declared in `pipelineSpec.params` (lines 80-157). This will cause the pipeline to fail at runtime when Tekton attempts to resolve the parameter reference.

🔎 Proposed fix

Add the following after line 157 (after the `enable-cache-proxy` parameter, before the `results` section):

```diff
+    - name: hermeto-mode
+      default: 'permissive'
+      description: Hermetic mode configuration for prefetch-dependencies
+      type: string
   results:
```
230-231: Critical: Remove unsupported `mode` parameter from prefetch-dependencies task.

The `mode` parameter is not supported by `prefetch-dependencies-oci-ta:0.2`. The task accepts: `ACTIVATION_KEY`, `SOURCE_ARTIFACT`, `caTrustConfigMapName`, `caTrustConfigMapKey`, `config-file-content`, `dev-package-managers`, `input`, `log-level`, `ociStorage`, `ociArtifactExpiresAfter`, and `sbom-type`. This unsupported parameter will cause a runtime failure.

🔎 Proposed fix

Remove lines 230-231:

```diff
   - name: ociArtifactExpiresAfter
     value: $(params.image-expires-after)
-  - name: mode
-    value: $(params.hermeto-mode)
   runAfter:
```

rpms.lock.yaml (1)
105-111: Major: Add `libxcrypt-compat` to aarch64 appstream for architectural parity.

The x86_64 appstream section includes `libxcrypt-compat-4.4.18-3.el9` (lines 550-556), but the aarch64 appstream section only has `libxcrypt-devel` without the corresponding compat package. This architectural inconsistency can cause runtime compatibility issues on aarch64 systems that expect the legacy crypt API provided by libxcrypt-compat.

🔎 Proposed fix

Add the following entry after line 111 in the aarch64 appstream section:

```yaml
  - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
    repoid: ubi-9-for-aarch64-appstream-rpms
    size: <size for aarch64>
    checksum: sha256:<checksum for aarch64>
    name: libxcrypt-compat
    evr: 4.4.18-3.el9
    sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
```

Regenerate the lockfile to populate the correct size and checksum values for the aarch64 package.
🧹 Nitpick comments (2)
scripts/remove_torch_deps.sh (2)
21-35: Make version extraction & compile steps more robust.

The `grep -o '...==[0-9\.]*'` approach is brittle:

- It silently fails if the line is missing (empty version strings then get piped into `uv pip compile`).
- It truncates any non-numeric version suffixes (e.g. pre/post releases).

Consider tightening this:

- Fail fast if any of the four versions can't be found.
- Use a more general pattern for the version token.

For example:

Possible hardening

```diff
-TORCH_VERSION=$(grep -o 'torch==[0-9\.]*' "$INPUT_FILE")
-FAISS_VERSION=$(grep -o 'faiss-cpu==[0-9\.]*' "$INPUT_FILE")
-TIKTOKEN_VERSION=$(grep -o 'tiktoken==[0-9\.]*' "$INPUT_FILE")
-JITER_VERSION=$(grep -o 'jiter==[0-9\.]*' "$INPUT_FILE")
+TORCH_VERSION=$(grep -oE 'torch==[^[:space:]]+' "$INPUT_FILE" || true)
+FAISS_VERSION=$(grep -oE 'faiss-cpu==[^[:space:]]+' "$INPUT_FILE" || true)
+TIKTOKEN_VERSION=$(grep -oE 'tiktoken==[^[:space:]]+' "$INPUT_FILE" || true)
+JITER_VERSION=$(grep -oE 'jiter==[^[:space:]]+' "$INPUT_FILE" || true)
+
+if [[ -z "$TORCH_VERSION" || -z "$FAISS_VERSION" || -z "$TIKTOKEN_VERSION" || -z "$JITER_VERSION" ]]; then
+  echo "Error: could not find all required versions in $INPUT_FILE (torch/faiss-cpu/tiktoken/jiter)" >&2
+  exit 1
+fi
```

This makes failures explicit and preserves full version specifiers.
3-5: Update script description and success message to cover all removed deps.

The script now also handles `faiss-cpu`, `tiktoken`, and `jiter`, but:

- The header comment still says "remove torch dependencies".
- The final echo omits `jiter`.

Recommend updating both to reflect the full scope so future readers aren't surprised about what this script is changing.

Also applies to: 155-155
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (14)
- .tekton/lightspeed-stack-pull-request.yaml
- .tekton/lightspeed-stack-push.yaml
- Containerfile
- Makefile
- artifacts.lock.yaml
- pyproject.toml
- requirements-build.txt
- requirements.aarch64.txt
- requirements.binary.txt
- requirements.hermetic.txt
- requirements.x86_64.txt
- rpms.in.yaml
- rpms.lock.yaml
- scripts/remove_torch_deps.sh
🚧 Files skipped from review as they are similar to previous changes (5)
- Makefile
- requirements.hermetic.txt
- rpms.in.yaml
- pyproject.toml
- requirements-build.txt
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
- requirements.x86_64.txt
- requirements.aarch64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: server mode / azure
- GitHub Check: E2E: library mode / vertexai
- GitHub Check: E2E: server mode / vertexai
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: library mode / azure
🔇 Additional comments (7)
artifacts.lock.yaml (1)
1-7: Googletest artifact lock entry looks consistent.

Schema is minimal but coherent (versioned metadata and a single artifact with matching download_url/filename), which should work fine with the cachi2/hermetic prefetch flow as described.
requirements.aarch64.txt (1)
1983-2074: Pillow 12.1.0 bump + hash refresh: confirm cross-arch consistency and tests.

The Pillow upgrade and regenerated hashes look fine here; please double-check that:
- x86_64 and any other compiled requirement sets are regenerated to the same Pillow version, and
- image‑related code paths (e.g., matplotlib, any Pillow‑using services) have been exercised in CI with this bump.
requirements.binary.txt (1)
3-179: Binary pinning for faiss-cpu, jiter, tiktoken looks good; ensure it's the single source of truth.

Isolating `faiss-cpu==1.13.2`, `jiter==0.12.0`, and `tiktoken==0.12.0` into `requirements.binary.txt` with full hash coverage is a sane way to drive hermetic prefetch of heavyweight binary wheels. Please verify that:

- These three packages are no longer pinned in the regular arch-specific requirement files (only here), to avoid accidental double installation.
- Their versions remain aligned with whatever is declared in `pyproject.toml`/Makefile or other manifests, so future bumps only need a single regeneration of this file.
- The Tekton/cachi2 pipeline paths that previously relied on the main requirements files for these binaries now exclusively point at `requirements.binary.txt`.

Based on learnings, this keeps the intended "faiss-cpu as a top-level pinned dep" pattern while centralizing binaries in one place.
requirements.x86_64.txt (1)
1983-2074: Pillow 12.1.0 bump: please validate against your runtime matrix.

The version jump for `pillow` (12.0.0 → 12.1.0) looks fine here, but it's in the hot path for image handling. Please re-run your image-heavy tests (and any torch/transformers flows that rely on PIL) across your supported Python/OS matrix to confirm there are no regressions or ABI surprises. Also ensure any binary/hermetic manifests remain consistent with this version.
34-64: Well-structured prefetch-input configuration.

The expanded multi-entry JSON structure is well-formed and properly includes the new requirements files (`requirements.binary.txt`, `requirements-build.txt`) along with the binary configuration specifying architectures and packages. The structure aligns with the hermetic build objectives.
36-67: Well-structured prefetch-input configuration.

The expanded JSON structure mirrors the push pipeline configuration and is properly formatted with all necessary entries for hermetic builds including the binary packages configuration.
Containerfile (1)
43-52: Hermetic build logic is well-structured.

The conditional build approach properly handles both hermetic (cachi2) and non-hermetic scenarios:

- Hermetic path: Creates venv from prefetched packages, installs arch-specific requirements from source, then installs binary wheels
- Non-hermetic path: Uses uv sync with locked dependencies

The separation of `--no-binary :all:` for arch-specific packages and `--no-build` for torch/binary packages is appropriate for a hermetic build strategy. All referenced files (`requirements.binary.txt`, `requirements.torch.txt`, and architecture-specific requirement files) are present in the repository.
```yaml
  - name: mode
    value: $(params.hermeto-mode)
```
Critical: Remove unsupported mode parameter from prefetch-dependencies task.
The mode parameter (lines 221-222) is not supported by the prefetch-dependencies-oci-ta:0.2 task. According to Konflux task documentation, the supported parameters are: ACTIVATION_KEY, SOURCE_ARTIFACT, caTrustConfigMapName, caTrustConfigMapKey, config-file-content, dev-package-managers, input, log-level, ociStorage, ociArtifactExpiresAfter, and sbom-type. Passing an unsupported parameter will cause the pipeline to fail at runtime with an "unknown parameter" error.
🔎 Proposed fix
Remove lines 221-222:
```diff
   - name: ociArtifactExpiresAfter
     value: $(params.image-expires-after)
-  - name: mode
-    value: $(params.hermeto-mode)
   runAfter:
```

If the hermeto-mode value needs to be passed to the prefetch task, it should be encoded within one of the supported parameters (e.g., as part of the input JSON structure or config-file-content).
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
   - name: ociArtifactExpiresAfter
     value: $(params.image-expires-after)
-  - name: mode
-    value: $(params.hermeto-mode)
   runAfter:
```
🤖 Prompt for AI Agents
In @.tekton/lightspeed-stack-push.yaml around lines 221-222, Remove the
unsupported "mode" parameter from the prefetch-dependencies task call (the
parameter named mode with value $(params.hermeto-mode)) because the task
prefetch-dependencies-oci-ta:0.2 does not accept it; delete those two lines and,
if you still need to pass hermeto-mode, encode $(params.hermeto-mode) into a
supported parameter such as input (as part of the JSON payload) or
config-file-content so the task receives the value via one of the documented
parameters (e.g., input or config-file-content).
```dockerfile
# cargo for maturin requirements
RUN dnf install -y --nodocs --setopt=keepcache=0 --setopt=tsflags=nodocs gcc cmake cargo
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check if any Python packages in requirements files need cargo/cmake
echo "=== Checking requirements files for packages that need cargo/cmake ==="
for req in requirements*.txt; do
if [ -f "$req" ]; then
echo "File: $req"
cat "$req" | head -30
fi
done
echo -e "\n=== Checking pyproject.toml for maturin-based packages ==="
if [ -f pyproject.toml ]; then
grep -i "maturin\|cargo\|rust" pyproject.toml || echo "No maturin/cargo/rust references"
fi
echo -e "\n=== Checking for cmake requirements ==="
rg -i "cmake" pyproject.toml requirements*.txt 2>/dev/null || echo "No cmake references found"
echo -e "\n=== Listing all requirements files ==="
find . -maxdepth 2 -name "requirements*.txt" -o -name "pyproject.toml" | head -20

Repository: lightspeed-core/lightspeed-stack
Length of output: 10267
Verify that cargo is required; cmake is confirmed needed.
cmake appears justified—it's listed in requirements.hermetic.txt as a required build tool. However, cargo does not appear in any Python dependencies or pyproject.toml. There are no references to maturin, Rust, or cargo in the project dependencies. Confirm whether cargo is genuinely needed or remove it to reduce builder image size.
🤖 Prompt for AI Agents
In @Containerfile around lines 18-19, The Dockerfile RUN line installs both
cargo and cmake but cargo doesn't appear referenced in pyproject.toml,
requirements.hermetic.txt, or anywhere mentioning maturin/Rust; confirm whether
Rust toolchain/maturin is required for any build step (search for "maturin",
"rust", "cargo", or native Rust extensions) and if not, remove "cargo" from the
RUN instruction to shrink the builder image; if cargo is required, keep it and
add a brief comment above the RUN line documenting why cargo is needed and
update project docs (or add a reference in requirements.hermetic.txt) so future
reviewers can see the dependency rationale.
Description
do not merge
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents
Checklist before requesting a review
Testing
Summary by CodeRabbit
New Features
Chores
Fixes