feature: software distribution via docker #22
Conversation
…alog and tool ecosystem

- Added a comprehensive document outlining the new software distribution strategy using digest-pinned, cosign-verified OCI images hosted on GitHub Container Registry.
- Established a clear separation between binary releases and catalog updates to improve release cadence and user experience.
- Defined goals and non-goals for the new system, emphasizing durability, trust, and transparency in tool updates.
- Included architectural considerations inspired by Nanoclaw for improved security and isolation in container management.
- Outlined the repository layout and image hosting strategy to ensure robust and reliable tool distribution.
…d tool catalog structure

- Deleted the outdated `docker_software_distribution.md` file to streamline documentation.
- Introduced a new `catalog/README.md` to outline the structure and functionality of the Pengine Tool Catalog.
- Added `catalog/entries/pengine-file-manager.json` to define the first tool in the catalog with its metadata.
- Created `catalog/v1/tools.json` to aggregate tool information and ensure versioning and digest references for tools.
- Established GitHub Actions workflows for publishing tool images and regenerating the tool catalog upon changes.
…isripution-via-docker

Made-with: Cursor

# Conflicts:
#	src-tauri/src/modules/mcp/native.rs
#	src/modules/toolengine/components/ToolEnginePanel.tsx
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the settings. Use the following commands to manage reviews:
📝 Walkthrough

This pull request extends the tool catalog system with versioning support, adds GitHub Actions automation for publishing tool container images to GHCR with cosign signing, implements streaming logs during tool installation and uninstallation, refactors image resolution to use OCI digest references, updates the JSON-RPC protocol to handle flexible ID types, and enhances the frontend to display installation progress.
Estimated Code Review Effort: 🎯 4 (Complex) | ⏱️ ~50 minutes

The changes involve substantial schema migration across types and manifests, dense image-resolution logic with digest handling, multi-component logging integration with asynchronous callbacks, JSON-RPC protocol flexibility, and new GitHub Actions automation. The interdependencies between the catalog schema, service logic, and UI require careful semantic verification.
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed

❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 1
🧹 Nitpick comments (3)
catalog/README.md (1)
7-15: Add a language specifier to the fenced code block.

The code block should have a language identifier for consistency and to satisfy markdownlint rules.
Suggested fix
````diff
-```
+```text
 catalog/
 ├── entries/   # Source of truth: one JSON file per tool
 │   └── pengine-file-manager.json
 ├── v1/
 │   ├── tools.json       # Aggregated catalog (generated by CI from entries/)
 │   └── tools.json.sig   # Cosign detached signature
 └── README.md
 ```
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@catalog/README.md` around lines 7-15: the fenced code block containing the repository tree lacks a language specifier; update the opening fence from ``` to ```text so the block becomes a "text" fenced code block (locate the tree snippet starting with "catalog/" in README.md and change the fence only).

src-tauri/src/modules/tool_engine/service.rs (1)
246-256: Consider using `VecDeque` for more efficient tail buffering.

The current implementation uses `Vec::remove(0)`, which is O(n). For 50 elements this is negligible, but `VecDeque` with `pop_front()` would be O(1).

Suggested optimization:
```diff
+use std::collections::VecDeque;
 async fn run_streaming_tagged(
     mut cmd: tokio::process::Command,
     log: &LogFn,
     tag: &str,
 ) -> Result<(), String> {
     cmd.stdout(Stdio::null()).stderr(Stdio::piped());
     let mut child = cmd.spawn().map_err(|e| format!("failed to spawn: {e}"))?;
     let stderr = child.stderr.take();
-    let mut stderr_tail: Vec<String> = Vec::new();
+    let mut stderr_tail: VecDeque<String> = VecDeque::with_capacity(50);
     if let Some(se) = stderr {
         let mut lines = BufReader::new(se).lines();
         while let Ok(Some(line)) = lines.next_line().await {
             log(&format!("{tag} {line}"));
-            stderr_tail.push(line);
+            stderr_tail.push_back(line);
             if stderr_tail.len() > 50 {
-                stderr_tail.remove(0);
+                stderr_tail.pop_front();
             }
         }
     }
     // ... rest unchanged, but the join needs a &str iterator:
-    let tail = stderr_tail.join("\n");
+    let tail = stderr_tail.iter().map(String::as_str).collect::<Vec<_>>().join("\n");
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src-tauri/src/modules/tool_engine/service.rs` around lines 246-256: the stderr tail buffer currently uses a Vec named stderr_tail and removes the first element with stderr_tail.remove(0), which is O(n); change stderr_tail's type to std::collections::VecDeque<String>, import VecDeque, and replace the push/trim logic with push_back(line) and, when the deque's len() > 50, pop_front() to trim; update the initialization of stderr_tail and any other usages so the reading loop in the BufReader lines handling uses push_back and pop_front for O(1) tail maintenance.

.github/workflows/catalog-publish.yml (1)
80-90: Consider handling concurrent push failures.

If multiple catalog entry changes are merged in quick succession, the `git push` on line 89 could fail due to a non-fast-forward update. The workflow doesn't retry or rebase in this case.

Suggested fix with retry logic:
```diff
 - name: Commit updated catalog
   run: |
     git config user.name "github-actions[bot]"
     git config user.email "github-actions[bot]@users.noreply.github.com"
     git add catalog/v1/tools.json catalog/v1/tools.json.sig
     if git diff --cached --quiet; then
       echo "No catalog changes to commit"
     else
       git commit -m "chore(catalog): regenerate tools.json (revision $(python3 -c "import json; print(json.load(open('catalog/v1/tools.json'))['catalog_revision'])"))"
-      git push
+      # Retry push with rebase in case of concurrent updates
+      for i in 1 2 3; do
+        git push && break
+        git pull --rebase
+      done
     fi
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/catalog-publish.yml around lines 80 - 90, The workflow's "Commit updated catalog" step currently does a single git push which can fail on non-fast-forward when concurrent updates happen; change that step to retry the push with a simple loop: after committing, attempt git push and if it fails due to non-fast-forward, run git fetch && git pull --rebase (or git pull --rebase origin $GITHUB_REF) to replay the local commit, then retry push up to a small max attempts (e.g., 3); ensure the commit message generation (the python snippet used for the commit - the catalog_revision extraction) is preserved and that the script exits non-zero if all retries fail so the workflow marks the job as failed.
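The retry-with-rebase idea above can be sketched as a runnable snippet. This is a sketch only: `retry_push` and `flaky_push` are hypothetical names introduced for the demo, and the real `git pull --rebase` step is reduced to a comment so the loop runs without a repository.

```shell
# Retry a push-like command up to 3 times, "rebasing" between attempts.
# retry_push/flaky_push are illustrative names, not part of the workflow.
retry_push() {
  for i in 1 2 3; do
    if "$1"; then
      echo "push succeeded on attempt $i"
      return 0
    fi
    # In the real workflow step this would run: git pull --rebase
    echo "attempt $i failed; rebasing and retrying" >&2
  done
  echo "push failed after 3 attempts" >&2
  return 1
}

# Demo: a stand-in "push" that fails twice, then succeeds.
COUNT=0
flaky_push() {
  COUNT=$((COUNT + 1))
  [ "$COUNT" -ge 3 ]
}
retry_push flaky_push   # prints "push succeeded on attempt 3"
```

The bounded loop matters: exiting non-zero after the last attempt is what lets the job fail visibly instead of silently dropping the catalog commit.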
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/tools-publish.yml:
- Around line 65-71: The catalog-to-tool slug mapping is wrong: update the
catalog_slugs extraction so it removes the "pengine-" prefix after stripping the
path and extension (i.e., modify the pipeline that builds catalog_slugs from
changed to also run a transform like removing ^pengine-), ensuring the resulting
slug names match the directory names in the tools/ tree; update the
catalog_slugs variable construction (used later when building slugs and checking
for tools/$slug/$toolfile) so catalog entries like
catalog/entries/pengine-file-manager.json produce "file-manager" to match
tools/file-manager/.
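The mapping fix described above is pure string manipulation, so it can be sketched with shell parameter expansion. Assumptions from the review, not verified against the workflow: entries are named `catalog/entries/pengine-<slug>.json` and release tags look like `<slug>-v<version>`.

```shell
# Sketch of the corrected catalog-entry -> tool-slug mapping.
entry="catalog/entries/pengine-file-manager.json"
name="${entry##*/}"       # strip the path      -> pengine-file-manager.json
name="${name%.json}"      # strip the extension -> pengine-file-manager
slug="${name#pengine-}"   # strip the prefix    -> file-manager
echo "entry slug: $slug"  # prints "entry slug: file-manager"

# Release tags already encode the directory name directly:
tag="file-manager-v0.2.0"
echo "tag slug: ${tag%-v*}"   # prints "tag slug: file-manager"
```

With the `pengine-` prefix stripped, both code paths yield the same slug, so the later `tools/$slug/$toolfile` existence check looks in `tools/file-manager/` as intended.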
---
Nitpick comments:
In @.github/workflows/catalog-publish.yml:
- Around line 80-90: The workflow's "Commit updated catalog" step currently does
a single git push which can fail on non-fast-forward when concurrent updates
happen; change that step to retry the push with a simple loop: after committing,
attempt git push and if it fails due to non-fast-forward, run git fetch && git
pull --rebase (or git pull --rebase origin $GITHUB_REF) to replay the local
commit, then retry push up to a small max attempts (e.g., 3); ensure the commit
message generation (the python snippet used for the commit - the
catalog_revision extraction) is preserved and that the script exits non-zero if
all retries fail so the workflow marks the job as failed.
In `@catalog/README.md`:
- Around line 7-15: The fenced code block containing the repository tree in
README.md lacks a language specifier; update the opening fence from ``` to
```text so the block becomes a "text" fenced code block (locate the tree snippet
starting with "catalog/" in README.md and change the fence only).
In `@src-tauri/src/modules/tool_engine/service.rs`:
- Around line 246-256: The stderr tail buffer currently uses a Vec named
stderr_tail and removes the first element with stderr_tail.remove(0), which is
O(n); change stderr_tail's type to std::collections::VecDeque<String>, import
VecDeque, and replace push/trim logic: push_back(line) and when the deque.len()
> 50 call pop_front() to trim; update the initialization of stderr_tail and any
other usages of stderr_tail (e.g., where it was previously treated as Vec) so
the reading loop in the BufReader lines handling uses push_back and pop_front
for O(1) tail maintenance.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: c4f11096-5f3d-41d9-85cc-b7818aafd7f7
⛔ Files ignored due to path filters (1)
`tools/file-manager/package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (22)
- .github/workflows/catalog-publish.yml
- .github/workflows/tools-publish.yml
- catalog/README.md
- catalog/entries/pengine-file-manager.json
- catalog/v1/tools.json
- src-tauri/.gitignore
- src-tauri/src/infrastructure/http_server.rs
- src-tauri/src/modules/mcp/native.rs
- src-tauri/src/modules/mcp/protocol.rs
- src-tauri/src/modules/mcp/registry.rs
- src-tauri/src/modules/mcp/service.rs
- src-tauri/src/modules/mcp/transport.rs
- src-tauri/src/modules/mcp/types.rs
- src-tauri/src/modules/tool_engine/container/file-manager/build
- src-tauri/src/modules/tool_engine/service.rs
- src-tauri/src/modules/tool_engine/tools.json
- src-tauri/src/modules/tool_engine/types.rs
- src/modules/toolengine/components/ToolEnginePanel.tsx
- tools/file-manager/Dockerfile
- tools/file-manager/package.json
- tools/file-manager/pengine-tool.json
- vite.config.ts
💤 Files with no reviewable changes (4)
- src-tauri/.gitignore
- vite.config.ts
- src-tauri/src/modules/tool_engine/container/file-manager/build
- src-tauri/src/modules/mcp/registry.rs
- Enhanced `catalog/README.md` to include information on manual image publishing and updated instructions for adding new tools.
- Introduced `doc/tool-engine/manual-publish.md` detailing the process for building and pushing tool images manually, including prerequisites and multi-architecture support.
🧹 Nitpick comments (2)
catalog/README.md (2)
23-23: Consider breaking this dense bullet point into sub-bullets.

Line 23 covers four distinct concepts (OCI image building, git diff detection, version tag publishing, and adding new tools) in a single, long bullet point. Breaking it into nested bullets would improve readability and scannability.
♻️ Proposed refactor for improved readability
```diff
-- **OCI images** are built by [`.github/workflows/tools-publish.yml`](../.github/workflows/tools-publish.yml). On each push to `main`, only tools whose `catalog/entries/` or `tools/<slug>/` files changed are rebuilt (detected via `git diff`). Version tags like `file-manager-v0.2.0` publish that tool only. To add a new tool, create a `tools/<slug>/` directory with a `pengine-tool.json` manifest and a `Dockerfile`.
+- **OCI images** are built by [`.github/workflows/tools-publish.yml`](../.github/workflows/tools-publish.yml):
+  - On each push to `main`, only tools whose `catalog/entries/` or `tools/<slug>/` files changed are rebuilt (detected via `git diff`).
+  - Version tags like `file-manager-v0.2.0` publish that tool only.
+  - To add a new tool, create a `tools/<slug>/` directory with a `pengine-tool.json` manifest and a `Dockerfile`.
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@catalog/README.md` at line 23, The single long bullet in README.md should be split into multiple sub-bullets for clarity: create one bullet explaining that OCI images are built by .github/workflows/tools-publish.yml, a second bullet describing the git-diff detection rule that only rebuilds tools when files under catalog/entries/ or tools/<slug>/ change, a third bullet noting how version tags (e.g., file-manager-v0.2.0) publish a specific tool, and a fourth bullet describing how to add a new tool (create tools/<slug>/ with pengine-tool.json and Dockerfile); update the README.md bullet to use nested bullets so each concept (workflow, detection rule, version tags, add-new-tool steps) is its own short line for readability.
7-15: Consider adding a language identifier to the fenced code block.

The directory structure visualization uses a fenced code block without a language specifier. While not critical for a tree structure, adding `text` after the opening backticks improves markdown-linting compliance.

📝 Proposed fix:
````diff
-```
+```text
 catalog/
 ├── entries/   # Source of truth: one JSON file per tool
 │   └── pengine-file-manager.json
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@catalog/README.md` around lines 7 - 15, Update the fenced code block in README.md to include a language identifier (e.g., add "text" after the opening ```), so the tree diagram block becomes ```text ... ```, which fixes markdown linting; locate the block showing the catalog/ directory structure in catalog/README.md and add the language token to the opening fence.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@catalog/README.md`:
- Line 23: The single long bullet in README.md should be split into multiple
sub-bullets for clarity: create one bullet explaining that OCI images are built
by .github/workflows/tools-publish.yml, a second bullet describing the git-diff
detection rule that only rebuilds tools when files under catalog/entries/ or
tools/<slug>/ change, a third bullet noting how version tags (e.g.,
file-manager-v0.2.0) publish a specific tool, and a fourth bullet describing how
to add a new tool (create tools/<slug>/ with pengine-tool.json and Dockerfile);
update the README.md bullet to use nested bullets so each concept (workflow,
detection rule, version tags, add-new-tool steps) is its own short line for
readability.
- Around line 7-15: Update the fenced code block in README.md to include a
language identifier (e.g., add "text" after the opening ```), so the tree
diagram block becomes ```text ... ```, which fixes markdown linting; locate the
block showing the catalog/ directory structure in catalog/README.md and add the
language token to the opening fence.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2bca2288-9478-4187-af28-5ab96962939c
📒 Files selected for processing (2)
- catalog/README.md
- doc/tool-engine/manual-publish.md
- Removed the `mcp_server_cmd` array from multiple tool configuration files, setting it to an empty array to prevent issues with server command execution.
- Updated documentation to reflect changes in command handling for tools, ensuring clarity on the expected configuration for successful tool operation.
Actionable comments posted: 1
🧹 Nitpick comments (1)
doc/tool-engine/manual-publish.md (1)
107-108: Explicitly use `--all` on manifest push for clarity and official best practices.

While `podman manifest push` defaults to pushing all referenced images in the manifest list (`--all` defaults to `true`), the official Podman documentation explicitly recommends the `--all` flag when pushing multi-arch manifests. This makes the intent clear and prevents confusion about whether all platform images are being pushed.

Suggested command update:
```diff
-podman manifest push "$MANIFEST" "docker://${IMAGE}:${VERSION}"
-podman manifest push "$MANIFEST" "docker://${IMAGE}:latest"
+podman manifest push --all "$MANIFEST" "docker://${IMAGE}:${VERSION}"
+podman manifest push --all "$MANIFEST" "docker://${IMAGE}:latest"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@doc/tool-engine/manual-publish.md` around lines 107 - 108, The podman manifest push commands (podman manifest push "$MANIFEST" "docker://${IMAGE}:${VERSION}" and podman manifest push "$MANIFEST" "docker://${IMAGE}:latest") should explicitly include the --all flag; update both uses of podman manifest push to add --all so the intent to push all referenced platform images is explicit and matches Podman best practices.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@doc/tool-engine/manual-publish.md`:
- Around line 83-88: Reword the paragraph around the podman untag snippet to
clarify that podman untag only removes local tags (it does not delete or modify
remote images on GHCR); mention that the commands `podman untag
"${IMAGE}:${VERSION}"` and `podman untag "${IMAGE}:latest"` operate on the local
daemon/manifest store and that removal from the remote registry must be done via
the registry UI/API or `crane`/`gh`/registry-specific tooling if desired. Ensure
the text references `podman untag`, `${IMAGE}`, `${VERSION}`, and GHCR so
readers understand the local vs remote scope.
---
Nitpick comments:
In `@doc/tool-engine/manual-publish.md`:
- Around line 107-108: The podman manifest push commands (podman manifest push
"$MANIFEST" "docker://${IMAGE}:${VERSION}" and podman manifest push "$MANIFEST"
"docker://${IMAGE}:latest") should explicitly include the --all flag; update
both uses of podman manifest push to add --all so the intent to push all
referenced platform images is explicit and matches Podman best practices.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 45021722-4179-477b-8857-1a6caf17cad6
📒 Files selected for processing (6)
- catalog/entries/pengine-file-manager.json
- catalog/v1/tools.json
- doc/tool-engine/manual-publish.md
- src-tauri/src/modules/mcp/service.rs
- src-tauri/src/modules/tool_engine/tools.json
- tools/file-manager/pengine-tool.json
✅ Files skipped from review due to trivial changes (4)
- tools/file-manager/pengine-tool.json
- catalog/v1/tools.json
- catalog/entries/pengine-file-manager.json
- src-tauri/src/modules/mcp/service.rs
🚧 Files skipped from review as they are similar to previous changes (1)
- src-tauri/src/modules/tool_engine/tools.json
- Updated `.github/workflows/tools-publish.yml` to read the upstream MCP npm package and version from the catalog, allowing dynamic installation during image builds.
- Modified `catalog/entries/pengine-file-manager.json` and `catalog/v1/tools.json` to include `upstream_mcp_npm` metadata for the file manager tool.
- Adjusted `tools/file-manager/Dockerfile` to utilize build arguments for npm package installation, ensuring consistency with catalog entries.
- Enhanced documentation in `catalog/README.md` and `doc/tool-engine/manual-publish.md` to reflect changes in tool addition and image publishing processes.
Actionable comments posted: 7
♻️ Duplicate comments (1)
doc/tool-engine/manual-publish.md (1)
84-89: ⚠️ Potential issue | 🟡 Minor

Clarify that `podman untag` only changes the local image store.

This still reads like the commands free the tag in GHCR. `podman untag "${IMAGE}:${VERSION}"` and `podman untag "${IMAGE}:latest"` only remove local references; remote cleanup is separate.

Suggested wording:
```diff
-If **`${IMAGE}:${VERSION}`** was already used for a single-arch image, free the name before creating a manifest list:
+If your **local** Podman image store already has `${IMAGE}:${VERSION}` or `${IMAGE}:latest` tagged as a single-arch image, remove those local tags before creating the manifest list:
```

Does `podman untag IMAGE:TAG` remove tags from a remote registry like GHCR, or only from the local Podman image store?

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@doc/tool-engine/manual-publish.md` around lines 84 - 89, The existing text implies podman untag frees the tag in GHCR; update the sentence around the podman untag example (the lines with podman untag "${IMAGE}:${VERSION}" and podman untag "${IMAGE}:latest") to explicitly state these commands only remove local references from the Podman image store and do not affect tags or manifests in remote registries like GHCR, and add a short note that remote cleanup must be performed separately (e.g., via registry UI or registry-specific API/CLI).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/tools-publish.yml:
- Around line 170-172: The workflow currently always adds the :latest tag
alongside the versioned tag (using steps.cfg.outputs.image and
steps.version.outputs.version), which causes historical tag publishes to retag
latest; fix by computing the list of tags in a dedicated step (e.g., create a
"compute-tags" step) that emits a tags output and only includes "${{
steps.cfg.outputs.image }}:latest" when the publish is not for a git tag or when
the published version matches the repo/catalog current version (use the existing
tag value from steps.version.outputs.version and the catalog/current output if
available to decide); then use that computed tags output in the job instead of
unconditionally appending :latest.
- Around line 60-71: The workflow currently only validates that a tag maps to a
known slug; add a guard that verifies the tag actually points at main by
fetching the remote main branch and comparing the tag commit SHA to origin/main
(or testing whether the tag commit is an ancestor/equal to origin/main) before
proceeding; locate the tag/slug logic (variables tag and slug in the elif
branch) and, after setting slug and before setting slugs="$slug", fetch origin
main (git fetch origin main --depth=1) and compare commits (e.g., get tag_sha
for "$tag" and main_sha for origin/main or use git merge-base --is-ancestor) and
if the tag does not point at main echo the existing skip output and exit 0.
- Around line 147-156: Currently the workflow silently falls back to
PKG_DEFAULT/VER_DEFAULT when the catalog lacks upstream_mcp_npm, which can cause
wrong publishes; update the block that reads CATALOG (variables PKG_DEFAULT,
VER_DEFAULT, PKG, VER and the jq calls) to instead treat a missing or empty
(.upstream_mcp_npm | .package or .version) as a metadata error: after running
the jq extraction, if either PKG or VER is empty (and CATALOG exists), write a
clear error to stderr including the CATALOG path and exit 1 (do not assign
defaults), otherwise proceed using the extracted PKG and VER.
In `@catalog/README.md`:
- Around line 7-15: Update the fenced code block that contains the directory
tree starting with "catalog/" to specify a language by changing the opening
fence from ``` to ```text so markdownlint stops flagging it; locate the tree
block in README.md (the block showing "catalog/ ├── entries/ ...") and prepend
"text" to the opening backticks.
In `@src-tauri/src/modules/tool_engine/service.rs`:
- Around line 377-384: The uninstall currently resolves the image from the
latest catalog entry (via image_reference(entry)) which may differ from the
image actually installed; instead, read the installed image ref from the stored
ServerEntry::Stdio.args in the catalog/installed metadata (the entry that
corresponds to the given tool_id) before removing the server entry and use that
value when invoking runtime.binary args ["rmi", <image_ref>] to delete the
actual local image; only fall back to image_reference(entry) if no installed
ServerEntry::Stdio.args image ref is present. Ensure you locate this logic
around load_catalog, catalog.tools iteration and the
Command::new(&runtime.binary) rmi invocation.
In `@src-tauri/src/modules/tool_engine/tools.json`:
- Around line 14-23: The versions entry in tools.json contains "digest":
"sha256:placeholder" which causes service.rs to treat it as a tag instead of a
pinned image; replace that placeholder with the real SHA-256 digest for the ghcr
image that corresponds to version "0.1.0" (format "sha256:<hex>") so the tool is
resolved by digest, or if you can't provide the digest yet remove/hold this
version entry; update the "digest" field only (leave "version" and "released_at"
intact) and ensure the hex is the full 64-character lowercase SHA-256 string.
- Line 18: The released_at timestamp in the tools.json entry is later than the
catalog snapshot date, causing an internal history inconsistency; update the
"released_at" field in the tools.json entry (the released_at property) to be on
or before the catalog snapshot date (e.g., change "2026-04-15T00:00:00Z" to
"2026-04-12T00:00:00Z" or another appropriate date ≤ snapshot) or alternatively
update the catalog snapshot date to match if the release date is authoritative.
---
Duplicate comments:
In `@doc/tool-engine/manual-publish.md`:
- Around line 84-89: The existing text implies podman untag frees the tag in
GHCR; update the sentence around the podman untag example (the lines with podman
untag "${IMAGE}:${VERSION}" and podman untag "${IMAGE}:latest") to explicitly
state these commands only remove local references from the Podman image store
and do not affect tags or manifests in remote registries like GHCR, and add a
short note that remote cleanup must be performed separately (e.g., via registry
UI or registry-specific API/CLI).
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: c1b39763-5551-45f9-8fbc-589b57ef9e8a
📒 Files selected for processing (10)
- .github/workflows/tools-publish.yml
- catalog/README.md
- catalog/entries/pengine-file-manager.json
- catalog/v1/tools.json
- doc/tool-engine/manual-publish.md
- src-tauri/src/modules/tool_engine/service.rs
- src-tauri/src/modules/tool_engine/tools.json
- src-tauri/src/modules/tool_engine/types.rs
- tools/file-manager/Dockerfile
- tools/file-manager/pengine-tool.json
✅ Files skipped from review due to trivial changes (4)
- tools/file-manager/Dockerfile
- tools/file-manager/pengine-tool.json
- catalog/v1/tools.json
- catalog/entries/pengine-file-manager.json
```shell
elif [[ "${{ github.ref_type }}" == "tag" ]]; then
  # Tag like "file-manager-v0.2.0" → slug "file-manager"
  tag="${{ github.ref_name }}"
  slug="${tag%-v*}"
  if echo "$all_slugs" | grep -qx "$slug"; then
    slugs="$slug"
  else
    echo "Tag '$tag' does not match any tool (marker + Dockerfile + catalog)"
    echo 'skip=true' >> "$GITHUB_OUTPUT"
    echo 'matrix=[]' >> "$GITHUB_OUTPUT"
    exit 0
  fi
```
There was a problem hiding this comment.
Reject release tags that do not point at main.
This branch only checks that the tag name maps to a known slug. A collaborator can still tag an unmerged or otherwise unreviewed commit and this workflow will build, sign, and publish a trusted image from it.
Suggested guard
```diff
 - uses: actions/checkout@v4
+- name: Ensure release tag points at main
+  if: github.ref_type == 'tag'
+  run: |
+    git fetch origin main --depth=1
+    git merge-base --is-ancestor "$GITHUB_SHA" origin/main || {
+      echo "Refusing to publish from a tag that is not reachable from main"
+      exit 1
+    }
```
+ }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/tools-publish.yml around lines 60 - 71, The workflow
currently only validates that a tag maps to a known slug; add a guard that
verifies the tag actually points at main by fetching the remote main branch and
comparing the tag commit SHA to origin/main (or testing whether the tag commit
is an ancestor/equal to origin/main) before proceeding; locate the tag/slug
logic (variables tag and slug in the elif branch) and, after setting slug and
before setting slugs="$slug", fetch origin main (git fetch origin main
--depth=1) and compare commits (e.g., get tag_sha for "$tag" and main_sha for
origin/main or use git merge-base --is-ancestor) and if the tag does not point
at main echo the existing skip output and exit 0.
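The ancestor check at the heart of this guard can be exercised against a throwaway repository. A sketch under stated assumptions: it needs `git` ≥ 2.28 (for `init -b`), the tag names are invented for the demo, and the local branch `main` stands in for the fetched `origin/main`.

```shell
# Demonstrate `git merge-base --is-ancestor` as a publish guard:
# a tag is accepted only if its commit is reachable from main.
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b main .
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "on main"
git tag good-v1.0.0                # points at a commit on main
git checkout -q -b side
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "unmerged"
git tag rogue-v1.0.0               # points at an unmerged commit

check_tag() {
  if git merge-base --is-ancestor "$1^{commit}" main; then
    echo "$1: ok to publish"
  else
    echo "$1: refusing to publish"
  fi
}
check_tag good-v1.0.0    # prints "good-v1.0.0: ok to publish"
check_tag rogue-v1.0.0   # prints "rogue-v1.0.0: refusing to publish"
```

`--is-ancestor` also accepts a tag that points exactly at the branch tip, which is the common release case.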
```json
"versions": [
  {
    "version": "0.1.0",
    "digest": "sha256:placeholder",
    "released_at": "2026-04-15T00:00:00Z",
    "yanked": false,
    "revoked": false,
    "security": false
  }
],
```
Replace the placeholder digest before shipping this catalog.
`service.rs` now treats `sha256:placeholder` as "pull by tag", so this entry resolves to `ghcr.io/...:0.1.0` instead of a pinned digest. That makes installs mutable and bypasses the digest-based allowlist this PR is adding.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src-tauri/src/modules/tool_engine/tools.json` around lines 14 - 23, The
versions entry in tools.json contains "digest": "sha256:placeholder" which
causes service.rs to treat it as a tag instead of a pinned image; replace that
placeholder with the real SHA-256 digest for the ghcr image that corresponds to
version "0.1.0" (format "sha256:<hex>") so the tool is resolved by digest, or if
you can't provide the digest yet remove/hold this version entry; update the
"digest" field only (leave "version" and "released_at" intact) and ensure the
hex is the full 64-character lowercase SHA-256 string.
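The tag-fallback behavior being flagged can be sketched in shell. Illustrative only: the placeholder sentinel and the `image@digest` vs `image:tag` forms come from the review, while the function and variable names are assumptions.

```shell
# Resolve a container reference: pin by digest when one is available,
# fall back to a mutable tag when the digest is the placeholder sentinel.
resolve_ref() {
  image="$1"; version="$2"; digest="$3"
  if [ -z "$digest" ] || [ "$digest" = "sha256:placeholder" ]; then
    echo "${image}:${version}"   # mutable tag: what the placeholder causes today
  else
    echo "${image}@${digest}"    # immutable, digest-pinned reference
  fi
}

resolve_ref ghcr.io/pengine/file-manager 0.1.0 sha256:placeholder
# prints "ghcr.io/pengine/file-manager:0.1.0"
resolve_ref ghcr.io/pengine/file-manager 0.1.0 \
  sha256:9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08
# prints "ghcr.io/pengine/file-manager@sha256:9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
```

This is why shipping the placeholder matters: every install through the first branch can be silently changed by re-pushing the tag, whereas a digest reference names exactly one image.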
- Deleted the `catalog/README.md`, `catalog/entries/pengine-file-manager.json`, and `catalog/v1/tools.json` files to streamline the project structure.
- Removed the `catalog-publish.yml` workflow as it is no longer needed.
- Updated `tools-publish.yml` to reflect changes in tool management and publishing processes.
🧹 Nitpick comments (3)
.github/workflows/tools-publish.yml (1)
36-38: Consider increasing `fetch-depth` for initial commits.

With `fetch-depth: 2`, `git diff HEAD~1 HEAD` at line 62 will fail on the very first commit of a new repo (no parent). The `|| true` fallback on line 62 silently treats this as "no changes," which could skip builds unexpectedly.

Suggested improvement:
```diff
 - uses: actions/checkout@v4
   with:
-    fetch-depth: 2
+    fetch-depth: 0
```

Alternatively, handle the edge case explicitly in the script by checking whether `HEAD~1` exists.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/tools-publish.yml around lines 36-38: the workflow uses actions/checkout@v4 with fetch-depth: 2, which causes git diff HEAD~1 HEAD (used later) to fail on an initial commit; either increase fetch-depth to fetch full history (or at least a depth that includes the parent) by updating the actions/checkout step, or modify the script that runs git diff HEAD~1 HEAD to explicitly detect whether HEAD~1 exists (e.g., test git rev-parse --verify HEAD~1) and handle the initial-commit case instead of relying on the "|| true" fallback; update the actions/checkout step and the git diff invocation so the workflow reliably detects changes on the first commit.

src-tauri/src/modules/tool_engine/service.rs (2)
251-254: Minor: Consider `VecDeque` for the rolling buffer.

`Vec::remove(0)` is O(n). For a 50-element buffer this is negligible, but `VecDeque` would give O(1) push/pop from both ends if you want to optimize.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src-tauri/src/modules/tool_engine/service.rs` around lines 251 - 254, stderr_tail currently uses Vec and calls stderr_tail.remove(0) to maintain a rolling 50-line buffer; change it to a VecDeque to get O(1) front pops: update the stderr_tail declaration to std::collections::VecDeque<String> (or whatever element type is used), import VecDeque, replace stderr_tail.push(...) with stderr_tail.push_back(...), and replace stderr_tail.remove(0) with stderr_tail.pop_front(); ensure any other places that index into stderr_tail are adjusted to use VecDeque methods or convert to a slice/Vec when needed.
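The suggested swap can be sketched as a standalone example (the buffer size and function name are taken from the review; this is not the actual `service.rs` code):

```rust
use std::collections::VecDeque;

const TAIL_LEN: usize = 50;

/// Push a line onto a rolling tail buffer, dropping the oldest line once
/// the buffer is full. pop_front() is O(1), unlike Vec::remove(0).
fn push_tail(tail: &mut VecDeque<String>, line: String) {
    if tail.len() == TAIL_LEN {
        tail.pop_front();
    }
    tail.push_back(line);
}

fn main() {
    let mut tail = VecDeque::new();
    for i in 0..60 {
        push_tail(&mut tail, format!("line {i}"));
    }
    // After 60 pushes the buffer holds the last 50 lines: 10..=59.
    assert_eq!(tail.len(), 50);
    assert_eq!(tail.front().map(String::as_str), Some("line 10"));
    println!("ok");
}
```

For a fixed 50-line stderr tail the asymptotic difference is invisible, which is why the review flags this only as a nitpick.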
450-474: Test depends on catalog state.

This test relies on `pengine/file-manager` having `sha256:placeholder` in the embedded catalog. When the placeholder is replaced with a real digest, this test will fail. Consider adding a note or making the test more explicit about this dependency.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src-tauri/src/modules/tool_engine/service.rs` around lines 450 - 474, The test placeholder_digest_uses_tagged_image_in_argv depends on the embedded catalog containing ver.digest == "sha256:placeholder" (lookup in load_catalog(), fm id "pengine/file-manager", and podman_run_argv_for_tool), which will break when real digests replace the placeholder; update the test in placeholder_digest_uses_tagged_image_in_argv to make the dependency explicit by first reading ver.digest and branching: if ver.digest == "sha256:placeholder" assert the argv contains the tagged image (format!("{}:{}", fm.image, fm.current)) and no '@', otherwise assert the argv contains the image with the actual digest (image@digest) or skip the placeholder-specific assertions (or mark the test as ignored) so the test remains stable regardless of catalog state.
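The branching the prompt describes (tagged image while the digest is still the placeholder, digest-pinned reference otherwise) can be sketched as follows; the function and parameter names are assumptions based on the review text, not the real `service.rs` API:

```rust
/// Build the image reference podman should pull: fall back to the version
/// tag while the catalog still carries the placeholder digest, otherwise
/// pin by digest. Illustrative names, not the actual service code.
fn image_reference(image: &str, version: &str, digest: &str) -> String {
    if digest == "sha256:placeholder" {
        format!("{image}:{version}")
    } else {
        format!("{image}@{digest}")
    }
}

fn main() {
    assert_eq!(
        image_reference("ghcr.io/pengine/file-manager", "0.1.0", "sha256:placeholder"),
        "ghcr.io/pengine/file-manager:0.1.0"
    );
    // Once a real digest lands, the reference switches to image@digest.
    let pinned = image_reference("ghcr.io/pengine/file-manager", "0.1.0", "sha256:deadbeef");
    assert!(pinned.contains('@'));
    println!("ok");
}
```

A test written against a helper like this stays stable regardless of which state the catalog is in, which is the point of the review comment.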
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 681ea0cf-56a0-4617-8ae0-64e8c95d56e4
📒 Files selected for processing (6)
- .github/workflows/tools-publish.yml
- doc/tool-engine/manual-publish.md
- src-tauri/src/modules/tool_engine/service.rs
- src-tauri/src/modules/tool_engine/tools.json
- src-tauri/src/modules/tool_engine/types.rs
- tools/file-manager/Dockerfile
✅ Files skipped from review due to trivial changes (1)
- tools/file-manager/Dockerfile
- Introduced `mcp-tools.json` as the central registry for tool metadata, replacing individual tool manifests.
- Updated the CI workflow in `.github/workflows/tools-publish.yml` to utilize the new registry for building and publishing tool images.
- Refactored tool loading functions to support asynchronous fetching of the remote catalog, with a fallback to the embedded catalog.
- Enhanced documentation in `doc/tool-engine/manual-publish.md` to reflect the new registry structure and update processes.
- Added a script `tools/update-upstream.sh` for managing upstream npm package versions directly in the registry.
- Removed deprecated `pengine-tool.json` files from individual tool directories to streamline the project structure.
- Introduced `custom-mcp-tools.md` to provide a comprehensive guide on adding custom MCP tools via `mcp.json`, including examples for `stdio` and `native` types.
- Implemented new API endpoints for managing custom tools, allowing users to list, add, and remove custom Docker images as MCP tools.
- Enhanced the MCP service to support custom tool entries in the configuration, ensuring they are properly registered and managed.
- Updated the frontend components to facilitate the addition of custom tools through a user-friendly interface.
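To make the idea concrete, a custom `stdio` tool entry might look something like the following. Every key name here is a hypothetical sketch of the shape such an entry could take; the authoritative schema is the one documented in `custom-mcp-tools.md`:

```json
{
  "tools": {
    "my-custom-tool": {
      "type": "stdio",
      "image": "ghcr.io/example/my-custom-tool:latest"
    }
  }
}
```

The `native` type described in the guide would presumably differ in how the tool process is launched; consult the guide rather than this sketch for the exact fields.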
- Introduced several Bash scripts in `.github/scripts/tools-publish/` to streamline the tool image publishing process.
  - `compute-image-tags.sh`: Generates image tags for Docker builds based on the current branch or tag.
  - `detect-matrix.sh`: Identifies tools to publish based on changes in the repository and outputs a JSON matrix.
  - `read-tool-manifest.sh`: Extracts tool metadata from the registry and writes it to GITHUB_OUTPUT.
  - `smoke-test-mcp.sh`: Performs a smoke test on the MCP initialization process using the published image.
  - `write-publish-summary.sh`: Appends a summary of the published tool to the GitHub Actions step summary.
- Updated the CI workflow in `tools-publish.yml` to utilize these scripts for improved automation and clarity in the publishing process.
Summary by CodeRabbit
New Features
Improvements
Documentation