
feat: integrate Tauri event bridge and refactor MCP tools management#19

Merged
maximedogawa merged 3 commits into main from 18-feature-docker-management-via-messaging on Apr 12, 2026

Conversation


@maximedogawa maximedogawa commented Apr 12, 2026

  • Added `initTauriRegistryBridge` to connect Tauri events with browser events for MCP tools.
  • Introduced a `useRegistryChanged` hook to simplify registry change handling in `McpToolsPanel` and `ToolEnginePanel`.
  • Updated the `tool_manager` integration in the MCP service to ensure consistent tool management.
  • Improved error handling and async support in the tool-management functions.
  • Refactored tests to support async operations for improved reliability.

Summary by CodeRabbit

  • New Features

    • Added a tool manager to list, install, and uninstall MCP tools from the app.
    • App now emits registry-change notifications to the UI and initializes a Tauri→browser bridge on startup.
  • UX / Sync

    • UI panels auto-reload when the MCP registry changes for real-time updates.
    • Tool runtime panel refreshes state after install/uninstall failures.
  • Tests

    • Integration tests updated to async to match async tool calls.

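The `useRegistryChanged` hook itself is thin; the essence is a subscribe/unsubscribe pair that `useEffect` can consume directly as setup and cleanup. A hypothetical sketch (the event name follows the PR; the helper name is invented, and the target is injectable so the sketch runs without a browser `window`):

```typescript
const PENGINE_MCP_REGISTRY_CHANGED = "PENGINE_MCP_REGISTRY_CHANGED";

// Subscribe to registry-changed notifications; returns an unsubscribe
// function suitable as a useEffect cleanup.
export function subscribeRegistryChanged(
  target: EventTarget, // `window` in the app; injectable for testing
  callback: () => void,
): () => void {
  const handler = () => callback();
  target.addEventListener(PENGINE_MCP_REGISTRY_CHANGED, handler);
  return () =>
    target.removeEventListener(PENGINE_MCP_REGISTRY_CHANGED, handler);
}
```

A hook built on this would be roughly `useEffect(() => subscribeRegistryChanged(window, callback), [callback])`, with the returned unsubscriber doubling as the effect cleanup.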
@maximedogawa maximedogawa self-assigned this Apr 12, 2026
@maximedogawa maximedogawa linked an issue Apr 12, 2026 that may be closed by this pull request

coderabbitai Bot commented Apr 12, 2026

📝 Walkthrough

Adds a native tool_manager MCP provider (list/install/uninstall) and makes native providers async behind a kind discriminator. The backend now emits a registry-changed event through Tauri, a new Tauri→browser bridge forwards it to the UI, and a React hook subscribes to it; call sites and tests are updated for the async providers.

Changes

Cohort / File(s): Summary

  • Configuration (src-tauri/mcp.example.json): Added a servers.tool_manager native server entry.
  • Native provider & tools (src-tauri/src/modules/mcp/native.rs): Refactored NativeProvider to use NativeKind (Dice
  • Registry / service (src-tauri/src/modules/mcp/registry.rs, src-tauri/src/modules/mcp/service.rs): Await native provider calls in the registry, thread AppState into connect_one_server, emit pengine-registry-changed after a successful rebuild, and auto-insert tool_manager into the config (persisted on rebuild when missing).
  • Tests (src-tauri/tests/mcp_tools.rs): Converted three dice tool tests to #[tokio::test] async fn and awaited provider calls.
  • Frontend bridge & hook (src/shared/mcpEvents.ts, src/shared/useRegistryChanged.ts): Added initTauriRegistryBridge() to forward Tauri registry events to window events; added a useRegistryChanged(callback) hook that subscribes to PENGINE_MCP_REGISTRY_CHANGED.
  • Frontend usage (src/main.tsx, src/modules/mcp/components/McpToolsPanel.tsx, src/modules/toolengine/components/ToolEnginePanel.tsx): Initialize the bridge at app startup, replace inline window listener logic with useRegistryChanged() in the panels, and adjust ToolEnginePanel to conditionally refresh after install/uninstall failures.
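The bridge in src/shared/mcpEvents.ts is the connective tissue between the Rust backend and the React panels. A minimal sketch of the pattern, with the Tauri `listen` function injected so it runs outside a Tauri shell (the event names follow the PR; everything else here is a hypothetical stand-in, not the actual implementation):

```typescript
// Sketch of the Tauri→browser bridge: forward a backend-emitted Tauri event
// to a DOM-style event that UI code can subscribe to. The `listen` function
// is injected; in the app it would come from @tauri-apps/api/event.
type Unlisten = () => void;
type Listen = (event: string, handler: () => void) => Promise<Unlisten>;

const TAURI_REGISTRY_CHANGED = "pengine-registry-changed";
const PENGINE_MCP_REGISTRY_CHANGED = "PENGINE_MCP_REGISTRY_CHANGED";

// Stand-in for `window` so the sketch is runnable anywhere.
export const eventTarget = new EventTarget();

let bridgePromise: Promise<Unlisten> | null = null;

export function initTauriRegistryBridge(listen: Listen): Promise<Unlisten> {
  // Idempotent: repeated calls reuse the pending/settled registration,
  // so double-initialization cannot produce duplicate notifications.
  if (bridgePromise) return bridgePromise;
  bridgePromise = listen(TAURI_REGISTRY_CHANGED, () => {
    eventTarget.dispatchEvent(new Event(PENGINE_MCP_REGISTRY_CHANGED));
  }).catch((error) => {
    bridgePromise = null; // allow a retry; likely running outside Tauri
    throw error;
  });
  return bridgePromise;
}
```

Caching the promise (rather than a boolean flag) also closes the race where two early callers both observe "not initialized" before the async `listen` registration resolves.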

Sequence Diagram(s)

sequenceDiagram
    participant FE as React Frontend
    participant Bridge as Tauri Bridge
    participant Service as MCP Service
    participant Native as NativeProvider (tool_manager)
    participant ToolEngine as Tool Engine Service

    FE->>Bridge: user triggers install/uninstall via UI
    Bridge->>Service: request action / rebuild_registry_into_state
    Service->>Native: connect_one_server(..., tool_manager) / call(action)
    Native->>ToolEngine: install_tool() or uninstall_tool()
    ToolEngine-->>Native: operation result
    Native-->>Service: return Ok/Err
    Service->>Service: rebuild_registry_into_state()
    Service->>Bridge: emit `pengine-registry-changed`
    Bridge-->>FE: forwards as window `PENGINE_MCP_REGISTRY_CHANGED`
    FE->>FE: useRegistryChanged callback reloads UI

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes


Poem

🐰
Tools hop in and out with tiny cheer,
Events scurry over, whispered in my ear,
I nudge the registry, give a little thump,
Install, list, uninstall — then back to my stump!

🚥 Pre-merge checks: ✅ 3 passed

  • Description Check: ✅ Passed. Skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title accurately describes the main changes: integrating a Tauri event bridge and refactoring MCP tools management across the codebase.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 88.46%, above the required threshold of 80.00%.



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (3)
src/modules/toolengine/components/ToolEnginePanel.tsx (1)

56-57: Avoid duplicate refreshes after local install/uninstall actions.

With the new listener at Line 56, successful actions can trigger loadData() twice (once via dispatched registry event, once via explicit handler call). Consider using a single refresh path to reduce redundant fetches.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/modules/toolengine/components/ToolEnginePanel.tsx` around lines 56 - 57,
The new useRegistryChanged listener causes loadData to run twice for local
install/uninstall flows (once from the dispatched registry event and once from
the explicit success handler); fix by consolidating into a single refresh
path—either remove the explicit handler call and rely on useRegistryChanged to
invoke loadData, or add a guard/flag (e.g., a local boolean or a requestId) that
the explicit handler sets (or the registry event carries) so loadData
short-circuits when the refresh was already triggered; update references in
ToolEnginePanel around useRegistryChanged and the install/uninstall success
handlers so only one codepath calls loadData.
src-tauri/src/modules/mcp/native.rs (1)

144-167: Consider holding mcp_config_mutex when reading installed tool IDs.

installed_tool_ids at line 146 reads the config file without acquiring mcp_config_mutex. If a concurrent install/uninstall is mid-write, this could read a partial or inconsistent file. While the worst case is an incorrect "installed" status display (not data corruption), acquiring the mutex would ensure consistency.

That said, this is a read-only listing for display purposes, so the risk is low. Mark as optional improvement.

🔧 Optional: Acquire config mutex for consistent reads
 async fn handle_list_tools(state: &AppState) -> Result<String, String> {
     let catalog = tool_engine_service::load_catalog()?;
+    let _cfg_guard = state.mcp_config_mutex.lock().await;
     let installed = tool_engine_service::installed_tool_ids(&state.mcp_config_path);
     let installed_set: HashSet<&str> = installed.iter().map(|s| s.as_str()).collect();
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/modules/mcp/native.rs` around lines 144 - 167, The list-tools
handler (handle_list_tools) calls tool_engine_service::installed_tool_ids
without holding the application's mcp_config_mutex, risking reading a
concurrently-written config; acquire the AppState's mcp_config_mutex (or
otherwise lock the same mutex used by install/uninstall) around the
installed_tool_ids call to ensure a consistent read, then release the mutex
before proceeding to build the catalog output. Use the state.mcp_config_mutex
lock guard to protect the read so it matches the install/uninstall protection.
src-tauri/src/modules/mcp/service.rs (1)

266-275: Consider logging a warning if config save fails during auto-insert.

Line 274 silently discards the save_config result. If the save fails, the in-memory config will include tool_manager but the disk won't reflect this, causing repeated auto-insert attempts on each rebuild. This is harmless but slightly wasteful.

🔧 Optional: Log on save failure
             );
-            let _ = save_config(&state.mcp_config_path, &cfg);
+            if let Err(e) = save_config(&state.mcp_config_path, &cfg) {
+                // Non-fatal: tool_manager will be re-added on next rebuild
+                tracing::warn!("failed to persist tool_manager auto-add: {e}");
+            }
         }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/modules/mcp/service.rs` around lines 266 - 275, The code
silently ignores save_config result when auto-inserting the tool manager; update
the block that inserts native::TOOL_MANAGER_ID (and the subsequent call to
save_config(&state.mcp_config_path, &cfg)) to check the Result and log a warning
on Err (including the error and context such as native::TOOL_MANAGER_ID and
state.mcp_config_path) so failures are visible; keep the insertion in-memory but
emit a process- or module-level warning via the existing logger or eprintln! to
aid debugging (refer to save_config, cfg, native::TOOL_MANAGER_ID,
ServerEntry::Native, and state.mcp_config_path).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/shared/mcpEvents.ts`:
- Around line 16-26: initTauriRegistryBridge currently creates a new Tauri event
listener on every call, causing duplicate notifications; make it idempotent by
introducing a module-level boolean (e.g., tauriBridgeInitialized) and return
early if true, then only call import(...).then(listen...) when not initialized
and set tauriBridgeInitialized = true after successful listen; reference the
existing symbols TAURI_REGISTRY_CHANGED and notifyMcpRegistryChanged and ensure
the flag is only set on successful listener registration (do not set it inside
the catch).


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 9dd4da85-3c95-412f-b74a-e01c113a4988

📥 Commits

Reviewing files that changed from the base of the PR and between 676d3f5 and 77621e5.

📒 Files selected for processing (10)
  • src-tauri/mcp.example.json
  • src-tauri/src/modules/mcp/native.rs
  • src-tauri/src/modules/mcp/registry.rs
  • src-tauri/src/modules/mcp/service.rs
  • src-tauri/tests/mcp_tools.rs
  • src/main.tsx
  • src/modules/mcp/components/McpToolsPanel.tsx
  • src/modules/toolengine/components/ToolEnginePanel.tsx
  • src/shared/mcpEvents.ts
  • src/shared/useRegistryChanged.ts

Comment thread: src/shared/mcpEvents.ts

- Updated Tool Engine panel to ensure data is reloaded after installation and uninstallation failures.
- Enhanced error handling to provide clearer feedback on action failures.
- Refactored MCP service to include proper locking mechanisms when accessing installed tool IDs, improving thread safety.
- Added logging for failed configuration saves to aid in debugging.

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (2)
src/shared/mcpEvents.ts (1)

33-36: Consider adding dev-time logging for bridge setup failures.

Right now failures are fully silent. A small debug log would help distinguish “expected non-Tauri runtime” from real bridge setup regressions during development.

🔧 Minimal refactor
-    .catch(() => {
+    .catch((error) => {
       tauriBridgeListenPromise = null;
-      // Not running inside Tauri shell — no bridge needed.
+      // Not running inside Tauri shell — no bridge needed.
+      if (import.meta.env.DEV) {
+        console.debug("[mcpEvents] Tauri registry bridge init skipped/failed", error);
+      }
     });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/shared/mcpEvents.ts` around lines 33 - 36, The catch on the tauri bridge
setup currently swallows all errors; update the catch handler for the promise
that assigns tauriBridgeListenPromise to also log a debug-level message with the
caught error and a short context string (e.g., "tauri bridge listen setup failed
— likely non-Tauri runtime") so developers can distinguish expected non-Tauri
runs from real regressions; ensure the log is dev-only (use a NODE_ENV /
import.meta.env.MODE check or console.debug) and keep the existing behavior of
setting tauriBridgeListenPromise = null in that catch block.
src-tauri/src/modules/mcp/native.rs (1)

93-107: Tighten manage_tools schema to enforce tool_id per action.

Right now only action is required structurally, so invalid install/uninstall calls are accepted by schema and fail later at runtime. Encode action-specific requirements in JSON schema (oneOf + per-action required).

Suggested schema shape
 input_schema: json!({
   "type": "object",
-  "required": ["action"],
   "properties": {
     "action": {
       "type": "string",
       "enum": ["list", "install", "uninstall"],
       "description": "The operation to perform"
     },
     "tool_id": {
       "type": "string",
       "description": "Tool identifier (required for install/uninstall, e.g. 'pengine/file-manager')"
     }
-  }
+  },
+  "oneOf": [
+    {
+      "required": ["action"],
+      "properties": { "action": { "const": "list" } }
+    },
+    {
+      "required": ["action", "tool_id"],
+      "properties": { "action": { "const": "install" } }
+    },
+    {
+      "required": ["action", "tool_id"],
+      "properties": { "action": { "const": "uninstall" } }
+    }
+  ]
 })
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/modules/mcp/native.rs` around lines 93 - 107, The input_schema
for the manage_tools command currently only requires "action" so
install/uninstall requests missing tool_id pass validation; update the JSON
schema assigned to input_schema (in the manage_tools handler) to use oneOf with
three subschemas: one for {"action":"list"} with no tool_id required, one for
{"action":"install"} that adds "tool_id" in required, and one for
{"action":"uninstall"} that also requires "tool_id"; keep the existing
"properties" and enum for action but split into per-action subschemas so
install/uninstall calls fail schema validation if tool_id is omitted.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src-tauri/src/modules/mcp/native.rs`:
- Around line 230-235: The current code swallows errors from
mcp_service::rebuild_registry_into_state(state).await by logging and returning
Ok(()), so change the flow to propagate the failure: after the emit_log await,
return Err(e) (or map e into the function's Result error type) instead of
continuing to Ok(()); update the match/if-let to preserve the original error
variable (e) and return it (or use .map_err(|e| { state.emit_log(...).await; e
}) on the future) so callers of this function see the rebuild failure rather
than a spurious success.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 68f3a162-66d5-49b0-b453-1908fd112e42

📥 Commits

Reviewing files that changed from the base of the PR and between 77621e5 and e28d957.

📒 Files selected for processing (4)
  • src-tauri/src/modules/mcp/native.rs
  • src-tauri/src/modules/mcp/service.rs
  • src/modules/toolengine/components/ToolEnginePanel.tsx
  • src/shared/mcpEvents.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/modules/toolengine/components/ToolEnginePanel.tsx
  • src-tauri/src/modules/mcp/service.rs

Comment thread: src-tauri/src/modules/mcp/native.rs

- Revised descriptions in the tool manager to clarify user actions and tool management processes.
- Enhanced error handling in the `run_tool_mutation` function to ensure proper error propagation.
- Removed unnecessary comments in the Tauri registry bridge initialization for cleaner code.

@coderabbitai coderabbitai Bot left a comment


🧹 Nitpick comments (2)
src/shared/mcpEvents.ts (1)

33-35: Don’t silently swallow bridge init failures.

The catch path resets state, but it hides why initialization failed. Adding minimal logging/telemetry here will make startup regressions much easier to diagnose.

Suggested adjustment
-    .catch(() => {
+    .catch((error) => {
       tauriBridgeListenPromise = null;
+      console.warn("Failed to initialize Tauri registry bridge", error);
     });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/shared/mcpEvents.ts` around lines 33 - 35, The catch handler for the
tauriBridgeListenPromise currently swallows errors; update the catch on
tauriBridgeListenPromise to accept the error (e.g., catch((err) => {...})), log
the failure (use the existing logger/telemetry utility in this module—e.g.,
processLogger.error or sendTelemetry) with a clear message including the
error/details, then keep the existing state reset (tauriBridgeListenPromise =
null); reference tauriBridgeListenPromise and the existing catch block when
making the change.
src-tauri/src/modules/mcp/native.rs (1)

146-171: Prefer structured list output over formatted prose.

install/uninstall depend on an exact tool_id, and this bullet-list format makes downstream parsing brittle. Returning JSON here would make the tool chain more reliable and keep presentation concerns in the caller.

♻️ Possible refactor
 async fn handle_list_tools(state: &AppState) -> Result<String, String> {
     let catalog = tool_engine_service::load_catalog()?;
     let installed = {
         let _cfg_guard = state.mcp_config_mutex.lock().await;
         tool_engine_service::installed_tool_ids(&state.mcp_config_path)
     };
     let installed_set: HashSet<&str> = installed.iter().map(|s| s.as_str()).collect();

-    let mut lines = Vec::new();
-    for tool in &catalog.tools {
-        let status = if installed_set.contains(tool.id.as_str()) {
-            "installed"
-        } else {
-            "not installed"
-        };
-        lines.push(format!(
-            "- {} (id: {}, v{}): {} [{}]",
-            tool.name, tool.id, tool.version, tool.description, status
-        ));
-    }
-
-    if lines.is_empty() {
-        Ok("No tools available in the catalog.".to_string())
-    } else {
-        Ok(format!("Available tools:\n{}", lines.join("\n")))
-    }
+    let tools: Vec<Value> = catalog
+        .tools
+        .iter()
+        .map(|tool| {
+            json!({
+                "id": tool.id,
+                "name": tool.name,
+                "version": tool.version,
+                "description": tool.description,
+                "installed": installed_set.contains(tool.id.as_str()),
+            })
+        })
+        .collect();
+
+    Ok(json!({ "tools": tools }).to_string())
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/modules/mcp/native.rs` around lines 146 - 171,
handle_list_tools currently builds a human-readable bullet list string which is
brittle for downstream parsing; change it to return a JSON array string where
each item is an object with fields like id, name, version, description, and
status. Use tool_engine_service::load_catalog() and
tool_engine_service::installed_tool_ids(&state.mcp_config_path) as before
(keeping the mcp_config_mutex lock around the config access), build a Vec of
structs or maps representing each tool and its computed status ("installed" /
"not_installed"), then serialize that Vec to a JSON string via
serde_json::to_string(...) and return it as Ok(json_string). Ensure errors from
load_catalog and installed_tool_ids propagate as before.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 8b114fdf-a511-43bf-9885-766990f3ffaf

📥 Commits

Reviewing files that changed from the base of the PR and between e28d957 and 9c92448.

📒 Files selected for processing (2)
  • src-tauri/src/modules/mcp/native.rs
  • src/shared/mcpEvents.ts

@maximedogawa maximedogawa merged commit 2be8b73 into main Apr 12, 2026
1 check passed
@maximedogawa maximedogawa deleted the 18-feature-docker-management-via-messaging branch April 12, 2026 11:30


Development

Successfully merging this pull request may close these issues.

[Feature] Docker management via messaging
