Implement as langchain codeinterpreter sandbox provider #318
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
base: main
---

**`integrations/langchain-agentcube/README.md`** (new file, +73 lines)
# LangChain + AgentCube sandbox

Wire AgentCube **Code Interpreter** to [LangChain Deep Agents](https://docs.langchain.com/oss/python/deepagents/sandboxes) as the `backend`.

**Prerequisites**: AgentCube cluster with a `CodeInterpreter` CR deployed; see [getting-started](../../docs/getting-started.md). Local **Python >= 3.11**.

---

## 1. Install

From the repository root:
```bash
pip install -e ./sdk-python
pip install -e ./integrations/langchain-agentcube
pip install langchain-openai      # DeepSeek and OpenAI-compatible APIs
# pip install langchain-anthropic # only if you use Anthropic
```

---
## 2. Cluster environment variables

After `kubectl port-forward` to `workloadmanager` and `agentcube-router`:
```bash
export WORKLOAD_MANAGER_URL="http://localhost:8080"
export ROUTER_URL="http://localhost:8081"
export AGENTCUBE_NAMESPACE="default"  # same namespace as the CodeInterpreter CR
# export API_TOKEN="..."              # if your cluster requires it
# export CODE_INTERPRETER_NAME=my-ci  # optional; default my-interpreter
```
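The URLs above assume local port-forwards along these lines; the service names, ports, and namespace here are assumptions about a typical deployment, so adjust them to your cluster:

```shell
# Forward the two control-plane services to the ports used above.
kubectl port-forward svc/workloadmanager 8080:8080 -n default &
kubectl port-forward svc/agentcube-router 8081:8081 -n default &
```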
---
## 3. LLM API keys

**DeepSeek** (OpenAI-compatible; `pip install langchain-openai`):

```bash
export DEEPSEEK_API_KEY="sk-..."
# Optional: DEEPSEEK_API_BASE (default https://api.deepseek.com/v1), DEEPSEEK_MODEL (default deepseek-chat)
```

**Anthropic Claude** (`pip install langchain-anthropic`):

```bash
export ANTHROPIC_API_KEY="sk-ant-..."
# Optional: ANTHROPIC_MODEL (default claude-haiku-4-5-20251001)
```

**OpenAI GPT** (same `langchain-openai` package):

```bash
export OPENAI_API_KEY="sk-..."
# Optional: OPENAI_MODEL (default gpt-4o-mini); set OPENAI_API_BASE for proxies or compatible gateways
```
If several keys are set, the example script prefers **DeepSeek, then Claude, then GPT**. For DeepSeek you can instead set only `OPENAI_API_KEY` plus `OPENAI_API_BASE=https://api.deepseek.com/v1` and `OPENAI_MODEL=deepseek-chat`, without `DEEPSEEK_API_KEY`.
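That OpenAI-compatible route, concretely (the key value is a placeholder):

```shell
# Route the OpenAI client at DeepSeek's OpenAI-compatible endpoint.
export OPENAI_API_KEY="sk-..."
export OPENAI_API_BASE="https://api.deepseek.com/v1"
export OPENAI_MODEL="deepseek-chat"
```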
---

## 4. Run the example

```bash
python integrations/langchain-agentcube/example/deep_agent_sandbox.py
```
Flags: `--interpreter`, `--namespace`, `--prompt`.
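For instance, overriding all three flags (the interpreter name and prompt here are illustrative):

```shell
python integrations/langchain-agentcube/example/deep_agent_sandbox.py \
  --interpreter my-ci \
  --namespace default \
  --prompt "Compute 2**10 in Python and print the result"
```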
---

Everything below uses **agentcube-sdk** plus the Router / Workload Manager.
---

**`integrations/langchain-agentcube/example/deep_agent_sandbox.py`** (new file, +109 lines)
```python
#!/usr/bin/env python3
# Copyright The Volcano Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

"""Example: create_deep_agent with AgentcubeSandbox. See ../README.md."""

from __future__ import annotations

import argparse
import os
import sys
from pathlib import Path
from typing import Any


def _require_env(name: str) -> str:
    v = os.environ.get(name)
    if not v:
        print(f"Error: environment variable {name} is not set", file=sys.stderr)
        sys.exit(1)
    return v


def _make_chat_model() -> object:
    if os.environ.get("DEEPSEEK_API_KEY"):
        from langchain_openai import ChatOpenAI

        return ChatOpenAI(
            model=os.environ.get("DEEPSEEK_MODEL", "deepseek-chat"),
            api_key=os.environ["DEEPSEEK_API_KEY"],
            base_url=os.environ.get("DEEPSEEK_API_BASE", "https://api.deepseek.com/v1"),
            temperature=0,
        )
    if os.environ.get("ANTHROPIC_API_KEY"):
        from langchain_anthropic import ChatAnthropic

        return ChatAnthropic(
            model=os.environ.get("ANTHROPIC_MODEL", "claude-haiku-4-5-20251001"),
            temperature=0,
        )
    if os.environ.get("OPENAI_API_KEY"):
        from langchain_openai import ChatOpenAI

        kwargs: dict[str, Any] = {
            "model": os.environ.get("OPENAI_MODEL", "gpt-4o-mini"),
            "api_key": os.environ["OPENAI_API_KEY"],
            "temperature": 0,
        }
        if os.environ.get("OPENAI_API_BASE"):
            kwargs["base_url"] = os.environ["OPENAI_API_BASE"]
        return ChatOpenAI(**kwargs)
    print(
        "Error: set DEEPSEEK_API_KEY, ANTHROPIC_API_KEY, or OPENAI_API_KEY.",
        file=sys.stderr,
    )
    sys.exit(1)


def main() -> None:
    parser = argparse.ArgumentParser(description="Deep Agents + AgentcubeSandbox")
    parser.add_argument(
        "--interpreter",
        default=os.environ.get("CODE_INTERPRETER_NAME", "my-interpreter"),
    )
    parser.add_argument(
        "--namespace",
        default=os.environ.get("AGENTCUBE_NAMESPACE", "default"),
    )
    parser.add_argument(
        "--prompt",
        default="Write and run a Python script to print 'Hello, World!'",
    )
    args = parser.parse_args()

    _require_env("WORKLOAD_MANAGER_URL")
    _require_env("ROUTER_URL")
    sys.path.insert(0, str(Path(__file__).resolve().parents[3] / "sdk-python"))

    from agentcube import CodeInterpreterClient
    from deepagents import create_deep_agent
    from langchain_agentcube import AgentcubeSandbox

    model = _make_chat_model()
    with CodeInterpreterClient(
        name=args.interpreter,
        namespace=args.namespace,
        router_url=os.environ["ROUTER_URL"],
        workload_manager_url=os.environ["WORKLOAD_MANAGER_URL"],
        auth_token=os.environ.get("API_TOKEN"),
    ) as client:
        agent = create_deep_agent(
            model=model,
            system_prompt="You are a coding assistant with sandbox access.",
            backend=AgentcubeSandbox(client=client),
        )
        try:
            result = agent.invoke(
                {"messages": [{"role": "user", "content": args.prompt}]},
            )
        except Exception as e:
            print(f"Error: {e}", file=sys.stderr)
            sys.exit(1)
        print(result)


if __name__ == "__main__":
    main()
```
---

**`integrations/langchain-agentcube/langchain_agentcube/__init__.py`** (new file, +8 lines)
```python
# Copyright The Volcano Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

from langchain_agentcube.sandbox import AgentcubeSandbox

__all__ = ["AgentcubeSandbox"]
```
**Member:** I am thinking about contributing this to the upstream LangChain organization, if possible.

**Contributor (author):** About what? The AgentCube Code Interpreter?

**Member:** This implementation; you can see AWS AgentCore is there.
---

**`integrations/langchain-agentcube/langchain_agentcube/sandbox.py`** (new file, +117 lines)
```python
# Copyright The Volcano Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

"""AgentCube Code Interpreter as a Deep Agents ``BaseSandbox`` backend."""

from __future__ import annotations

import os
import tempfile
from typing import TYPE_CHECKING

from deepagents.backends.protocol import (
    ExecuteResponse,
    FileDownloadResponse,
    FileUploadResponse,
)
from deepagents.backends.sandbox import BaseSandbox

if TYPE_CHECKING:
    from agentcube import CodeInterpreterClient


def _normalize_remote_path(path: str) -> str:
    """Map paths from Deep Agents (often absolute) to session workspace-relative paths."""
    return path.replace("\\", "/").strip().lstrip("/")


class AgentcubeSandbox(BaseSandbox):
    """Wraps :class:`~agentcube.CodeInterpreterClient` for ``create_deep_agent(..., backend=...)``."""

    def __init__(
        self,
        *,
        client: CodeInterpreterClient,
        default_timeout: int | None = 30 * 60,
    ) -> None:
        self._client = client
        self._default_timeout = default_timeout

    @property
    def id(self) -> str:
        sid = self._client.session_id
        return sid if sid else "agentcube-unknown"

    def execute(
        self,
        command: str,
        *,
        timeout: int | None = None,
    ) -> ExecuteResponse:
        eff = timeout if timeout is not None else self._default_timeout
        to = float(eff) if eff is not None else None
        r = self._client.execute_command_result(command, timeout=to)
        out = r.get("stdout") or ""
        stderr = (r.get("stderr") or "").strip()
        if stderr:
            out += f"\n<stderr>{stderr}</stderr>"
        return ExecuteResponse(
            output=out,
            exit_code=int(r.get("exit_code", -1)),
            truncated=False,
        )

    def upload_files(self, files: list[tuple[str, bytes]]) -> list[FileUploadResponse]:
        responses: list[FileUploadResponse] = []
        for path, content in files:
            rel = _normalize_remote_path(path)
            if not rel:
                responses.append(FileUploadResponse(path=path, error="invalid_path"))
                continue
            tmp_path: str | None = None
            try:
                fd, tmp_path = tempfile.mkstemp(prefix="agentcube-upload-", suffix=".bin")
                with os.fdopen(fd, "wb") as f:
                    f.write(content)
                self._client.upload_file(tmp_path, rel)
                responses.append(FileUploadResponse(path=path, error=None))
            except Exception as e:
                responses.append(FileUploadResponse(path=path, error=str(e)))
            finally:
                if tmp_path:
                    try:
                        os.unlink(tmp_path)
                    except OSError:
                        pass
        return responses

    def download_files(self, paths: list[str]) -> list[FileDownloadResponse]:
        responses: list[FileDownloadResponse] = []
        for path in paths:
            rel = _normalize_remote_path(path)
            if not rel:
                responses.append(
                    FileDownloadResponse(path=path, content=None, error="invalid_path")
                )
                continue
            fd, tmp_path = tempfile.mkstemp(prefix="agentcube-dl-", suffix=".bin")
            os.close(fd)
            try:
                try:
                    self._client.download_file(rel, tmp_path)
                except Exception as e:  # noqa: BLE001
                    responses.append(
                        FileDownloadResponse(path=path, content=None, error=str(e))
                    )
                    continue
                with open(tmp_path, "rb") as f:
                    data = f.read()
                responses.append(FileDownloadResponse(path=path, content=data, error=None))
            finally:
                try:
                    os.unlink(tmp_path)
                except OSError:
                    pass
        return responses
```

**Contributor** (review comment on the stderr append in `execute`): Appending stderr with a leading newline can result in unexpected output formatting if …
---

**`integrations/langchain-agentcube/pyproject.toml`** (new file, +24 lines)
```toml
# Copyright The Volcano Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "langchain-agentcube"
version = "0.1.0"
description = "LangChain Deep Agents sandbox backend for AgentCube Code Interpreter"
readme = "README.md"
license = "Apache-2.0"
requires-python = ">=3.11"
dependencies = [
    "deepagents>=0.5.0,<0.6",
    "agentcube-sdk>=0.1.0",
]

[tool.setuptools.packages.find]
where = ["."]
include = ["langchain_agentcube*"]
```
**Reviewer:** Hardcoding a specific dated model version like `claude-haiku-4-5-20251001` as a default can lead to maintenance issues when the model is deprecated or updated. Consider using a more stable alias (e.g., `claude-3-5-haiku-latest`) or requiring the user to provide the model via environment variables without a hardcoded default.