From 46be044e82f88fac18da8438adf20f6510add89b Mon Sep 17 00:00:00 2001
From: openhands
Date: Wed, 4 Feb 2026 16:01:15 +0000
Subject: [PATCH] Add AWS Bedrock configuration guide for SDK users

- Add openhands/usage/llms/bedrock-llms.mdx for self-hosted OpenHands
- Add sdk/guides/llm-provider-bedrock.mdx for SDK users
- Add new 'LLM Providers' group in SDK navigation
- Update docs.json with navigation entries
- Update llms.mdx provider list
- Update sdk/arch/llm.mdx providers table

Co-authored-by: openhands
---
 docs.json                             |   7 ++
 openhands/usage/llms/bedrock-llms.mdx |  75 ++++++++++++
 openhands/usage/llms/llms.mdx         |   1 +
 sdk/arch/llm.mdx                      |   1 +
 sdk/guides/llm-provider-bedrock.mdx   | 169 ++++++++++++++++++++++++++
 5 files changed, 253 insertions(+)
 create mode 100644 openhands/usage/llms/bedrock-llms.mdx
 create mode 100644 sdk/guides/llm-provider-bedrock.mdx

diff --git a/docs.json b/docs.json
index 05a29914..f1a7c900 100644
--- a/docs.json
+++ b/docs.json
@@ -138,6 +138,7 @@
         "pages": [
           "openhands/usage/llms/openhands-llms",
           "openhands/usage/llms/azure-llms",
+          "openhands/usage/llms/bedrock-llms",
           "openhands/usage/llms/google-llms",
           "openhands/usage/llms/groq",
           "openhands/usage/llms/local-llms",
@@ -293,6 +294,12 @@
         "sdk/guides/llm-error-handling"
       ]
     },
+    {
+      "group": "LLM Providers",
+      "pages": [
+        "sdk/guides/llm-provider-bedrock"
+      ]
+    },
     {
       "group": "Agent Features",
       "pages": [
diff --git a/openhands/usage/llms/bedrock-llms.mdx b/openhands/usage/llms/bedrock-llms.mdx
new file mode 100644
index 00000000..a9ba5ca6
--- /dev/null
+++ b/openhands/usage/llms/bedrock-llms.mdx
@@ -0,0 +1,75 @@
+---
+title: AWS Bedrock
+description: OpenHands uses LiteLLM to make calls to AWS Bedrock models. See LiteLLM's documentation on [using AWS Bedrock as a provider](https://docs.litellm.ai/docs/providers/bedrock).
+---
+
+## Prerequisites
+
+AWS Bedrock requires the `boto3` library to be installed. It is used internally by LiteLLM; you don't need to import it in your code.
+
+```bash
+pip install "boto3>=1.28.57"
+```
+
+## AWS Bedrock Configuration
+
+When running OpenHands, set your AWS credentials as environment variables with `-e` in the `docker run` command.
+
+### Authentication Options
+
+| Method | Environment Variables |
+|--------|----------------------|
+| Access Keys | `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` |
+| Session Token | `AWS_SESSION_TOKEN` (in addition to access keys) |
+| AWS Profile | `AWS_PROFILE_NAME` |
+| IAM Role | `AWS_ROLE_NAME`, `AWS_WEB_IDENTITY_TOKEN` |
+| Bedrock API Key | `AWS_BEARER_TOKEN_BEDROCK` |
+
+### Example with Access Keys
+
+```bash
+docker run -it --pull=always \
+    -e AWS_ACCESS_KEY_ID="your-access-key" \
+    -e AWS_SECRET_ACCESS_KEY="your-secret-key" \
+    -e AWS_REGION_NAME="us-east-1" \
+    ...
+```
+
+Then in the OpenHands UI Settings under the `LLM` tab:
+
+1. Enable `Advanced` options.
+2. Set `Custom Model` to `bedrock/<model-id>` (e.g., `bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0`).
+
+## Model Names
+
+Use the `bedrock/` prefix followed by the Bedrock model ID:
+
+| Model | Model ID |
+|-------|----------|
+| Claude 3.5 Sonnet v2 | `bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0` |
+| Claude 3 Opus | `bedrock/anthropic.claude-3-opus-20240229-v1:0` |
+| Claude 3 Haiku | `bedrock/anthropic.claude-3-haiku-20240307-v1:0` |
+| Claude 3.5 Haiku | `bedrock/anthropic.claude-3-5-haiku-20241022-v1:0` |
+
+You can find the full list of available Bedrock model IDs in the [AWS Bedrock documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html) or in the AWS Console under Bedrock > Model access.
+
+## Cross-Region Inference
+
+AWS Bedrock supports cross-region inference for improved availability. To use it, add the region prefix (e.g., `us.`) to the model ID:
+
+```
+bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0
+```
+
+## Troubleshooting
+
+### Common Issues
+
+1. **Access Denied**: Ensure your AWS credentials have the necessary permissions for Bedrock. You need the `bedrock:InvokeModel` permission.
+
+2. **Model Not Found**: Verify that the model is enabled in your AWS account. Go to AWS Console > Bedrock > Model access to enable models.
+
+3. **Region Issues**: Make sure `AWS_REGION_NAME` is set to a region where Bedrock is available and where you have model access enabled.
diff --git a/openhands/usage/llms/llms.mdx b/openhands/usage/llms/llms.mdx
index 51263fc1..f866e75a 100644
--- a/openhands/usage/llms/llms.mdx
+++ b/openhands/usage/llms/llms.mdx
@@ -78,6 +78,7 @@ as environment variables (or add them to your `config.toml`) so the SDK picks th
 
 We have a few guides for running OpenHands with specific model providers:
 
+- [AWS Bedrock](/openhands/usage/llms/bedrock-llms)
 - [Azure](/openhands/usage/llms/azure-llms)
 - [Google](/openhands/usage/llms/google-llms)
 - [Groq](/openhands/usage/llms/groq)
diff --git a/sdk/arch/llm.mdx b/sdk/arch/llm.mdx
index 66feaf46..3fae82f4 100644
--- a/sdk/arch/llm.mdx
+++ b/sdk/arch/llm.mdx
@@ -294,6 +294,7 @@ verbatim to SDK applications because both layers wrap the same
 | OpenHands hosted models | [/openhands/usage/llms/openhands-llms](/openhands/usage/llms/openhands-llms) |
 | OpenAI | [/openhands/usage/llms/openai-llms](/openhands/usage/llms/openai-llms) |
 | Azure OpenAI | [/openhands/usage/llms/azure-llms](/openhands/usage/llms/azure-llms) |
+| AWS Bedrock | [/openhands/usage/llms/bedrock-llms](/openhands/usage/llms/bedrock-llms) |
 | Google Gemini / Vertex | [/openhands/usage/llms/google-llms](/openhands/usage/llms/google-llms) |
 | Groq | [/openhands/usage/llms/groq](/openhands/usage/llms/groq) |
 | OpenRouter | [/openhands/usage/llms/openrouter](/openhands/usage/llms/openrouter) |
diff --git a/sdk/guides/llm-provider-bedrock.mdx b/sdk/guides/llm-provider-bedrock.mdx
new file mode 100644
index 00000000..a71fa32a
--- /dev/null
+++ b/sdk/guides/llm-provider-bedrock.mdx
@@ -0,0 +1,169 @@
+---
+title: AWS Bedrock
+description: Configure the SDK to use Claude and other models via AWS Bedrock.
+---
+
+> A ready-to-run example is available [here](#ready-to-run-example)!
+
+Use AWS Bedrock to access Claude and other foundation models through your AWS account.
+
+## Prerequisites
+
+AWS Bedrock requires the `boto3` library:
+
+```bash
+pip install openhands-sdk "boto3>=1.28.57"
+```
+
+`boto3` is used internally by LiteLLM; you don't need to import it in your code.
+
+## Authentication Options
+
+Configure AWS credentials using environment variables:
+
+| Method | Environment Variables |
+|--------|----------------------|
+| Access Keys | `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` |
+| Session Token | `AWS_SESSION_TOKEN` (in addition to access keys) |
+| AWS Profile | `AWS_PROFILE_NAME` |
+| IAM Role | `AWS_ROLE_NAME`, `AWS_WEB_IDENTITY_TOKEN` |
+| Bedrock API Key | `AWS_BEARER_TOKEN_BEDROCK` |
+
+You must also set the AWS region:
+
+```bash
+export AWS_REGION_NAME="us-east-1"
+```
+
+## Model Names
+
+Use the `bedrock/` prefix followed by the Bedrock model ID:
+
+| Model | Model ID |
+|-------|----------|
+| Claude 3.5 Sonnet v2 | `bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0` |
+| Claude 3 Opus | `bedrock/anthropic.claude-3-opus-20240229-v1:0` |
+| Claude 3 Haiku | `bedrock/anthropic.claude-3-haiku-20240307-v1:0` |
+| Claude 3.5 Haiku | `bedrock/anthropic.claude-3-5-haiku-20241022-v1:0` |
+
+## Basic Usage
+
+```python icon="python" focus={7}
+import os
+from openhands.sdk import LLM, Agent, Conversation
+from openhands.sdk.tool import Tool
+from openhands.tools.terminal import TerminalTool
+from openhands.tools.file_editor import FileEditorTool
+
+llm = LLM(model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0")
+
+agent = Agent(
+    llm=llm,
+    tools=[
+        Tool(name=TerminalTool.name),
+        Tool(name=FileEditorTool.name),
+    ],
+)
+
+conversation = Conversation(agent=agent, workspace=os.getcwd())
+conversation.send_message("List the files in this directory")
+conversation.run()
+```
+
+## Ready-to-run Example
+
+Before running, ensure you have:
+
+1. Installed `boto3>=1.28.57`
+2. Set AWS credentials via environment variables
+3. Enabled the model in your AWS Bedrock console
+
+```python icon="python" expandable
+"""
+AWS Bedrock with Claude models.
+
+Prerequisites:
+    pip install openhands-sdk "boto3>=1.28.57"
+
+Environment variables:
+    AWS_ACCESS_KEY_ID     - Your AWS access key
+    AWS_SECRET_ACCESS_KEY - Your AWS secret key
+    AWS_REGION_NAME       - AWS region (e.g., us-east-1)
+"""
+import os
+
+from openhands.sdk import LLM, Agent, Conversation
+from openhands.sdk.tool import Tool
+from openhands.tools.terminal import TerminalTool
+from openhands.tools.file_editor import FileEditorTool
+
+# AWS credentials are read from environment variables by LiteLLM/boto3:
+# AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME
+
+llm = LLM(
+    model=os.getenv("LLM_MODEL", "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"),
+)
+
+agent = Agent(
+    llm=llm,
+    tools=[
+        Tool(name=TerminalTool.name),
+        Tool(name=FileEditorTool.name),
+    ],
+)
+
+conversation = Conversation(agent=agent, workspace=os.getcwd())
+conversation.send_message("List the files in this directory and summarize what you see.")
+conversation.run()
+
+# Report accumulated cost
+cost = llm.metrics.accumulated_cost
+print(f"EXAMPLE_COST: {cost}")
+```
+
+### Running the Example
+
+```bash
+# Set AWS credentials
+export AWS_ACCESS_KEY_ID="your-access-key"
+export AWS_SECRET_ACCESS_KEY="your-secret-key"
+export AWS_REGION_NAME="us-east-1"
+
+# Run the example
+python bedrock_example.py
+```
+
+## Cross-Region Inference
+
+AWS Bedrock supports cross-region inference. Include the region prefix in your model ID:
+
+```python
+llm = LLM(model="bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0")
+```
+
+## Troubleshooting
+
+### Access Denied
+
+Ensure your AWS credentials have the `bedrock:InvokeModel` permission.
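+
+As a starting point, an IAM policy granting the needed actions might look like the sketch below. This is an illustrative fragment, not an official policy: `bedrock:InvokeModelWithResponseStream` is also included because streaming requests use it, and in production you would scope `Resource` to specific model ARNs rather than `*`.
+
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Action": [
+        "bedrock:InvokeModel",
+        "bedrock:InvokeModelWithResponseStream"
+      ],
+      "Resource": "*"
+    }
+  ]
+}
+```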
+
+### Model Not Found
+
+Verify the model is enabled in your AWS account:
+
+1. Go to AWS Console > Bedrock > Model access
+2. Enable the models you want to use
+
+### Region Issues
+
+Ensure `AWS_REGION_NAME` is set to a region where:
+
+- Bedrock is available
+- You have model access enabled
+
+## Next Steps
+
+- **[LLM Registry](/sdk/guides/llm-registry)** - Manage multiple LLM providers
+- **[LLM Routing](/sdk/guides/llm-routing)** - Automatically route to different models
+- **[OpenHands Bedrock Guide](/openhands/usage/llms/bedrock-llms)** - Self-hosted OpenHands configuration