Description
Help us add the Amazon Bedrock provider to our selection of local providers!
Local providers are providers where users supply their own API keys to access models directly. There are also MCPJam-provided models that run on our own infrastructure; the OpenAI provider is a good example of a provider that can be both. This issue focuses only on local providers.
How to fix
Provider Information
- Provider Name: Amazon Bedrock
- AI SDK Package: `@ai-sdk/amazon-bedrock`
- Documentation: https://ai-sdk.dev/providers/ai-sdk-providers/amazon-bedrock
- Add Package Dependency
  - Add the AI SDK provider package to `server/package.json`
  - Run `npm install --legacy-peer-deps` in the `server/` directory
- Update Chat Helpers
  - Import the provider SDK in `server/utils/chat-helpers.ts`
  - Add a new case to the `createLlmModel` function switch statement
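The chat-helpers change amounts to a new branch in the provider switch. Below is a minimal, self-contained sketch: the `@ai-sdk/amazon-bedrock` factory is stubbed out so the example runs standalone, and the `createLlmModel` signature shown is a guess — check the repo (and the Mistral reference implementation) for the real shape.

```typescript
// Sketch only. In the real server/utils/chat-helpers.ts you would import:
//   import { createAmazonBedrock } from "@ai-sdk/amazon-bedrock";
// Here that factory is stubbed so the shape of the change is clear.

type LanguageModelStub = { provider: string; modelId: string };

// Stand-in for createAmazonBedrock, which in the real SDK takes AWS
// region/credential settings and returns a model factory.
function createAmazonBedrock(settings: { region?: string }) {
  return (modelId: string): LanguageModelStub => ({
    provider: "amazon-bedrock",
    modelId,
  });
}

// Hypothetical, simplified signature -- the actual createLlmModel in the
// repo takes different parameters (API keys, model objects, etc.).
function createLlmModel(provider: string, modelId: string): LanguageModelStub {
  switch (provider) {
    // ...existing provider cases...
    case "bedrock": {
      const bedrock = createAmazonBedrock({ region: "us-east-1" });
      return bedrock(modelId);
    }
    default:
      throw new Error(`Unsupported provider: ${provider}`);
  }
}
```

The key point is that each provider case constructs a provider instance from user-supplied credentials and returns a model handle the rest of the chat pipeline can use.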
- Update Type Definitions
  - Add the provider to the `ModelProvider` type in `shared/types.ts`
- Add Model Definitions
  - Add model IDs to the `Model` enum in `shared/types.ts`
  - Add models to the `SUPPORTED_MODELS` array in `shared/types.ts`
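Taken together, the `shared/types.ts` steps above might look like the sketch below. Every name here except the Bedrock model ID is an assumption about the file's structure (the real file defines many more providers and models, and its shapes may differ):

```typescript
// Hypothetical shapes for shared/types.ts. The Bedrock model ID shown is a
// real Amazon Bedrock identifier; everything else is illustrative.
type ModelProvider = "openai" | "anthropic" | "bedrock"; // "bedrock" added

enum Model {
  // ...existing models...
  CLAUDE_3_5_SONNET_BEDROCK = "anthropic.claude-3-5-sonnet-20240620-v1:0",
}

interface SupportedModel {
  id: Model;
  provider: ModelProvider;
}

const SUPPORTED_MODELS: SupportedModel[] = [
  // ...existing entries...
  { id: Model.CLAUDE_3_5_SONNET_BEDROCK, provider: "bedrock" },
];
```

Keeping the provider string, the enum member, and the `SUPPORTED_MODELS` entry in sync is what makes the new models appear in the UI's model selector.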
- Testing
  - Test model selection in the UI
  - Test chat functionality with the new provider
  - Verify API key validation works
  - Verify all listed models work correctly
Additional Configuration (if needed)
Some providers, such as Ollama or LiteLLM, require additional steps. Please check #653 for guidance.
Reference Implementation
See #701 for Mistral implementation as a reference example.
Relevant files changed:
- `server/package.json` - Added dependency
- `server/utils/chat-helpers.ts` - Added provider integration
- `shared/types.ts` - Added type definitions and models
First time contributing?
Please check out our CONTRIBUTIONS.md for instructions on how to set up as a contributor.