From 79f881b39fd24f6e9b56107e51204ef0ed0c2154 Mon Sep 17 00:00:00 2001
From: amankalra172
Date: Thu, 12 Feb 2026 19:17:27 +0100
Subject: [PATCH 1/2] docs: add STACKIT AI Model Serving provider documentation

- Add comprehensive setup instructions for STACKIT AI Model Serving
- Include authentication via STACKIT Portal
- Document available text and embedding models with specifications
- Add custom configuration example for opencode.json
- Note OpenAI-compatible API and rate limits
---
 packages/web/src/content/docs/providers.mdx | 88 +++++++++++++++++++++
 1 file changed, 88 insertions(+)

diff --git a/packages/web/src/content/docs/providers.mdx b/packages/web/src/content/docs/providers.mdx
index e7befcf026cf..d549728d1333 100644
--- a/packages/web/src/content/docs/providers.mdx
+++ b/packages/web/src/content/docs/providers.mdx
@@ -1478,6 +1478,94 @@ SAP AI Core provides access to 40+ models from OpenAI, Anthropic, Google, Amazon
 
 ---
 
+### STACKIT
+
+STACKIT AI Model Serving provides fully managed hosting environment for AI models, focusing on large language models (LLMs) like Llama, Mistral, and Qwen, with maximum data sovereignty on European infrastructure.
+
+1. Head over to [STACKIT Portal](https://portal.stackit.cloud), navigate to **AI Model Serving**, and create an auth token for your project.
+
+   :::tip
+   You need a STACKIT customer account, user account, and project before creating auth tokens.
+   :::
+
+2. Run the `/connect` command and search for **STACKIT**.
+
+   ```txt
+   /connect
+   ```
+
+3. Enter your STACKIT AI Model Serving auth token.
+
+   ```txt
+   ┌ API key
+   │
+   │
+   └ enter
+   ```
+
+4. Run the `/models` command to select from available models like _Qwen3-VL 235B_ or _Llama 3.3 70B_.
+
+   ```txt
+   /models
+   ```
+
+#### Available Models
+
+**Text Models:**
+
+- **Qwen3-VL 235B** - Vision-language model with 218K context, multimodal input
+- **Llama 3.3 70B** - General purpose LLM with 128K context, tool calling enabled
+- **GPT-OSS 120B** - Strong reasoning model with 131K context, tool calling enabled
+- **Gemma 3 27B** - Multimodal model with 37K context, 140+ languages
+- **Mistral-Nemo** - Multilingual LLM with 128K context, optimized for commercial use
+- **Llama 3.1 8B** - Efficient model with 128K context, tool calling enabled
+
+**Embedding Models:**
+
+- **E5 Mistral 7B** - Text embedding model (4096 dimensions)
+- **Qwen3 Vision-Language Embedding** - Multimodal embedding model for text and images
+
+:::note
+All models use OpenAI-compatible API endpoints. Rate limits apply: 200,000 TPM and 30-80 RPM for chat models, 600 RPM for embedding models.
+:::
+
+#### Custom Configuration
+
+You can also configure STACKIT manually in your `opencode.json`:
+
+```json title="opencode.json" "stackit" {5, 6, 8, 10-14}
+{
+  "$schema": "https://opencode.ai/config.json",
+  "provider": {
+    "stackit": {
+      "npm": "@ai-sdk/openai-compatible",
+      "name": "STACKIT AI Model Serving",
+      "options": {
+        "baseURL": "https://api.openai-compat.model-serving.eu01.onstackit.cloud/v1"
+      },
+      "models": {
+        "qwen3-vl-235b": {
+          "name": "Qwen3-VL 235B",
+          "limit": {
+            "context": 218000,
+            "output": 65536
+          }
+        },
+        "llama-33-70b": {
+          "name": "Llama 3.3 70B",
+          "limit": {
+            "context": 128000,
+            "output": 65536
+          }
+        }
+      }
+    }
+  }
+}
+```
+
+---
+
 ### OVHcloud AI Endpoints
 
 1. Head over to the [OVHcloud panel](https://ovh.com/manager). Navigate to the `Public Cloud` section, `AI & Machine Learning` > `AI Endpoints` and in `API Keys` tab, click **Create a new API key**.
From 619b86a9ee82004fa6f647ddb1dff6d647b5b9d9 Mon Sep 17 00:00:00 2001
From: amankalra172
Date: Thu, 12 Feb 2026 19:25:46 +0100
Subject: [PATCH 2/2] docs: reword intro and simplify STACKIT provider
 documentation
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

- Reword intro to describe a fully managed, sovereign hosting environment
- Remove detailed models and configuration sections for cleaner documentation
- Keep basic setup instructions for STACKIT AI Model Serving
---
 packages/web/src/content/docs/providers.mdx | 58 +--------------------
 1 file changed, 1 insertion(+), 57 deletions(-)

diff --git a/packages/web/src/content/docs/providers.mdx b/packages/web/src/content/docs/providers.mdx
index d549728d1333..7e0ee1a2dd7e 100644
--- a/packages/web/src/content/docs/providers.mdx
+++ b/packages/web/src/content/docs/providers.mdx
@@ -1480,7 +1480,7 @@ SAP AI Core provides access to 40+ models from OpenAI, Anthropic, Google, Amazon
 
 ### STACKIT
 
-STACKIT AI Model Serving provides fully managed hosting environment for AI models, focusing on large language models (LLMs) like Llama, Mistral, and Qwen, with maximum data sovereignty on European infrastructure.
+STACKIT AI Model Serving provides a fully managed, sovereign hosting environment for AI models, focusing on LLMs like Llama, Mistral, and Qwen, with maximum data sovereignty on European infrastructure.
 
 1. Head over to [STACKIT Portal](https://portal.stackit.cloud), navigate to **AI Model Serving**, and create an auth token for your project.
 
@@ -1508,62 +1508,6 @@ STACKIT AI Model Serving provides fully managed hosting environment for AI model
    ```txt
    /models
    ```
-
-#### Available Models
-
-**Text Models:**
-
-- **Qwen3-VL 235B** - Vision-language model with 218K context, multimodal input
-- **Llama 3.3 70B** - General purpose LLM with 128K context, tool calling enabled
-- **GPT-OSS 120B** - Strong reasoning model with 131K context, tool calling enabled
-- **Gemma 3 27B** - Multimodal model with 37K context, 140+ languages
-- **Mistral-Nemo** - Multilingual LLM with 128K context, optimized for commercial use
-- **Llama 3.1 8B** - Efficient model with 128K context, tool calling enabled
-
-**Embedding Models:**
-
-- **E5 Mistral 7B** - Text embedding model (4096 dimensions)
-- **Qwen3 Vision-Language Embedding** - Multimodal embedding model for text and images
-
-:::note
-All models use OpenAI-compatible API endpoints. Rate limits apply: 200,000 TPM and 30-80 RPM for chat models, 600 RPM for embedding models.
-:::
-
-#### Custom Configuration
-
-You can also configure STACKIT manually in your `opencode.json`:
-
-```json title="opencode.json" "stackit" {5, 6, 8, 10-14}
-{
-  "$schema": "https://opencode.ai/config.json",
-  "provider": {
-    "stackit": {
-      "npm": "@ai-sdk/openai-compatible",
-      "name": "STACKIT AI Model Serving",
-      "options": {
-        "baseURL": "https://api.openai-compat.model-serving.eu01.onstackit.cloud/v1"
-      },
-      "models": {
-        "qwen3-vl-235b": {
-          "name": "Qwen3-VL 235B",
-          "limit": {
-            "context": 218000,
-            "output": 65536
-          }
-        },
-        "llama-33-70b": {
-          "name": "Llama 3.3 70B",
-          "limit": {
-            "context": 128000,
-            "output": 65536
-          }
-        }
-      }
-    }
-  }
-}
-```
-
 ---
 
 ### OVHcloud AI Endpoints
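
Reviewer note (not part of the patches): the docs above point opencode at an OpenAI-compatible endpoint, so the wiring can be sanity-checked without opencode itself. A minimal Python sketch under stated assumptions — the `baseURL` and the `llama-33-70b` model id are taken from the `opencode.json` example in patch 1, the auth token is a placeholder, and no network call is made (it only builds the request):

```python
import json

# Endpoint documented in the patch; the token is a placeholder, not a credential.
BASE_URL = "https://api.openai-compat.model-serving.eu01.onstackit.cloud/v1"
AUTH_TOKEN = "YOUR_STACKIT_AUTH_TOKEN"


def chat_completion_request(model: str, prompt: str, max_tokens: int = 256):
    """Build URL, headers, and JSON body for an OpenAI-compatible
    POST /chat/completions call against the STACKIT endpoint."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {AUTH_TOKEN}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return url, headers, json.dumps(body)


url, headers, payload = chat_completion_request("llama-33-70b", "Hello!")
print(url)
```

Any OpenAI-style client works the same way; only the base URL and bearer token differ from a stock OpenAI setup.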