## Problem
Today the router requires an exact protocol match between the client SDK and the backend route. If an agent uses the OpenAI SDK (`POST /v1/chat/completions`), only routes with `openai_chat_completions` in their `protocols` list are eligible. Similarly, an Anthropic SDK client can only hit `anthropic_messages` routes.
This means:
- Route operators must provision backends matching every SDK their agents might use
- Agents are constrained to SDKs that match available backends
- Adding a new backend provider requires all agents to adopt its SDK, or the operator must run parallel routes
## Proposed Solution
Add an API translation layer in the router so that a request detected as one protocol (e.g., `openai_chat_completions`) can be transparently translated and forwarded to a route speaking a different protocol (e.g., `anthropic_messages`), and the response translated back.
## Key design considerations
- Route selection: When no exact-match route exists for the detected protocol, the router should attempt translation to a compatible protocol that does have a route
- Request translation: Convert the request body schema between protocols (e.g., OpenAI `messages` array → Anthropic `messages` format, `max_tokens` semantics, tool/function calling schemas)
- Response translation: Convert the response body back to the client's expected schema (e.g., Anthropic response → OpenAI `ChatCompletion` shape)
- Streaming: Handle SSE streaming translation (`data: [DONE]` vs Anthropic event types)
- Feature parity: Some features don't map 1:1 (e.g., OpenAI `logprobs`, Anthropic `thinking`). Define a strategy for unsupported features (error, best-effort, drop silently)
- Header translation: Already partially exists (auth header rewriting per protocol). Extend to cover other protocol-specific headers (e.g., `anthropic-version`)
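The route-selection fallback could be sketched as follows. This is a hypothetical illustration, not the router's real code: the `Protocol`, `Route`, `translatable`, and `select_route` names and shapes are all invented here.

```rust
// Hypothetical sketch: prefer an exact protocol match, otherwise fall back
// to a route whose protocol we know how to translate into.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Protocol {
    OpenaiChatCompletions,
    AnthropicMessages,
}

struct Route {
    name: &'static str,
    protocols: Vec<Protocol>,
}

/// Pairs the translation layer supports; initially just the one pair.
fn translatable(from: Protocol, to: Protocol) -> bool {
    matches!(
        (from, to),
        (Protocol::OpenaiChatCompletions, Protocol::AnthropicMessages)
            | (Protocol::AnthropicMessages, Protocol::OpenaiChatCompletions)
    )
}

/// Returns the selected route and, when translation is needed, the backend
/// protocol the request must be translated into (None = exact match).
fn select_route(detected: Protocol, routes: &[Route]) -> Option<(&Route, Option<Protocol>)> {
    // Exact match wins: no translation overhead or fidelity loss.
    if let Some(r) = routes.iter().find(|r| r.protocols.contains(&detected)) {
        return Some((r, None));
    }
    // Otherwise, any route speaking a translatable protocol is eligible.
    for r in routes {
        if let Some(&p) = r.protocols.iter().find(|&&p| translatable(detected, p)) {
            return Some((r, Some(p)));
        }
    }
    None
}

fn main() {
    // An OpenAI-SDK request with only an Anthropic route available.
    let routes = vec![Route { name: "claude", protocols: vec![Protocol::AnthropicMessages] }];
    let (route, translate_to) = select_route(Protocol::OpenaiChatCompletions, &routes).unwrap();
    assert_eq!(route.name, "claude");
    assert_eq!(translate_to, Some(Protocol::AnthropicMessages));
    println!("route={}, translate_to={:?}", route.name, translate_to);
}
```

Returning the target protocol alongside the route lets the proxy layer decide whether to invoke the translator without re-deriving the decision.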
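The request-translation bullet can be made concrete with a minimal typed sketch of the OpenAI → Anthropic mapping. In the router this would operate on JSON values; the structs here are simplified stand-ins (string content only, no tools), and all names are illustrative.

```rust
// Simplified stand-ins for the two wire formats.
#[derive(Debug, Clone, PartialEq)]
struct Message {
    role: String,
    content: String,
}

struct OpenAiChatRequest {
    model: String,
    messages: Vec<Message>,
    max_tokens: Option<u32>, // optional in the OpenAI schema
}

#[derive(Debug, PartialEq)]
struct AnthropicMessagesRequest {
    model: String,
    system: Option<String>, // Anthropic: a top-level field, not a message role
    messages: Vec<Message>,
    max_tokens: u32, // required in the Anthropic schema
}

fn to_anthropic(req: OpenAiChatRequest) -> AnthropicMessagesRequest {
    // OpenAI carries the system prompt as `role: "system"` messages;
    // Anthropic expects it in the top-level `system` field.
    let (system_msgs, chat_msgs): (Vec<_>, Vec<_>) =
        req.messages.into_iter().partition(|m| m.role == "system");
    AnthropicMessagesRequest {
        model: req.model,
        system: if system_msgs.is_empty() {
            None
        } else {
            Some(system_msgs.into_iter().map(|m| m.content).collect::<Vec<_>>().join("\n"))
        },
        messages: chat_msgs,
        // Anthropic requires max_tokens, so a translated request needs a
        // default when the client omitted it (1024 here is arbitrary).
        max_tokens: req.max_tokens.unwrap_or(1024),
    }
}

fn main() {
    let req = OpenAiChatRequest {
        model: "some-model".into(),
        messages: vec![
            Message { role: "system".into(), content: "be brief".into() },
            Message { role: "user".into(), content: "hi".into() },
        ],
        max_tokens: None,
    };
    let out = to_anthropic(req);
    assert_eq!(out.system.as_deref(), Some("be brief"));
    assert_eq!(out.max_tokens, 1024);
    assert_eq!(out.messages.len(), 1);
    println!("system hoisted, max_tokens defaulted");
}
```

The reverse direction (Anthropic response → OpenAI `ChatCompletion` shape) is the mirror image: flatten content blocks into a single `message.content` string and map stop reasons onto `finish_reason` values.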
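The streaming mismatch is event-level as well as payload-level: Anthropic emits named SSE events (`message_start`, `content_block_delta`, `message_stop`, `ping`, …) while OpenAI clients expect anonymous `data:` frames terminated by the literal `data: [DONE]` sentinel. A sketch of just the event-level mapping, with the JSON delta reshaping elided (the function name and pass-through behavior are assumptions):

```rust
/// Map one Anthropic SSE event onto the frame an OpenAI-SDK client expects.
/// Returns None for events with no OpenAI-side equivalent.
fn anthropic_event_to_openai_frame(event: &str, data: &str) -> Option<String> {
    match event {
        // Deltas become chat.completion.chunk frames; this sketch passes the
        // payload through, where the real translator would reshape the JSON.
        "content_block_delta" | "message_delta" => Some(format!("data: {data}\n\n")),
        // End of stream: OpenAI clients wait for the [DONE] sentinel.
        "message_stop" => Some("data: [DONE]\n\n".to_string()),
        // ping, message_start, content_block_start/stop: dropped.
        _ => None,
    }
}

fn main() {
    assert_eq!(
        anthropic_event_to_openai_frame("message_stop", "").as_deref(),
        Some("data: [DONE]\n\n")
    );
    assert_eq!(anthropic_event_to_openai_frame("ping", ""), None);
    println!("framing ok");
}
```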
## Scope of initial translation pairs

At minimum:

- `openai_chat_completions` ↔ `anthropic_messages`

Nice to have:

- `openai_responses` → other protocols
- `openai_completions` → chat-based protocols
## Where this fits

The translation should live in `navigator-router`, likely as a new module between `detect_inference_pattern()` (in `navigator-sandbox`) and `proxy_with_candidates()`. The router already rewrites headers and model IDs in `backend.rs` — translation extends this to full schema transformation.
## Acceptance Criteria