feat: add serviceTier parameter to Responses API LLM #1206

Merged
toubatbrian merged 1 commit into livekit:main from piyush-gambhir:feat/responses-service-tier-param on Apr 6, 2026

Conversation

@piyush-gambhir
Contributor

Summary

Mirrors livekit/agents#5342 for the JavaScript/TypeScript SDK.

The Chat Completions LLM already supports serviceTier for per-request processing tier control. The Responses API LLM is missing this parameter despite the OpenAI API supporting it.

Changes

plugins/openai/src/responses/llm.ts (1 file):

  • Add serviceTier?: string to LLMOptions interface
  • Pass service_tier through in chat() via modelOptions
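The wiring described above can be sketched roughly as follows. This is an illustrative sketch, not the plugin's actual code: the `buildModelOptions` helper is hypothetical, and the `LLMOptions` shape is reduced to the fields relevant here.

```typescript
// Minimal shape of the options, mirroring the serviceTier addition.
interface LLMOptions {
  model: string;
  serviceTier?: string; // e.g. 'priority', 'flex', or 'default'
}

// Hypothetical helper showing how chat() could forward the option:
// the camelCase serviceTier maps to the API's snake_case service_tier.
function buildModelOptions(opts: LLMOptions): Record<string, unknown> {
  const modelOptions: Record<string, unknown> = { model: opts.model };
  // Only forward the field when the caller set it, so omitting
  // serviceTier leaves the request payload unchanged.
  if (opts.serviceTier !== undefined) {
    modelOptions.service_tier = opts.serviceTier;
  }
  return modelOptions;
}
```

Because the field is only attached when defined, existing callers that never set `serviceTier` produce exactly the same request as before, which is what makes the change backward compatible.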

Usage

import { responses } from '@livekit/agents-plugin-openai';

const llm = new responses.LLM({
  model: 'gpt-5.4',
  serviceTier: 'priority',  // now supported
});

Backward Compatible

  • Optional field, defaults to undefined
  • Matches existing Chat Completions LLM pattern

@changeset-bot

changeset-bot (bot) commented on Apr 4, 2026

🦋 Changeset detected

Latest commit: bdcd1d4

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 22 packages
Name Type
@livekit/agents-plugin-openai Patch
@livekit/agents-plugin-anam Patch
@livekit/agents-plugin-cartesia Patch
@livekit/agents-plugin-deepgram Patch
@livekit/agents-plugin-elevenlabs Patch
@livekit/agents-plugin-google Patch
@livekit/agents-plugin-inworld Patch
@livekit/agents-plugin-neuphonic Patch
@livekit/agents-plugin-resemble Patch
@livekit/agents-plugin-rime Patch
@livekit/agents-plugin-sarvam Patch
@livekit/agents-plugin-xai Patch
@livekit/agents Patch
@livekit/agents-plugin-baseten Patch
@livekit/agents-plugin-bey Patch
@livekit/agents-plugin-hedra Patch
@livekit/agents-plugin-lemonslice Patch
@livekit/agents-plugin-livekit Patch
@livekit/agents-plugin-phonic Patch
@livekit/agents-plugin-silero Patch
@livekit/agents-plugin-trugen Patch
@livekit/agents-plugins-test Patch

@piyush-gambhir force-pushed the feat/responses-service-tier-param branch from f273151 to 4a33a88 on April 4, 2026 at 23:36
The Chat Completions LLM already supports serviceTier for configuring
priority/flex/default processing. This adds the same parameter to the
Responses API LLM for parity.

Changes (plugins/openai/src/responses/llm.ts only):
- Add serviceTier?: string to LLMOptions interface
- Pass service_tier through in chat() via modelOptions
@piyush-gambhir force-pushed the feat/responses-service-tier-param branch from 4a33a88 to bdcd1d4 on April 4, 2026 at 23:44
@toubatbrian merged commit bf6811f into livekit:main on Apr 6, 2026
6 checks passed
@github-actions bot mentioned this pull request on Apr 6, 2026