fix(opencode): wire OpenAI previous_response_id session caching #20848
wildoranges wants to merge 4 commits into anomalyco:dev from
Conversation
Thanks for updating your PR! It now meets our contributing guidelines. 👍
Pull request overview
This PR wires OpenAI Responses API previous_response_id reuse through OpenCode session state by persisting the latest OpenAI response_id on the session and forwarding it into @ai-sdk/openai request options for subsequent turns.
Changes:
- Add `last_response_id` to the session schema + migration, expose it on `Session.Info`, and persist updates via a new `Session.setResponseId`.
- Capture `openai.responseId` from provider metadata at the end of a step and save it to the session; pass it into LLM/provider option transforms as `previousResponseId`.
- Remove OpenAI input item ID stripping in the provider fetch path; add transform tests asserting `previousResponseId` behavior.
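The option-transform wiring in the second and third bullets can be sketched as follows. This is a hypothetical standalone analogue, not the actual code in `packages/opencode/src/provider/transform.ts`; the function name `applyPreviousResponseId` and the `ProviderOptions` shape are assumptions for illustration.

```typescript
// Sketch: only attach previousResponseId for OpenAI Responses API models,
// since other SDKs do not understand the option.
type ProviderOptions = Record<string, Record<string, unknown>>

function applyPreviousResponseId(
  sdk: string, // e.g. "@ai-sdk/openai" (the model's SDK package)
  options: ProviderOptions,
  previousResponseId?: string,
): ProviderOptions {
  // No-op for non-OpenAI SDKs or when there is no prior response to chain from.
  if (sdk !== "@ai-sdk/openai" || !previousResponseId) return options
  return {
    ...options,
    openai: { ...options.openai, previousResponseId },
  }
}
```

The guard mirrors the test described below: `previousResponseId` must only appear in provider options for `@ai-sdk/openai` models.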
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| packages/opencode/test/provider/transform.test.ts | Adds unit tests ensuring previousResponseId is only set for @ai-sdk/openai models. |
| packages/opencode/src/session/session.sql.ts | Adds last_response_id column to the session table definition. |
| packages/opencode/src/session/prompt.ts | Reads session per loop iteration and passes session.lastResponseId into LLM processing. |
| packages/opencode/src/session/processor.ts | Extracts OpenAI responseId from provider metadata on finish-step and persists it to the session. |
| packages/opencode/src/session/llm.ts | Introduces lastResponseId on stream input and forwards it as previousResponseId into provider option transforms. |
| packages/opencode/src/session/index.ts | Adds lastResponseId to session row mapping/schema and introduces setResponseId API. |
| packages/opencode/src/provider/transform.ts | Adds previousResponseId option wiring for @ai-sdk/openai. |
| packages/opencode/src/provider/provider.ts | Removes logic that stripped OpenAI input item IDs from POST bodies. |
| packages/opencode/migration/20260403085443_add_last_response_id/migration.sql | Adds SQLite migration to append last_response_id to session. |
| packages/opencode/migration/20260403085443_add_last_response_id/snapshot.json | Updates migration snapshot to include last_response_id. |
```ts
const setResponseId = Effect.fn("Session.setResponseId")(function* (input: {
  sessionID: SessionID
  responseId: string
}) {
  yield* patch(input.sessionID, { lastResponseId: input.responseId })
})
```
Session.setResponseId publishes a Session.Event.Updated patch with lastResponseId, but session updates are persisted via the projector’s toPartialRow(...) mapping. That mapper currently doesn’t include lastResponseId -> last_response_id, so this value will never be written to SQLite (and sessions.get(...) will keep returning undefined). Add the missing field mapping in packages/opencode/src/session/projectors.ts so updates to lastResponseId are persisted.
Thanks for catching this — fixed in 30dc759. I added last_response_id: grab(info, "lastResponseId") in packages/opencode/src/session/projectors.ts, so Session.Event.Updated patches now persist lastResponseId to SQLite.
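The projector fix described in 30dc759 can be sketched like this. The helper names `toPartialRow` and `grab` come from the discussion above, but their real signatures in `packages/opencode/src/session/projectors.ts` may differ; the types here are illustrative only.

```typescript
// Sketch: map camelCase session fields onto snake_case SQLite columns.
// Only fields present in the partial patch should be written.
type SessionInfo = { title?: string; lastResponseId?: string }

function grab<K extends keyof SessionInfo>(info: Partial<SessionInfo>, key: K) {
  return key in info ? info[key] : undefined
}

function toPartialRow(info: Partial<SessionInfo>) {
  return {
    title: grab(info, "title"),
    // The missing mapping: without this line, Session.Event.Updated patches
    // carrying lastResponseId were dropped before the SQLite write, so
    // sessions.get(...) kept returning undefined for last_response_id.
    last_response_id: grab(info, "lastResponseId"),
  }
}
```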
Follow-up update: I pushed one additional cache-hit fix in f1247c5. Besides persisting/reusing
Follow-up commit pushed: This addresses a Copilot compatibility regression: we now stop sending
Issue for this PR
Closes #20847
Type of change
What does this PR do?
This PR fixes OpenAI Responses cache continuity in two parts:

1. Persist and reuse `previous_response_id` across session turns.
2. Set `promptCacheKey` not only for `providerID === "openai"`, but also for OpenAI SDK-based providers (`@ai-sdk/openai`) and the GitHub Copilot SDK (`@ai-sdk/github-copilot`).

Without (2), custom OpenAI providers can still miss cache routing even when `previous_response_id` is present. I also added provider transform tests to lock in this behavior.
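Part (2) boils down to a broader gate on when `promptCacheKey` is attached. The sketch below is an illustrative analogue, not the actual implementation; the function name and the exact set of SDK identifiers are assumptions based on the description above.

```typescript
// Sketch: providers that route OpenAI prompt caching. Previously only the
// built-in providerID "openai" qualified; OpenAI-compatible SDKs now do too.
const CACHE_KEY_SDKS = new Set(["@ai-sdk/openai", "@ai-sdk/github-copilot"])

function shouldSetPromptCacheKey(providerID: string, sdk: string): boolean {
  // Custom providers (e.g. a self-hosted OpenAI-compatible endpoint) have a
  // different providerID but still speak the OpenAI SDK protocol.
  return providerID === "openai" || CACHE_KEY_SDKS.has(sdk)
}
```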
How did you verify your code works?
- `bun test test/provider/transform.test.ts`
- `bun test test/session/llm.test.ts`
- `bun typecheck` (including pre-push turbo typecheck)

Screenshots / recordings
N/A (non-UI change)
Checklist