fix(opencode): wire OpenAI previous_response_id session caching#20848

Open
wildoranges wants to merge 4 commits into anomalyco:dev from wildoranges:feat/gpt-incremental-caching

Conversation


@wildoranges wildoranges commented Apr 3, 2026

Issue for this PR

Closes #20847

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

This PR fixes OpenAI Responses cache continuity in two parts:

  1. Persist and reuse previous_response_id across session turns.
  2. Ensure promptCacheKey is set not only for providerID === "openai", but also for OpenAI SDK-based providers (@ai-sdk/openai) and GitHub Copilot SDK (@ai-sdk/github-copilot).

Without (2), custom OpenAI providers can still miss cache routing even when previous_response_id is present.

I also added provider transform tests to lock this behavior.
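The two-part wiring above can be sketched as a single option transform. This is an illustrative sketch, not the actual opencode API: the function name applyOpenAIOptions, the npm field, and the input shape are assumptions; only the package names and the promptCacheKey / previousResponseId option names come from the PR description.

```typescript
// Hypothetical sketch of the provider option wiring described above.
const OPENAI_SDKS = ["@ai-sdk/openai", "@ai-sdk/github-copilot"]

interface TransformInput {
  providerID: string
  npm?: string // SDK package backing the provider (assumption: available here)
  sessionID: string
  previousResponseId?: string
}

function applyOpenAIOptions(input: TransformInput): Record<string, string> {
  const options: Record<string, string> = {}
  const isOpenAILike =
    input.providerID === "openai" || OPENAI_SDKS.includes(input.npm ?? "")
  if (isOpenAILike) {
    // Part (2): route repeat requests from the same session to the same
    // cache shard, even for custom SDK-based OpenAI providers.
    options.promptCacheKey = input.sessionID
  }
  if (input.npm === "@ai-sdk/openai" && input.previousResponseId) {
    // Part (1): let the Responses API reuse server-side state from the
    // previous turn.
    options.previousResponseId = input.previousResponseId
  }
  return options
}
```

Note that without the isOpenAILike branch, a custom provider whose providerID is not literally "openai" would never receive promptCacheKey, which matches the failure mode described in (2).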

How did you verify your code works?

  • bun test test/provider/transform.test.ts
  • bun test test/session/llm.test.ts
  • bun typecheck (including pre-push turbo typecheck)

Screenshots / recordings

N/A (non-UI change)

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

Copilot AI review requested due to automatic review settings April 3, 2026 09:16
@github-actions github-actions bot added the needs:compliance This means the issue will auto-close after 2 hours. label Apr 3, 2026
@github-actions github-actions bot removed the needs:compliance This means the issue will auto-close after 2 hours. label Apr 3, 2026

github-actions bot commented Apr 3, 2026

Thanks for updating your PR! It now meets our contributing guidelines. 👍

Copilot AI left a comment


Pull request overview

This PR wires OpenAI Responses API previous_response_id reuse through OpenCode session state by persisting the latest OpenAI response_id on the session and forwarding it into @ai-sdk/openai request options for subsequent turns.

Changes:

  • Add last_response_id to the session schema + migration, expose it on Session.Info, and persist updates via a new Session.setResponseId.
  • Capture openai.responseId from provider metadata at the end of a step and save it to the session; pass it into LLM/provider option transforms as previousResponseId.
  • Remove OpenAI input item ID stripping in the provider fetch path; add transform tests asserting previousResponseId behavior.
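The second bullet (capturing openai.responseId at the end of a step) can be sketched as follows. The real processor is built on Effect and opencode's Session API, so the names here (onFinishStep, StepFinish, SessionStore) are assumptions; only the providerMetadata.openai.responseId path and the lastResponseId field come from the review summary.

```typescript
// Illustrative sketch of the finish-step capture described above.
interface StepFinish {
  providerMetadata?: { openai?: { responseId?: string } }
}

interface SessionStore {
  lastResponseId?: string
}

function onFinishStep(session: SessionStore, step: StepFinish): void {
  const responseId = step.providerMetadata?.openai?.responseId
  if (responseId) {
    // Persist the id so the next turn can send it as previous_response_id.
    session.lastResponseId = responseId
  }
}
```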

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 1 comment.

Summary per file:

  • packages/opencode/test/provider/transform.test.ts - Adds unit tests ensuring previousResponseId is only set for @ai-sdk/openai models.
  • packages/opencode/src/session/session.sql.ts - Adds the last_response_id column to the session table definition.
  • packages/opencode/src/session/prompt.ts - Reads the session per loop iteration and passes session.lastResponseId into LLM processing.
  • packages/opencode/src/session/processor.ts - Extracts the OpenAI responseId from provider metadata on finish-step and persists it to the session.
  • packages/opencode/src/session/llm.ts - Introduces lastResponseId on the stream input and forwards it as previousResponseId into provider option transforms.
  • packages/opencode/src/session/index.ts - Adds lastResponseId to the session row mapping/schema and introduces the setResponseId API.
  • packages/opencode/src/provider/transform.ts - Adds previousResponseId option wiring for @ai-sdk/openai.
  • packages/opencode/src/provider/provider.ts - Removes logic that stripped OpenAI input item IDs from POST bodies.
  • packages/opencode/migration/20260403085443_add_last_response_id/migration.sql - Adds a SQLite migration appending last_response_id to session.
  • packages/opencode/migration/20260403085443_add_last_response_id/snapshot.json - Updates the migration snapshot to include last_response_id.


Comment on lines +593 to +598
const setResponseId = Effect.fn("Session.setResponseId")(function* (input: {
sessionID: SessionID
responseId: string
}) {
yield* patch(input.sessionID, { lastResponseId: input.responseId })
})

Copilot AI Apr 3, 2026


Session.setResponseId publishes a Session.Event.Updated patch with lastResponseId, but session updates are persisted via the projector’s toPartialRow(...) mapping. That mapper currently doesn’t include lastResponseId -> last_response_id, so this value will never be written to SQLite (and sessions.get(...) will keep returning undefined). Add the missing field mapping in packages/opencode/src/session/projectors.ts so updates to lastResponseId are persisted.

wildoranges (Author) replied:
Thanks for catching this — fixed in 30dc759. I added last_response_id: grab(info, "lastResponseId") in packages/opencode/src/session/projectors.ts, so Session.Event.Updated patches now persist lastResponseId to SQLite.
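A minimal sketch of what that projector fix looks like, assuming a toPartialRow-style mapper that copies defined fields from a session patch into SQLite column names; grab here is a stand-in for the real helper, and the title field is an invented example of a pre-existing mapping.

```typescript
// Hypothetical sketch of the projector mapping fix described above.
interface SessionPatch {
  title?: string
  lastResponseId?: string
}

// Stand-in for the real grab() helper: read a field off the patch.
function grab<T, K extends keyof T>(info: T, key: K): T[K] | undefined {
  return info[key]
}

function toPartialRow(info: SessionPatch): Record<string, unknown> {
  const row: Record<string, unknown> = {}
  if (info.title !== undefined) row.title = grab(info, "title")
  // The missing mapping Copilot flagged: without this line, lastResponseId
  // patches were published but never written to the last_response_id column.
  if (info.lastResponseId !== undefined)
    row.last_response_id = grab(info, "lastResponseId")
  return row
}
```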


wildoranges commented Apr 3, 2026

Follow-up update: I pushed one additional cache-hit fix in f1247c5.

Besides persisting/reusing previous_response_id, the branch now also applies promptCacheKey for SDK-based OpenAI providers (@ai-sdk/openai and @ai-sdk/github-copilot), which was the missing piece for my custom OpenAI provider setup.

@wildoranges

Follow-up commit pushed: a064a2f66.

This addresses a Copilot compatibility regression: we now stop sending previousResponseId on @ai-sdk/github-copilot requests, because that backend rejects it with the error "previous_response_id is not supported".

previousResponseId remains enabled for @ai-sdk/openai, and existing promptCacheKey logic is unchanged.
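The resulting guard can be sketched as a small predicate; supportsPreviousResponseId is an illustrative name, not the actual function in the PR.

```typescript
// Sketch of the compatibility guard described above: previousResponseId is
// forwarded only for @ai-sdk/openai, while promptCacheKey continues to apply
// to both OpenAI-style SDKs.
function supportsPreviousResponseId(npm: string): boolean {
  // GitHub Copilot's backend rejects the field with
  // "previous_response_id is not supported", so it is excluded here.
  return npm === "@ai-sdk/openai"
}
```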



Development

Successfully merging this pull request may close these issues.

Bug: OpenAI responses API does not send previous_response_id in multi-turn sessions
