fix(embedding): skip warmup for remote providers #89

Closed

gabrielrinaldi wants to merge 1 commit into tickernelz:main from gabrielrinaldi:fix/remote-embeddings-warmup

Conversation

@gabrielrinaldi

Summary

  • skip local embedding warmup when embeddingApiUrl and embeddingApiKey are configured
  • avoid loading the local @xenova/transformers stack for remote embedding setups
  • preserve existing behavior for local embedding models

Why

OpenCode installs plugins with bun add --ignore-scripts, which can leave native transitive dependencies unresolved in some environments. For opencode-mem, remote embedding mode does not need local transformer warmup, so eagerly warming that path can fail unnecessarily during plugin startup.

This change keeps remote embedding mode on the remote path from the start, which avoids startup failures while leaving local embedding behavior unchanged.
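The guard this PR describes might be sketched roughly as follows. Only the field names embeddingApiUrl and embeddingApiKey come from the summary above; the interface, function names, and structure are illustrative assumptions, not the actual opencode-mem source.

```typescript
// Hypothetical sketch of the proposed warmup short-circuit.
// Field names embeddingApiUrl/embeddingApiKey are taken from the PR
// summary; everything else here is illustrative.
interface EmbeddingConfig {
  embeddingApiUrl?: string;
  embeddingApiKey?: string;
}

// Remote mode is active when both the API URL and key are configured.
function isRemoteEmbedding(config: EmbeddingConfig): boolean {
  return Boolean(config.embeddingApiUrl && config.embeddingApiKey);
}

async function warmup(config: EmbeddingConfig): Promise<void> {
  if (isRemoteEmbedding(config)) {
    // Remote path: never import or warm the local
    // @xenova/transformers stack.
    return;
  }
  // Local mode would dynamically import and warm the local model here.
}
```

Because the check happens before any dynamic import, remote-mode startup never resolves the local transformer dependency tree at all.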

Copilot AI review requested due to automatic review settings April 5, 2026 21:29

Copilot AI left a comment


Pull request overview

This PR adjusts embedding initialization so that when an external (OpenAI-compatible) embeddings API is configured, the service skips local model warmup and avoids loading the @xenova/transformers stack, preventing unnecessary startup failures in remote-embedding setups.

Changes:

  • Short-circuit EmbeddingService.warmup() when embeddingApiUrl and embeddingApiKey are set (remote mode).
  • Update bun.lock metadata (configVersion).

Reviewed changes

Copilot reviewed 1 out of 2 changed files in this pull request and generated no comments.

  • src/services/embedding.ts: Skips local embedding warmup for remote providers to avoid loading local transformer dependencies unnecessarily.
  • bun.lock: Lockfile metadata update (configVersion).


@NaNomicon
Collaborator

Thanks for digging into this. From tracing the current code, I think the remote warmup short-circuit may already exist today.

warmup() delegates to initializeModel(), and initializeModel() already returns early before loading @xenova/transformers when both embeddingApiUrl and embeddingApiKey are set. So I’m not yet seeing the failing path this PR changes, aside from warmup() no longer setting initPromise in the remote case.

Could you share the concrete repro or stack trace this fixes, or add a regression test showing current startup still reaches the local transformer path in remote mode? If there is an existing bug here, I think we should target that path directly rather than duplicate the remote-mode check.
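The existing flow described above can be sketched like this; the class shape, the localLoaded flag, and the caching detail are illustrative assumptions for the sketch, not the actual source.

```typescript
// Hypothetical sketch of the current flow as described: warmup()
// delegates to initializeModel(), which already returns early in
// remote mode. localLoaded stands in for loading the local
// @xenova/transformers stack, for illustration only.
interface ServiceConfig {
  embeddingApiUrl?: string;
  embeddingApiKey?: string;
}

class EmbeddingService {
  localLoaded = false; // instrumentation for this sketch
  private initPromise: Promise<void> | null = null;

  constructor(private config: ServiceConfig) {}

  async warmup(): Promise<void> {
    // warmup() just kicks off (and caches) initialization.
    this.initPromise = this.initPromise ?? this.initializeModel();
    return this.initPromise;
  }

  private async initializeModel(): Promise<void> {
    if (this.config.embeddingApiUrl && this.config.embeddingApiKey) {
      return; // existing early return: remote mode skips local load
    }
    // Local mode: a dynamic import would happen here, e.g.
    // await import("@xenova/transformers").
    this.localLoaded = true;
  }
}
```

If initializeModel() already guards the remote path this way, the PR's check in warmup() would be a duplicate, which is the crux of the comment above.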

@gabrielrinaldi
Author

I think #90 is a more elegant solution; closing this.

@gabrielrinaldi gabrielrinaldi deleted the fix/remote-embeddings-warmup branch April 12, 2026 05:06