Is there a way to increase Context limit for Github Copilot Provider? #5993

@Steinvorth

Description

Question

I commented a while ago on a similar issue, and it was solved by the answer found here: https://github.com/sst/opencode/issues/2317

I'll elaborate a little more, since it may help to understand why I'm asking.

My business only allows us to use GitHub Copilot as the model provider, so we can't switch to a provider with larger context limits via API keys. When I use OpenCode with the GitHub Copilot provider, the conversation context seems capped at 128K tokens, even if a model like ChatGPT Codex allows ~256K tokens.

I want to use OpenCode instead of VS Code's chat window, but such a small context window makes it very hard to work on the implementations I use in my coding workflow.

Is there any way to increase the context limit in OpenCode when using the GitHub Copilot provider? For example:

  • Does the CLI GitHub released handle context differently in a way that OpenCode could use?
  • Is there any configuration or workaround that lets me expand the effective context beyond 128K tokens within OpenCode?
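For reference, the earlier workaround was to override the model's advertised limits in `opencode.json`. A minimal sketch of what I mean (the exact provider and model IDs here are assumptions on my part, and this only changes what OpenCode believes the limit is, not what the Copilot API actually accepts):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-4.1": {
          "limit": {
            "context": 128000,
            "output": 16000
          }
        }
      }
    }
  }
}
```

If the API itself enforces the 128K cap server-side, an override like this would presumably just cause request errors above that size, which is why I'm asking whether a real increase is possible.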

Thanks in advance for any help and information.
