
OpenAI compatibility: fetch and use models from the /models endpoint (and some other features) #1

@tnglemongrass

Description

  • Users might have OPENAI_API_BASE and OPENAI_API_KEY set in their environment. Those should be used.
  • The model listing (/models, or however that list is exposed) should show the models from OPENAI_API_BASE/models, and chat should use OPENAI_API_BASE/chat/completions (see the sketch below).
  • It would also be quite nice to be able to pass these as CLI args, e.g. opencode --api-base https://api.openai.com/v1 --api-key your_api_key --model gpt-4.1

e.g. OPENAI_API_BASE=https://api.openai.com/v1 or OPENAI_API_BASE=https://api.openai.com/v1/ (with or without a trailing slash)
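A minimal sketch of the requested flow, assuming the flag names from the example above (--api-base, --api-key, which are not existing opencode flags): read the env vars, let CLI flags override them, trim any trailing slash from the base URL, and list models from <base>/models. The response shape follows the OpenAI /models endpoint; error handling is kept to the bare minimum.

```go
package main

import (
	"encoding/json"
	"flag"
	"fmt"
	"net/http"
	"os"
	"strings"
)

// modelList mirrors the OpenAI /models response: {"data": [{"id": "..."}]}.
type modelList struct {
	Data []struct {
		ID string `json:"id"`
	} `json:"data"`
}

func main() {
	// CLI flags take precedence over env vars; defaults come from the environment.
	apiBase := flag.String("api-base", os.Getenv("OPENAI_API_BASE"), "OpenAI-compatible base URL")
	apiKey := flag.String("api-key", os.Getenv("OPENAI_API_KEY"), "API key")
	flag.Parse()

	// Accept both https://api.openai.com/v1 and https://api.openai.com/v1/.
	base := strings.TrimRight(*apiBase, "/")

	req, err := http.NewRequest("GET", base+"/models", nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+*apiKey)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var models modelList
	if err := json.NewDecoder(resp.Body).Decode(&models); err != nil {
		panic(err)
	}
	for _, m := range models.Data {
		fmt.Println(m.ID) // these IDs would populate opencode's model picker
	}
}
```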

This would also be quite useful for connecting to local providers like Ollama, KoboldCpp, or LM Studio.
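For the local-provider case, the chat side is the same request against a different base URL. The sketch below assumes Ollama's OpenAI-compatible endpoint on its default port (http://localhost:11434/v1) and a locally pulled model name; LM Studio and KoboldCpp expose similar /v1 endpoints on their own default ports, so only the base URL and model would change.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Assumed local base URL (Ollama default). Local servers typically
	// don't require an API key, so no Authorization header is set here.
	base := "http://localhost:11434/v1"

	body, _ := json.Marshal(map[string]any{
		"model": "llama3", // assumed local model name
		"messages": []map[string]string{
			{"role": "user", "content": "Hello from opencode"},
		},
	})

	resp, err := http.Post(base+"/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode just the first choice's message content.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```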
