Update LiteLLM configuration for hosted_vllm provider#1060

Merged

NathanHB merged 1 commit into huggingface:main from abhiram1809:main on Nov 14, 2025

Conversation

@abhiram1809
Contributor

Even though vLLM exposes an OpenAI-compatible endpoint, to make it work with LiteLLM you have to set the provider to hosted_vllm and add a hosted_vllm/ prefix before the model name.
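
For reference, a minimal sketch of what this looks like when calling LiteLLM directly against a local vLLM server; the server URL and model name below are placeholder assumptions, not values taken from this PR:

```python
import litellm

# Sketch: route an OpenAI-compatible vLLM server through LiteLLM.
# The api_base URL and model name are placeholders; substitute your own.
response = litellm.completion(
    model="hosted_vllm/meta-llama/Llama-3.1-8B-Instruct",  # note the hosted_vllm/ prefix
    api_base="http://localhost:8000/v1",  # your vLLM server's OpenAI-compatible endpoint
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Without the hosted_vllm/ prefix, LiteLLM cannot infer the provider and the request fails, even though the underlying endpoint speaks the OpenAI protocol.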
@HuggingFaceDocBuilderDev
Collaborator

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

NathanHB merged commit af6b5b4 into huggingface:main on Nov 14, 2025
4 checks passed
NathanHB added the bug label on Nov 24, 2025
rolshoven pushed a commit to rolshoven/lighteval that referenced this pull request on Mar 17, 2026