This needs a solutions integration and social media image. It doesn't use auth by default so it doesn't need a connected service.
start:
  llm.agent:
    inputs:
      llm:
        openai:
          api_endpoint_url: http://model-runner.docker.internal/engines
          model: ai/qwen2.5:7B-Q4_K_M
          # ...
See: https://docs.docker.com/desktop/features/model-runner/
You use http://model-runner.docker.internal/ when connecting between Docker containers on the same host. It supports the OpenAI API spec.
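A minimal sketch of calling that endpoint from another container, assuming the base URL and model name from the config above and an OpenAI-style /v1/chat/completions path (the exact path prefix is an assumption, not confirmed by these notes):

```python
import json
import urllib.request

# Base URL and model taken from the config above; reachable from
# containers on the same host. No auth header is needed by default.
BASE_URL = "http://model-runner.docker.internal/engines"
MODEL = "ai/qwen2.5:7B-Q4_K_M"


def build_chat_request(prompt: str, model: str = MODEL) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request (assumed /v1 path)."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def chat(prompt: str) -> str:
    """Send the request and return the model's reply text."""
    with urllib.request.urlopen(build_chat_request(prompt)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint speaks the OpenAI API spec, an OpenAI-compatible client library pointed at the same base URL would work equally well.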