The example programs here for Llama3 and Phi3 use low-level prompt templates with `<|user|>` tags etc., but I don't think that really makes sense. The OllamaFunctions object will generate and prepend a System message (containing the tool instructions) ahead of what you send, and your prompt will be sent as a User message (as the 2nd item in the conversation JSON).
Ultimately, Ollama itself already knows the low-level prompt formats due to the templates contained in its Modelfiles (example), and will wrap the content in the appropriate `<|user|>` etc. markup, leading to your user prompt being double-wrapped by the time it is presented to the underlying LLM.
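To illustrate the double-wrapping, here is a minimal sketch of what happens when a hand-wrapped prompt passes through a Modelfile-style chat template. The template string and the `render` helper are illustrative assumptions, not Ollama's actual Modelfile or rendering code:

```python
# Hypothetical Phi-3-style template, mimicking what a Modelfile applies
# to each user turn. This is an assumption for illustration only.
PHI3_USER_TEMPLATE = "<|user|>\n{content}<|end|>\n<|assistant|>\n"

def render(messages):
    """Mimic how a chat template wraps each message in role markup."""
    out = []
    for m in messages:
        if m["role"] == "system":
            out.append("<|system|>\n" + m["content"] + "<|end|>\n")
        elif m["role"] == "user":
            out.append(PHI3_USER_TEMPLATE.format(content=m["content"]))
    return "".join(out)

# Prompt already hand-wrapped in low-level tags, as in the example programs:
prewrapped = "<|user|>\nWhat is the weather in Paris?<|end|>\n<|assistant|>\n"
rendered = render([{"role": "user", "content": prewrapped}])

# The <|user|> marker now appears twice: once from the hand-written
# prompt and once more from the template itself.
print(rendered.count("<|user|>"))  # → 2
```

Sending the plain text ("What is the weather in Paris?") instead would leave all role markup to the template, avoiding the doubled tags.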
P.S., love your YT channel! :-)