Redundant Prompt Formatting(?) #1

@nathanic

The example programs here for Llama3 and Phi3 use low-level prompt templates with <|user|> tags etc., but I don't think that really makes sense. The OllamaFunctions object generates a System message containing the tool instructions and prepends it to whatever you send, so your prompt goes out as a User message (the second item in the conversation JSON).
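For reference, here is a minimal sketch of the shape of the conversation JSON that ends up being sent (the system wording below is paraphrased for illustration, not the wrapper's exact template):

```python
# Roughly the shape of the chat payload OllamaFunctions builds.
# The system text is paraphrased; the wrapper generates its own.
messages = [
    {
        "role": "system",
        "content": "You have access to the following tools: [...]."
        " Respond with a JSON object naming the tool to call and its arguments.",
    },
    # Your prompt becomes the second item, as a plain user turn:
    {"role": "user", "content": "What's the weather in Boston?"},
]
```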

Ultimately, Ollama itself already knows the low-level prompt formats thanks to the templates in its Modelfiles (example), and it wraps the content in the appropriate <|user|> etc. markup on its own, so a pre-formatted prompt ends up double-wrapped by the time it reaches the underlying LLM.
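To make that concrete with Phi-3 (whose Modelfile template renders the user turn as roughly `<|user|>\n{{ .Prompt }}<|end|>\n<|assistant|>`), a prompt that already carries the markup comes out something like:

```
<|user|>
<|user|>
What's the weather in Boston?<|end|>
<|end|>
<|assistant|>
```

So the examples could just pass the bare question and let the Modelfile template do the formatting. A minimal sketch, assuming a recent langchain_experimental where OllamaFunctions exposes `bind_tools`, with a hypothetical weather tool:

```python
from langchain_core.messages import HumanMessage
from langchain_experimental.llms.ollama_functions import OllamaFunctions

model = OllamaFunctions(model="phi3", format="json")

# Hypothetical tool schema, for illustration only.
model = model.bind_tools(
    tools=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given city",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }
    ]
)

# Plain prompt, no <|user|> markup: Ollama's template adds it exactly once.
print(model.invoke([HumanMessage(content="What's the weather in Boston?")]))
```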

P.S., love your YT channel! :-)
