
[Feature] Add AI model router #67

@maximedogawa

Description


User Story

As a user
I want to switch the AI model on demand
So that for normal tasks I can use multimodal models, models for a specific task, or models tuned for fast replies or deep thinking.

Description

I want to add a native AI model switcher via messages so that the user can decide which model should be used for the reply to a message. This should work with keywords like "use model" or "switch to" in many languages. These keywords should be accessed via the keyword collection library and also returned as commands when the user asks which keywords are available and what they do.

Pengine should switch to a model that is available in the Ollama model list and reply that the model is now active. Cron jobs should default to local models, because they can tolerate longer reply times, do not consume any API tokens or usage, and run locally.
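Ollama exposes the locally installed models via its `GET /api/tags` endpoint, so the check against the Ollama list could be sketched as follows. The helper names and the confirmation wording are assumptions, not part of the issue:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Fetch the names of locally installed models from Ollama's /api/tags."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return [m["name"] for m in json.load(resp).get("models", [])]

def activate_model(requested: str, available: list[str]) -> str:
    """Confirm the switch only if the model is actually in the Ollama list."""
    if requested in available:
        return f"Model {requested} is now active."
    return f"Model {requested} is not available; keeping the current model."
```

Keeping the availability check separate from the HTTP call makes the confirmation logic testable without a running Ollama instance.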

This should be the basis of the AI model router, which also decides which model to use. For example, when I request an image, it should switch to the model for AI image generation and, after the reply, switch back to the default local model or the model defined in the UI.
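The switch-per-task-then-revert behavior could be sketched like this. The task-to-model mapping and all model names are hypothetical placeholders, not a proposal for the actual defaults:

```python
# Hypothetical task-to-model mapping; the model names are placeholders.
TASK_MODELS = {
    "image": "image-generation-model",
    "default": "local-default-model",
}

class ModelRouter:
    """Picks a model per request and falls back to the default afterwards."""

    def __init__(self, default: str = TASK_MODELS["default"]):
        self.default = default  # default local model or the model set in the UI
        self.active = default

    def route(self, task: str) -> str:
        """Switch to the model suited for this task, e.g. image generation."""
        self.active = TASK_MODELS.get(task, self.default)
        return self.active

    def reset(self) -> None:
        """Switch back to the default model after the reply has been sent."""
        self.active = self.default
```

The caller would invoke `route("image")` before generating the reply and `reset()` once the reply has been sent.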

There should also be a possibility to use more than one AI model per reply. That feature requires improved model caching, which is out of scope for this ticket and will be implemented in ticket #49.

Acceptance Criteria

  • Add an AI model router framework and a native tool
  • Add AI model switching keywords
  • Support multiple AI models per request
  • Make AI model switching efficient and fast
  • Make this a protocol that MCP servers can use easily

Metadata


Assignees

No one assigned

Labels

No labels

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
