Besides Ollama, I also run LM Studio, which serves an OpenAI-compatible API at http://localhost:1234/v1/chat/completions
This lets me run MLX models.
Could you add support for this URL?
One example of an app that already supports it: Brave's Leo AI.
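For reference, LM Studio's local server speaks the standard OpenAI chat-completions wire format, so supporting it mostly means letting the base URL be configured. A minimal sketch of what a request against that endpoint looks like (the model name is whatever is currently loaded in LM Studio; `local-model` below is just a placeholder):

```python
import json
import urllib.request

# LM Studio's default local endpoint (configurable in its Server tab).
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask(prompt: str) -> str:
    """POST the request to the local LM Studio server and return the reply text.

    Requires LM Studio to be running with its server started on port 1234.
    """
    body = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Standard OpenAI-compatible response shape.
    return data["choices"][0]["message"]["content"]
```

Because the payload and response shape match OpenAI's API, any existing OpenAI client code should work by just pointing the base URL at localhost:1234.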