
Python: Verify local models in Ollama and LM Studio are compatible with the OpenAI connector#6973

Merged
TaoChenOSU merged 15 commits into microsoft:main from TaoChenOSU:taochen/local-models-with-openai-connector-2
Jul 5, 2024

Conversation

@TaoChenOSU
Contributor

Motivation and Context

Related to #6498

The use of local models presents a twofold benefit for developers: increased flexibility and reduced costs. Ollama and LM Studio are two well-known platforms that facilitate the hosting of models locally, both of which offer compatibility with OpenAI endpoints. As such, it is imperative that our OpenAI connector functions correctly when users are operating models on these platforms.

Description

  1. Verify that our OpenAI connector works as expected with models hosted locally using Ollama and LM Studio.
  2. Create three new samples (Ollama/chat, LM Studio/chat, LM Studio/Embedding) under `/concepts/local_models` to show how to use local models with the OpenAI connector.
  3. Fix a bug in `test_sample_utils.py` where the input was never reset when a test case was retried.
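The samples above rely on both servers exposing an OpenAI-compatible API. A minimal stdlib-only sketch of the request shape involved (endpoints and model name are assumptions: Ollama serves `http://localhost:11434/v1` by default, LM Studio `http://localhost:1234/v1`):

```python
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # Ollama's default OpenAI-compatible endpoint

# The same chat-completions payload the OpenAI connector would send.
payload = {
    "model": "phi3",  # illustrative: any model pulled into Ollama
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
}
request = urllib.request.Request(
    f"{OLLAMA_BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        # Local servers ignore the key, but OpenAI-style clients require one.
        "Authorization": "Bearer fake-key",
    },
)
# urllib.request.urlopen(request) would send it once a local server is running.
```

In the actual samples, the same effect is achieved by pointing the SK OpenAI connector's client at the local base URL instead of api.openai.com.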

Contribution Checklist

@TaoChenOSU TaoChenOSU self-assigned this Jun 26, 2024
@TaoChenOSU TaoChenOSU requested a review from a team as a code owner June 26, 2024 20:59
@markwallace-microsoft markwallace-microsoft added the python Pull requests for the Python Semantic Kernel label Jun 26, 2024
@github-actions github-actions Bot changed the title Verify local models in Ollama and LM Studio are compatible with the OpenAI connector Python: Verify local models in Ollama and LM Studio are compatible with the OpenAI connector Jun 26, 2024
@markwallace-microsoft
Contributor

markwallace-microsoft commented Jun 26, 2024

Py3.10 Test Coverage

Python 3.10 Test Coverage Report

| File | Stmts | Miss | Cover | Missing |
| --- | --- | --- | --- | --- |
| semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion.py | 23 | 1 | 96% | 61 |
| TOTAL | 6814 | 767 | 89% | |

Python 3.10 Unit Test Overview

| Tests | Skipped | Failures | Errors | Time |
| --- | --- | --- | --- | --- |
| 1598 | 1 💤 | 0 ❌ | 0 🔥 | 26.351s ⏱️ |

Member

@eavanvalkenburg eavanvalkenburg left a comment


small note, let's make it so that you can just use the Completion service directly instead of having to create your own client!
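The design the reviewer is asking for can be sketched as a service that builds its client internally from configuration, so callers never touch the client. This is an illustrative class, not the actual Semantic Kernel API; the names `LocalChatCompletion` and `client_config` are hypothetical:

```python
class LocalChatCompletion:
    """Illustrative sketch: a completion service that owns its client config,
    so users pass a base_url instead of constructing an OpenAI client."""

    def __init__(self, ai_model_id, base_url="http://localhost:11434/v1", api_key="fake-key"):
        self.ai_model_id = ai_model_id
        # In the real connector this would construct an AsyncOpenAI client;
        # here we only record the configuration it would be built with.
        self.client_config = {"base_url": base_url, "api_key": api_key}

# Callers supply only the model id and endpoint, no client object.
service = LocalChatCompletion(ai_model_id="phi3")
```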

Comment thread python/samples/concepts/local_models/lm_studio_chat_completion.py
Comment thread python/samples/concepts/local_models/ollama_chat_completion.py
Comment thread python/samples/concepts/local_models/ollama_chat_completion.py
Copy link
Copy Markdown

@AndreasKunar AndreasKunar left a comment


Still fails for me because of the missing API-key parameter in line 39 (see comment for details on the validation error).
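The failure mode described here is that OpenAI-style clients validate the presence of an api key even though local servers ignore its value. A minimal sketch of that behavior (the function and message are illustrative, not the actual client code; "line 39" refers to the sample file, not this sketch):

```python
def make_client_config(base_url, api_key):
    """Mimic the client-side validation: the key must be present,
    even for local servers that never check its value."""
    if not api_key:
        raise ValueError("The api_key client option must be set")
    return {"base_url": base_url, "api_key": api_key}

# Omitting the key reproduces the validation error...
try:
    make_client_config("http://localhost:11434/v1", None)
except ValueError as e:
    print(e)

# ...while any placeholder satisfies the check for local servers.
config = make_client_config("http://localhost:11434/v1", "ollama")
```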

Comment thread python/samples/concepts/local_models/ollama_chat_completion.py
@TaoChenOSU TaoChenOSU enabled auto-merge July 5, 2024 17:47
@TaoChenOSU TaoChenOSU added this pull request to the merge queue Jul 5, 2024
@github-merge-queue github-merge-queue Bot removed this pull request from the merge queue due to failed status checks Jul 5, 2024
@TaoChenOSU TaoChenOSU added this pull request to the merge queue Jul 5, 2024
@github-merge-queue github-merge-queue Bot removed this pull request from the merge queue due to failed status checks Jul 5, 2024
@TaoChenOSU TaoChenOSU added this pull request to the merge queue Jul 5, 2024
@github-merge-queue github-merge-queue Bot removed this pull request from the merge queue due to failed status checks Jul 5, 2024
@TaoChenOSU TaoChenOSU added this pull request to the merge queue Jul 5, 2024
@TaoChenOSU TaoChenOSU removed this pull request from the merge queue due to a manual request Jul 5, 2024
@TaoChenOSU TaoChenOSU added this pull request to the merge queue Jul 5, 2024
Merged via the queue into microsoft:main with commit 5779b7d Jul 5, 2024
@TaoChenOSU TaoChenOSU deleted the taochen/local-models-with-openai-connector-2 branch July 5, 2024 22:41
LudoCorporateShark pushed a commit to LudoCorporateShark/semantic-kernel that referenced this pull request Aug 25, 2024
…th the OpenAI connector (microsoft#6973)

### Motivation and Context

<!-- Thank you for your contribution to the semantic-kernel repo!
Please help reviewers and future users, providing the following
information:
  1. Why is this change required?
  2. What problem does it solve?
  3. What scenario does it contribute to?
  4. If it fixes an open issue, please link to the issue here.
-->
Related to microsoft#6498

The use of local models presents a twofold benefit for developers:
increased flexibility and reduced costs. Ollama and LM Studio are two
well-known platforms that facilitate the hosting of models locally, both
of which offer compatibility with OpenAI endpoints. As such, it is
imperative that our OpenAI connector functions correctly when users are
operating models on these platforms.

### Description

<!-- Describe your changes, the overall approach, the underlying design.
These notes will help understanding how your code works. Thanks! -->
1. Verify that our OpenAI connector works as expected with models hosted
locally using Ollama and LM Studio.
2. Create three new samples (Ollama/chat, LM
Studio/Embedding) under `/concepts/local_models` to show how to use
local models with the OpenAI connector.
3. Fix a bug in `test_sample_utils.py` where the input was never reset
when a test case was retried.
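The retry bug in item 3 can be sketched in isolation (names are illustrative, not the actual `test_sample_utils.py` code): a retry helper that replays recorded inputs must rebuild the input stream before every attempt, otherwise a retried test sees an already-consumed input.

```python
def retry(func, inputs, attempts=3):
    """Run func up to `attempts` times, rebuilding the input stream each time."""
    last_error = None
    for _ in range(attempts):
        stream = iter(inputs)  # the fix: reset the inputs on every attempt
        try:
            return func(stream)
        except Exception as e:
            last_error = e
    raise last_error

calls = {"n": 0}

def flaky(stream):
    # Fails on the first attempt, succeeds on the second.
    calls["n"] += 1
    first = next(stream)
    if calls["n"] < 2:
        raise RuntimeError("transient failure")
    return first

result = retry(flaky, ["hello", "world"])
print(result)  # "hello" on the second attempt, because the input was reset
```

Without the per-attempt `iter(inputs)`, the second attempt would resume from an advanced or exhausted stream, which is the behavior the fix removes.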

### Contribution Checklist

<!-- Before submitting this PR, please make sure: -->

- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄

Labels

documentation python Pull requests for the Python Semantic Kernel

Projects

Archived in project

6 participants