.Net: Add Ollama Connector #7362
Merged

Changes from all commits (22 commits):
- `065e8f6` .Net Ollama Connector with Ollama Sharp Client Update (#7059) (rogerbarreto)
- `2def240` Python: .Net Ollama (Merge main) (#7231) (rogerbarreto)
- `02c7dfa` Revert "Python: .Net Ollama (Merge main)" (#7232) (rogerbarreto)
- `5bef1a9` .Net: Ollama Connector : Added metadata, integration tests + more adj… (rogerbarreto)
- `3e8853e` .Net: Ollama - Adding metadata to chat messages (#7249) (rogerbarreto)
- `3d8078f` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `d710e75` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `067a62e` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `4d46c28` Merge branch 'main' of https://github.com/microsoft/semantic-kernel i… (rogerbarreto)
- `5244078` Fix conflict (rogerbarreto)
- `071e5f9` .Net: Ollama Connector - Embeddings + Latest OllamaSharp update. (#8095) (rogerbarreto)
- `fa4ee99` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `16453bd` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `6ee36b8` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `3b1d2dd` .Net: Ollama - Adding Missing Samples (#8309) (rogerbarreto)
- `546d30f` .Net: Ollama Concept Test Fix (#8314) (rogerbarreto)
- `fb100b9` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `b7344ef` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `e7731bf` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `8aa612a` .Net: Ollama Address PR Feedback (#8587) (rogerbarreto)
- `78ba626` Merge branch 'main' into feature-connectors-ollama (rogerbarreto)
- `5d8cc91` Add missing XmlDoc (rogerbarreto)
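Commit `071e5f9` above adds embedding support alongside the latest OllamaSharp update. The embeddings code is not shown in this excerpt; the sketch below is a hedged illustration of how such a service might be used, assuming the connector follows Semantic Kernel's naming conventions (`OllamaTextEmbeddingGenerationService` and `GenerateEmbeddingsAsync` are assumptions, and the endpoint/model values are placeholders requiring a local Ollama server):

```csharp
// Hedged sketch: embedding generation with the Ollama connector.
// The service name below is an assumption based on SK conventions;
// it is not shown in this diff. Requires a running Ollama instance.
using Microsoft.SemanticKernel.Connectors.Ollama;
using Microsoft.SemanticKernel.Embeddings;

var embeddingService = new OllamaTextEmbeddingGenerationService(
    endpoint: new Uri("http://localhost:11434"), // placeholder local endpoint
    modelId: "mxbai-embed-large");               // placeholder embedding model

// Generate vectors for two short texts in one call.
var embeddings = await embeddingService.GenerateEmbeddingsAsync(
    ["Semantic Kernel", "Ollama connector"]);

Console.WriteLine($"Generated {embeddings.Count} embeddings of length {embeddings[0].Length}");
```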
dotnet/samples/Concepts/ChatCompletion/Ollama_ChatCompletion.cs (73 additions, 0 deletions)
```csharp
// Copyright (c) Microsoft. All rights reserved.

using System.Text;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

namespace ChatCompletion;

// The following example shows how to use Semantic Kernel with Ollama Chat Completion API
public class Ollama_ChatCompletion(ITestOutputHelper output) : BaseTest(output)
{
    [Fact]
    public async Task ServicePromptAsync()
    {
        Assert.NotNull(TestConfiguration.Ollama.ModelId);

        Console.WriteLine("======== Ollama - Chat Completion ========");

        var chatService = new OllamaChatCompletionService(
            endpoint: new Uri(TestConfiguration.Ollama.Endpoint),
            modelId: TestConfiguration.Ollama.ModelId);

        Console.WriteLine("Chat content:");
        Console.WriteLine("------------------------");

        var chatHistory = new ChatHistory("You are a librarian, expert about books");

        // First user message
        chatHistory.AddUserMessage("Hi, I'm looking for book suggestions");
        this.OutputLastMessage(chatHistory);

        // First assistant message
        var reply = await chatService.GetChatMessageContentAsync(chatHistory);
        chatHistory.Add(reply);
        this.OutputLastMessage(chatHistory);

        // Second user message
        chatHistory.AddUserMessage("I love history and philosophy, I'd like to learn something new about Greece, any suggestion");
        this.OutputLastMessage(chatHistory);

        // Second assistant message
        reply = await chatService.GetChatMessageContentAsync(chatHistory);
        chatHistory.Add(reply);
        this.OutputLastMessage(chatHistory);
    }

    [Fact]
    public async Task ChatPromptAsync()
    {
        Assert.NotNull(TestConfiguration.Ollama.ModelId);

        StringBuilder chatPrompt = new("""
            <message role="system">You are a librarian, expert about books</message>
            <message role="user">Hi, I'm looking for book suggestions</message>
            """);

        var kernel = Kernel.CreateBuilder()
            .AddOllamaChatCompletion(
                endpoint: new Uri(TestConfiguration.Ollama.Endpoint ?? "http://localhost:11434"),
                modelId: TestConfiguration.Ollama.ModelId)
            .Build();

        var reply = await kernel.InvokePromptAsync(chatPrompt.ToString());

        chatPrompt.AppendLine($"<message role=\"assistant\"><![CDATA[{reply}]]></message>");
        chatPrompt.AppendLine("<message role=\"user\">I love history and philosophy, I'd like to learn something new about Greece, any suggestion</message>");

        reply = await kernel.InvokePromptAsync(chatPrompt.ToString());

        Console.WriteLine(reply);
    }
}
```
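As a usage note on the sample above: a chat service registered through `AddOllamaChatCompletion` can also be resolved from the kernel and driven with a `ChatHistory` directly, rather than via `InvokePromptAsync`. A minimal sketch, assuming placeholder endpoint and model values and a locally running Ollama server (this pattern is standard Semantic Kernel usage, not code from this diff):

```csharp
// Hedged sketch: resolving the registered IChatCompletionService from the kernel.
// Endpoint and model id are placeholders; a local Ollama server is assumed.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

var kernel = Kernel.CreateBuilder()
    .AddOllamaChatCompletion(
        endpoint: new Uri("http://localhost:11434"), // placeholder local endpoint
        modelId: "llama3.2")                         // placeholder model id
    .Build();

// Resolve the chat service registered by AddOllamaChatCompletion.
var chatService = kernel.GetRequiredService<IChatCompletionService>();

var history = new ChatHistory("You are a librarian, expert about books");
history.AddUserMessage("Hi, I'm looking for book suggestions");

var reply = await chatService.GetChatMessageContentAsync(history);
Console.WriteLine(reply.Content);
```

Resolving the interface keeps the calling code agnostic of the concrete connector, so swapping Ollama for another chat completion provider only changes the builder registration.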
dotnet/samples/Concepts/ChatCompletion/Ollama_ChatCompletionStreaming.cs (161 additions, 0 deletions)
```csharp
// Copyright (c) Microsoft. All rights reserved.

using System.Text;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

namespace ChatCompletion;

/// <summary>
/// These examples demonstrate the ways different content types are streamed by Ollama via the chat completion service.
/// </summary>
public class Ollama_ChatCompletionStreaming(ITestOutputHelper output) : BaseTest(output)
{
    /// <summary>
    /// This example demonstrates chat completion streaming using Ollama.
    /// </summary>
    [Fact]
    public Task StreamChatAsync()
    {
        Assert.NotNull(TestConfiguration.Ollama.ModelId);

        Console.WriteLine("======== Ollama - Chat Completion Streaming ========");

        var chatService = new OllamaChatCompletionService(
            endpoint: new Uri(TestConfiguration.Ollama.Endpoint),
            modelId: TestConfiguration.Ollama.ModelId);

        return this.StartStreamingChatAsync(chatService);
    }

    [Fact]
    public async Task StreamChatPromptAsync()
    {
        Assert.NotNull(TestConfiguration.Ollama.ModelId);

        StringBuilder chatPrompt = new("""
            <message role="system">You are a librarian, expert about books</message>
            <message role="user">Hi, I'm looking for book suggestions</message>
            """);

        var kernel = Kernel.CreateBuilder()
            .AddOllamaChatCompletion(
                endpoint: new Uri(TestConfiguration.Ollama.Endpoint),
                modelId: TestConfiguration.Ollama.ModelId)
            .Build();

        var reply = await StreamMessageOutputFromKernelAsync(kernel, chatPrompt.ToString());

        chatPrompt.AppendLine($"<message role=\"assistant\"><![CDATA[{reply}]]></message>");
        chatPrompt.AppendLine("<message role=\"user\">I love history and philosophy, I'd like to learn something new about Greece, any suggestion</message>");

        reply = await StreamMessageOutputFromKernelAsync(kernel, chatPrompt.ToString());

        Console.WriteLine(reply);
    }

    /// <summary>
    /// This example demonstrates how the chat completion service streams text content.
    /// It shows how to access the response update via StreamingChatMessageContent.Content property
    /// and alternatively via the StreamingChatMessageContent.Items property.
    /// </summary>
    [Fact]
    public async Task StreamTextFromChatAsync()
    {
        Assert.NotNull(TestConfiguration.Ollama.ModelId);

        Console.WriteLine("======== Stream Text from Chat Content ========");

        // Create chat completion service
        var chatService = new OllamaChatCompletionService(
            endpoint: new Uri(TestConfiguration.Ollama.Endpoint),
            modelId: TestConfiguration.Ollama.ModelId);

        // Create chat history with initial system and user messages
        ChatHistory chatHistory = new("You are a librarian, an expert on books.");
        chatHistory.AddUserMessage("Hi, I'm looking for book suggestions.");
        chatHistory.AddUserMessage("I love history and philosophy. I'd like to learn something new about Greece, any suggestion?");

        // Start streaming chat based on the chat history
        await foreach (StreamingChatMessageContent chatUpdate in chatService.GetStreamingChatMessageContentsAsync(chatHistory))
        {
            // Access the response update via StreamingChatMessageContent.Content property
            Console.Write(chatUpdate.Content);

            // Alternatively, the response update can be accessed via the StreamingChatMessageContent.Items property
            Console.Write(chatUpdate.Items.OfType<StreamingTextContent>().FirstOrDefault());
        }
    }

    private async Task StartStreamingChatAsync(IChatCompletionService chatCompletionService)
    {
        Console.WriteLine("Chat content:");
        Console.WriteLine("------------------------");

        var chatHistory = new ChatHistory("You are a librarian, expert about books");
        this.OutputLastMessage(chatHistory);

        // First user message
        chatHistory.AddUserMessage("Hi, I'm looking for book suggestions");
        this.OutputLastMessage(chatHistory);

        // First assistant message
        await StreamMessageOutputAsync(chatCompletionService, chatHistory, AuthorRole.Assistant);

        // Second user message
        chatHistory.AddUserMessage("I love history and philosophy, I'd like to learn something new about Greece, any suggestion?");
        this.OutputLastMessage(chatHistory);

        // Second assistant message
        await StreamMessageOutputAsync(chatCompletionService, chatHistory, AuthorRole.Assistant);
    }

    private async Task StreamMessageOutputAsync(IChatCompletionService chatCompletionService, ChatHistory chatHistory, AuthorRole authorRole)
    {
        bool roleWritten = false;
        string fullMessage = string.Empty;

        await foreach (var chatUpdate in chatCompletionService.GetStreamingChatMessageContentsAsync(chatHistory))
        {
            if (!roleWritten && chatUpdate.Role.HasValue)
            {
                Console.Write($"{chatUpdate.Role.Value}: {chatUpdate.Content}");
                roleWritten = true;
            }

            if (chatUpdate.Content is { Length: > 0 })
            {
                fullMessage += chatUpdate.Content;
                Console.Write(chatUpdate.Content);
            }
        }

        Console.WriteLine("\n------------------------");
        chatHistory.AddMessage(authorRole, fullMessage);
    }

    private async Task<string> StreamMessageOutputFromKernelAsync(Kernel kernel, string prompt)
    {
        bool roleWritten = false;
        string fullMessage = string.Empty;

        await foreach (var chatUpdate in kernel.InvokePromptStreamingAsync<StreamingChatMessageContent>(prompt))
        {
            if (!roleWritten && chatUpdate.Role.HasValue)
            {
                Console.Write($"{chatUpdate.Role.Value}: {chatUpdate.Content}");
                roleWritten = true;
            }

            if (chatUpdate.Content is { Length: > 0 })
            {
                fullMessage += chatUpdate.Content;
                Console.Write(chatUpdate.Content);
            }
        }

        Console.WriteLine("\n------------------------");
        return fullMessage;
    }
}
```
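The streaming helpers above read the update stream to completion. In practice a caller may want to bound the stream; `IChatCompletionService.GetStreamingChatMessageContentsAsync` accepts a `CancellationToken` for this. A hedged sketch (endpoint and model values are placeholders and a local Ollama server is assumed; this is illustrative usage, not code from this diff):

```csharp
// Hedged sketch: bounding a streaming chat with a CancellationToken.
// Endpoint and model id are placeholders; a local Ollama server is assumed.
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

var chatService = new OllamaChatCompletionService(
    endpoint: new Uri("http://localhost:11434"), // placeholder local endpoint
    modelId: "llama3.2");                        // placeholder model id

var history = new ChatHistory("You are a librarian, expert about books");
history.AddUserMessage("Hi, I'm looking for book suggestions");

// Stop reading stream updates after five seconds.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));

await foreach (var update in chatService.GetStreamingChatMessageContentsAsync(
    history, cancellationToken: cts.Token))
{
    Console.Write(update.Content);
}
```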