Python: #6499 Mistral AI Chat Completion #7049
Merged: moonbox3 merged 22 commits into microsoft:main from nmoeller:issue-6499-Mistral-Ai-Connector-chat-completion on Jul 4, 2024.
Commits (22)

- f672801 Added MistralAI ChatCompletion (nmoellerms)
- 1155291 Integrated Feedback of PR (nmoellerms)
- 2a34856 Merge remote-tracking branch 'origin/main' into issue-6499-Mistral-Ai… (nmoellerms)
- 27b7cbb added mistral to unit test dependencies (nmoellerms)
- 0ae805d removed tools from settings (nmoellerms)
- 6b2025f fixed comment and pytestfixture and lock file (nmoellerms)
- e986b01 adjusted test cases to not conatin tools (nmoellerms)
- b6cc9e3 handle function choice behavior (nmoellerms)
- 29efd0b Merge branch 'main' into issue-6499-Mistral-Ai-Connector-chat-completion (nmoeller)
- 11295d7 fixed mypy issues except liskov (nmoellerms)
- a894710 increased test coverage (nmoellerms)
- 87a8a67 Merge branch 'main' into issue-6499-Mistral-Ai-Connector-chat-completion (nmoeller)
- 14218da full test coverage (nmoellerms)
- 0bebbf4 Merge branch 'issue-6499-Mistral-Ai-Connector-chat-completion' of htt… (nmoellerms)
- 46c997f Integrated PR Feedback and skip Int Tests if Mistral is not configured (nmoellerms)
- 0ff93c8 Merge branch 'main' into issue-6499-Mistral-Ai-Connector-chat-completion (nmoeller)
- b55ed11 small fix for skipping integration tests (nmoellerms)
- 297373f Merge branch 'issue-6499-Mistral-Ai-Connector-chat-completion' of htt… (nmoellerms)
- 88ce112 Merge branch 'main' into issue-6499-Mistral-Ai-Connector-chat-completion (nmoeller)
- a5f8bea skiped MistralConstructor in TestSetup (nmoellerms)
- cda788c Merge branch 'issue-6499-Mistral-Ai-Connector-chat-completion' of htt… (nmoellerms)
- a8e90ae Merge branch 'main' into issue-6499-Mistral-Ai-Connector-chat-completion (nmoeller)
python/samples/concepts/chat_completion/chat_mistral_api.py (86 additions, 0 deletions)
```python
# Copyright (c) Microsoft. All rights reserved.

import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.mistral_ai import MistralAIChatCompletion
from semantic_kernel.contents import ChatHistory

system_message = """
You are a chat bot. Your name is Mosscap and
you have one goal: figure out what people need.
Your full name, should you need to know it, is
Splendid Speckled Mosscap. You communicate
effectively, but you tend to answer with long
flowery prose.
"""

kernel = Kernel()

service_id = "mistral-ai-chat"
kernel.add_service(MistralAIChatCompletion(service_id=service_id))

settings = kernel.get_prompt_execution_settings_from_service_id(service_id)
settings.max_tokens = 2000
settings.temperature = 0.7
settings.top_p = 0.8

chat_function = kernel.add_function(
    plugin_name="ChatBot",
    function_name="Chat",
    prompt="{{$chat_history}}{{$user_input}}",
    template_format="semantic-kernel",
    prompt_execution_settings=settings,
)

chat_history = ChatHistory(system_message=system_message)
chat_history.add_user_message("Hi there, who are you?")
chat_history.add_assistant_message("I am Mosscap, a chat bot. I'm trying to figure out what people need")
chat_history.add_user_message("I want to find a hotel in Seattle with free wifi and a pool.")


async def chat() -> bool:
    try:
        user_input = input("User:> ")
    except KeyboardInterrupt:
        print("\n\nExiting chat...")
        return False
    except EOFError:
        print("\n\nExiting chat...")
        return False

    if user_input == "exit":
        print("\n\nExiting chat...")
        return False

    stream = True
    if stream:
        answer = kernel.invoke_stream(
            chat_function,
            user_input=user_input,
            chat_history=chat_history,
        )
        print("Mosscap:> ", end="")
        async for message in answer:
            print(str(message[0]), end="")
        print("\n")
        return True
    answer = await kernel.invoke(
        chat_function,
        user_input=user_input,
        chat_history=chat_history,
    )
    print(f"Mosscap:> {answer}")
    chat_history.add_user_message(user_input)
    chat_history.add_assistant_message(str(answer))
    return True


async def main() -> None:
    chatting = True
    while chatting:
        chatting = await chat()


if __name__ == "__main__":
    asyncio.run(main())
```
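The sample's control flow (loop in `main()` until `chat()` returns `False`, exiting on `"exit"`, `KeyboardInterrupt`, or `EOFError`) can be exercised without a kernel or API key. Below is a dependency-free sketch of that loop with input and the model reply stubbed out; the function names and the echo reply are illustrative, not what the MistralAI service would return:

```python
import asyncio
from typing import Callable


async def chat(get_input: Callable[[], str], reply: Callable[[str], None]) -> bool:
    """Stripped-down version of the sample's chat(): same exit conditions, stubbed model."""
    try:
        user_input = get_input()
    except (KeyboardInterrupt, EOFError):
        return False
    if user_input == "exit":
        return False
    # Stand-in for the kernel invocation: just echo the input back.
    reply(f"Mosscap:> (echo) {user_input}")
    return True


async def main(inputs: list[str]) -> list[str]:
    """Drives chat() until it signals exit, collecting replies."""
    replies: list[str] = []
    it = iter(inputs)
    chatting = True
    while chatting:
        chatting = await chat(lambda: next(it), replies.append)
    return replies


# Two turns, then "exit" ends the loop.
out = asyncio.run(main(["hi", "hotels in Seattle", "exit"]))
```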
python/semantic_kernel/connectors/ai/mistral_ai/__init__.py (11 additions, 0 deletions)
```python
# Copyright (c) Microsoft. All rights reserved.

from semantic_kernel.connectors.ai.mistral_ai.prompt_execution_settings.mistral_ai_prompt_execution_settings import (
    MistralAIChatPromptExecutionSettings,
)
from semantic_kernel.connectors.ai.mistral_ai.services.mistral_ai_chat_completion import MistralAIChatCompletion

__all__ = [
    "MistralAIChatCompletion",
    "MistralAIChatPromptExecutionSettings",
]
```
...onnectors/ai/mistral_ai/prompt_execution_settings/mistral_ai_prompt_execution_settings.py (38 additions, 0 deletions)
```python
# Copyright (c) Microsoft. All rights reserved.

import logging
from typing import Any, Literal

from pydantic import Field, model_validator

from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings

logger = logging.getLogger(__name__)


class MistralAIPromptExecutionSettings(PromptExecutionSettings):
    """Common request settings for MistralAI services."""

    ai_model_id: str | None = Field(None, serialization_alias="model")


class MistralAIChatPromptExecutionSettings(MistralAIPromptExecutionSettings):
    """Specific settings for the Chat Completion endpoint."""

    response_format: dict[Literal["type"], Literal["text", "json_object"]] | None = None
    messages: list[dict[str, Any]] | None = None
    safe_mode: bool = False
    safe_prompt: bool = False
    max_tokens: int | None = Field(None, gt=0)
    seed: int | None = None
    temperature: float | None = Field(None, ge=0.0, le=2.0)
    top_p: float | None = Field(None, ge=0.0, le=1.0)
    random_seed: int | None = None

    @model_validator(mode="after")
    def check_function_call_behavior(self) -> "MistralAIChatPromptExecutionSettings":
        """Check if the user is requesting function call behavior."""
        if self.function_choice_behavior is not None:
            raise NotImplementedError("MistralAI does not support function call behavior.")

        return self
```