feat: Added examples for anthropic api call-chat and function calling #183
Conversation
Hello @dhruvj07, I'm Gemini Code Assist[^1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
Summary of Changes
This pull request adds two new example scripts for using the Javelin SDK with the Anthropic API. The first example, anthropic_api_function_calling.py, demonstrates how to use function calling with the Anthropic API. It defines a get_weather function and uses it to get the weather in Mumbai. The second example, javelin_anthropic_api_call.py, shows a basic API call to Anthropic's Claude model to get the three primary colors. Additionally, a minor change was made to javelin_azureopenai_univ_endpoint.py to include loading environment variables using load_dotenv().
Highlights
- **Function Calling Example:** Added `anthropic_api_function_calling.py` to demonstrate function calling with the Anthropic API, including defining a tool and using `input_schema`.
- **Basic API Call Example:** Added `javelin_anthropic_api_call.py` to show a basic API call to Anthropic's Claude model, including setting up headers and formatting the query body.
- **Environment Variable Loading:** Modified `javelin_azureopenai_univ_endpoint.py` to load environment variables using `load_dotenv()`.
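As a rough illustration of the Claude `messages` format the basic example targets, here is a minimal sketch. The model name, `max_tokens` value, and helper name are assumptions for illustration, not taken from the PR diff.

```python
# Minimal sketch of an Anthropic messages-format query body, as described
# in the Highlights above. Model name and max_tokens are assumptions; the
# actual values in javelin_anthropic_api_call.py may differ.
def build_claude_query_body(prompt: str) -> dict:
    return {
        "model": "claude-3-5-sonnet-20240620",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_claude_query_body("What are the three primary colors?")
```

A body shaped like this can be passed straight to the Javelin unified endpoint as `query_body`.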
Changelog
- `examples/anthropic/anthropic_api_function_calling.py`
  - Created a new example script to demonstrate function calling with the Anthropic API.
  - Configured the Javelin client with necessary API keys and headers.
  - Defined a `get_weather` tool with an `input_schema` for function calling.
  - Constructed a query body with messages, tools, and tool choice.
  - Called the Javelin unified endpoint to execute the function call.
- `examples/anthropic/javelin_anthropic_api_call.py`
  - Created a new example script to demonstrate a basic API call to Anthropic's Claude model.
  - Configured the Javelin client with necessary API keys and headers.
  - Formatted the query body to be compatible with Claude's `messages` format.
  - Called the Javelin unified endpoint to query the Claude model.
  - Added a helper function `print_response` to pretty-print the response.
- `examples/azure-openai/javelin_azureopenai_univ_endpoint.py`
  - Added `load_dotenv()` to load environment variables from a `.env` file.
  - Ensures that API keys and other configuration parameters can be set via environment variables.
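The `get_weather` tool described in the changelog can be sketched roughly as follows. This is a hedged illustration of Anthropic's tool-use format, not the PR's actual code; the schema fields, model name, and prompt wording are assumptions.

```python
# Sketch of a get_weather tool definition with an input_schema, following
# Anthropic's tool-use format. The exact schema in the PR may differ.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a given location.",
    "input_schema": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "City name, e.g. Mumbai",
            },
        },
        "required": ["location"],
    },
}

# Query body combining messages, tools, and tool choice, as the changelog
# describes. Values here are illustrative assumptions.
query_body = {
    "model": "claude-3-5-sonnet-20240620",
    "max_tokens": 1024,
    "tools": [get_weather_tool],
    "tool_choice": {"type": "auto"},
    "messages": [{"role": "user", "content": "What is the weather in Mumbai?"}],
}
```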
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.
You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
In code's vast domain,
Anthropic's call, a sweet refrain,
Functions now sing,
New examples bring,
A clearer path, again, again.
Footnotes
[^1]: Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.
Code Review
The pull request adds examples for Anthropic API calls, including chat and function calling. The examples seem well-structured and provide a good starting point for users looking to integrate with the Javelin SDK. However, there are a few areas that could be improved for clarity and maintainability.
Summary of Findings
- **Missing Error Handling:** The `anthropic_api_function_calling.py` example lacks explicit error handling for the API call. Adding a try-except block would make the example more robust.
- **Inconsistent Header Definitions:** The header definitions are repeated across multiple files. Consider creating a shared function or constant to avoid duplication.
- **Clarity of Comments:** Some comments could be more descriptive to explain the purpose of specific code sections, especially for those unfamiliar with the Javelin SDK.
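One way to act on the "Inconsistent Header Definitions" finding would be a small shared helper like the sketch below. The function itself is not part of the PR; the header names and values mirror those used in the examples, and the default model is an assumption.

```python
import os

# Hypothetical helper centralizing the header dict that the review notes is
# repeated across multiple example files. Not part of the PR itself.
def anthropic_headers(model: str = "claude-3-5-sonnet-20240620") -> dict:
    return {
        "Content-Type": "application/json",
        "x-javelin-route": "anthropic_univ",
        "x-javelin-model": model,
        "x-javelin-provider": "https://api.anthropic.com/v1",
        "x-api-key": os.getenv("ANTHROPIC_API_KEY"),
        "anthropic-version": "2023-06-01",
    }
```

Each example could then call `anthropic_headers()` instead of redeclaring the dict, so a header change only needs to be made in one place.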
Merge Readiness
The pull request introduces valuable examples for Anthropic API calls. However, addressing the missing error handling and inconsistent header definitions would significantly improve the code's robustness and maintainability. I recommend addressing these issues before merging. I am unable to approve this pull request, and users should have others review and approve this code before merging.
```python
# Call
response = client.query_unified_endpoint(
    provider_name="anthropic",
    endpoint_type="messages",
    query_body=query_body,
)
```
Consider adding a try-except block to handle potential exceptions during the API call. This will make the example more robust and provide better feedback in case of errors.
Suggested change:

```python
try:
    response = client.query_unified_endpoint(
        provider_name="anthropic",
        endpoint_type="messages",
        query_body=query_body,
    )
    print(response)
except Exception as e:
    print(f"Error calling Anthropic API: {e}")
```
```python
try:
    response = client.query_unified_endpoint(
        provider_name="anthropic",
        endpoint_type="messages",
        query_body=query_body,
    )
    print_response("Anthropic", response)
except Exception as e:
    print(f"Anthropic query failed: {str(e)}")
```
```python
custom_headers = {
    "Content-Type": "application/json",
    "x-javelin-route": "anthropic_univ",
    "x-javelin-model": "claude-3-5-sonnet-20240620",
    "x-javelin-provider": "https://api.anthropic.com/v1",
    "x-api-key": os.getenv("ANTHROPIC_API_KEY"),  # For Anthropic model
    "anthropic-version": "2023-06-01",
}
```
No description provided.