chore(deps): update dependency litellm to v1.61.15 #2
Open
dev-mend-for-github-com[bot] wants to merge 1 commit into master
Conversation
The bot force-pushed the branch repeatedly as the PR was kept up to date:
- 14fb826 → 1633d6a
- f9b7c63 → 1229c8b
- b32f484 → 0cdb841
- cc9d6a5 → aff3e4e
- 31e119f → 6a5f4b1
- 7d87e61 → f47804b
- d8909e2 → 5bd055d
- dce5a5c → 25b15e0
- 4eaff20 → b64df2b
- 45b4172 → a60ab53
- de0e0e7 → ccf3ccb
- ccf3ccb → 9966989
This PR contains the following updates:
litellm: `1.17.9` → `1.61.15`

By merging this PR, the below vulnerabilities will be automatically resolved:
Release Notes
BerriAI/litellm (litellm)
v1.61.7

What's Changed
- `return_citations` documentation by @miraclebakelaser in #8527
- `/bedrock/meta.llama3-3-70b-instruct-v1:0` tool calling support + cost tracking + base llm unit test for tool calling by @ishaan-jaff in #8545
- `/completions` route by @ishaan-jaff in #8551
- `x-litellm-attempted-fallbacks` in responses from litellm proxy by @ishaan-jaff in #8558

New Contributors
Full Changelog: BerriAI/litellm@v1.61.3...v1.61.7
Docker Run LiteLLM Proxy
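The run command itself was stripped from this scrape; a minimal sketch, assuming the standard LiteLLM proxy image and the `main-v1.61.7` tag matching this release:

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.61.7
```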
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.3

What's Changed
- `/models` and `/model_group/info` by @krrishdholakia in #8473
- `include_usage` for /completions requests + unit testing by @ishaan-jaff in #8484 (see the sketch below)
- `PerplexityChatConfig` - track correct OpenAI compatible params by @ishaan-jaff in #8496
- `-nightly` by @krrishdholakia in #8499
- `gemini-2.0-pro-exp-02-05` vertex ai model to cost map + new `bedrock/deepseek_r1/*` route by @krrishdholakia in #8525

Full Changelog: BerriAI/litellm@v1.61.1...v1.61.3
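A minimal sketch of the `include_usage` item above, assuming the OpenAI-style `stream_options` shape that litellm mirrors (the model name is illustrative):

```python
import litellm

# stream_options={"include_usage": True} asks for a final chunk carrying token
# usage when streaming; #8484 wires this up for /completions requests.
stream = litellm.completion(
    model="gpt-4o-mini",  # illustrative; any configured model works
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
    stream_options={"include_usage": True},
)
for chunk in stream:
    print(chunk)
```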
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.1

Compare Source
What's Changed
Full Changelog: BerriAI/litellm@v1.61.0...v1.61.1
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.0

What's Changed
- `/bedrock/invoke/` by @ishaan-jaff in #8397
- `/team/update`s in multi-instance deployments with Redis by @ishaan-jaff in #8440

New Contributors
Full Changelog: BerriAI/litellm@v1.60.8...v1.61.0
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.8

What's Changed
- `/cache/ping` + add timeout value and elapsed time on azure + http calls by @krrishdholakia in #8377
- `/bedrock/invoke` support for all Anthropic models by @ishaan-jaff in #8383

Full Changelog: BerriAI/litellm@v1.60.6...v1.60.8
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.6

Compare Source
What's Changed
- `choices=[]` by @ishaan-jaff in #8339
- `choices=[]` on llm responses by @ishaan-jaff in #8342

New Contributors
Full Changelog: BerriAI/litellm@v1.60.5...v1.60.6
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.5

Compare Source
What's Changed
- `BaseLLMHTTPHandler` class by @ishaan-jaff in #8290

New Contributors
Full Changelog: BerriAI/litellm@v1.60.4...v1.60.5
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.4

Compare Source
What's Changed
- `bedrock/nova` models + add util `litellm.supports_tool_choice` by @ishaan-jaff in #8264 (see the sketch below)
- `role` based access to proxy by @krrishdholakia in #8260

Full Changelog: BerriAI/litellm@v1.60.2...v1.60.4
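A minimal sketch of the `litellm.supports_tool_choice` util from #8264; the signature is assumed here to mirror `litellm.supports_function_calling`, and the model name is illustrative:

```python
import litellm

model = "bedrock/amazon.nova-lite-v1:0"  # illustrative model id

# Check whether the model accepts the tool_choice param before sending tools.
if litellm.supports_tool_choice(model=model):
    print(f"{model} accepts tool_choice")
else:
    print(f"{model} does not accept tool_choice; omit it from the request")
```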
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.2

Compare Source
What's Changed
- `sso_user_id` to LiteLLM_UserTable by @krrishdholakia in #8167
- `/vertex_ai/` was not detected as llm_api_route on pass through but `vertex-ai` was by @ishaan-jaff in #8186
- `mode` as list, fix valid keys error in pydantic, add more testing by @krrishdholakia in #8224

New Contributors
Full Changelog: BerriAI/litellm@v1.60.0...v1.60.2
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.0

What's Changed

Important Changes between v1.50.xx and v1.60.0
`def async_log_stream_event` and `def log_stream_event` are no longer supported for `CustomLogger`s (https://docs.litellm.ai/docs/observability/custom_callback). If you want to log stream events, use `def async_log_success_event` and `def log_success_event` for logging success stream events.
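A minimal migration sketch, assuming the `CustomLogger` interface from the custom-callback docs linked above (method signatures follow those docs):

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger

class MyLogger(CustomLogger):
    # log_stream_event / async_log_stream_event are gone as of v1.60.0;
    # move that logging into the success hooks below.
    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        print("success:", response_obj)

    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        print("async success:", response_obj)

litellm.callbacks = [MyLogger()]
```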
Known Issues

🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB
- `bedrock` models + show `end_user` by @ishaan-jaff in #8118
- `keyTeam.team_alias === "Default Team"` by @ishaan-jaff in #8122
- `LoggingCallbackManager` to append callbacks and ensure no duplicate callbacks are added by @ishaan-jaff in #8112
- `litellm.disable_no_log_param` param by @krrishdholakia in #8134
- `litellm.turn_off_message_logging=True` by @ishaan-jaff in #8156 (see the sketch below)
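A minimal sketch of the redaction flag from the last item, assuming the module-level setting named in the changelog (the model name is illustrative):

```python
import litellm

# Redact message and response content from logging callbacks (#8156).
litellm.turn_off_message_logging = True

resp = litellm.completion(
    model="gpt-4o-mini",  # illustrative; any configured model works
    messages=[{"role": "user", "content": "hi"}],
)
print(resp.choices[0].message.content)
```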
New Contributors

Full Changelog: BerriAI/litellm@v1.59.10...v1.60.0
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.59.10

Compare Source
What's Changed
- `model` param by @ishaan-jaff in #8105
- `bedrock/converse_like/<model>` route by @krrishdholakia in #8102

Full Changelog: BerriAI/litellm@v1.59.9...v1.59.10
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.59.9

Compare Source
What's Changed
- `metadata` param preview support + new `x-litellm-timeout` request header by @krrishdholakia in #8047 (see the sketch below)
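A minimal sketch of the new request header, assuming an OpenAI-compatible client pointed at a LiteLLM proxy (the base URL and key are placeholders):

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # placeholder: your LiteLLM proxy URL
    api_key="sk-1234",                 # placeholder: a proxy virtual key
)

# x-litellm-timeout (#8047) caps the request time, in seconds.
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hi"}],
    extra_headers={"x-litellm-timeout": "30"},
)
print(resp.choices[0].message.content)
```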
New Contributors

Full Changelog: BerriAI/litellm@v1.59.8...v1.59.9
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |