
[Bug] v4.23.6: chat models cannot be fetched #7886

@SARFF6

Description


What happened

After updating the AstrBot desktop app (downloaded on Windows 11) to version 4.23.6, neither the locally deployed chat models nor the DeepSeek model can be fetched or used.

Reproduce

Open the model provider page and click "Fetch model list".

AstrBot version, deployment method (e.g., Windows Docker Desktop deployment), provider used, and messaging platform used

Desktop client deployment, version 4.23.6; DeepSeek model; QQ Open Platform

OS

Windows

Logs

[2026-04-29 12:45:16.343] [Core] [INFO] [provider.manager:572]: 载入 openai_chat_completion(deepseek-v4-flash) 服务提供商 ...
[2026-04-29 12:45:16.344] [Core] [ERRO] [v4.23.6] [provider.manager:709]: 实例化 openai_chat_completion(deepseek-v4-flash) 提供商适配器失败:Invalid http_client argument; Expected an instance of httpx.AsyncClient but got <class 'httpx.AsyncClient'>
[2026-04-29 12:47:48.720] [Core] [INFO] [provider.manager:726]: providers in user's config: ['deepseek-v4-flash', 'openai/deepseek-v4-pro', 'ollama/moondream:latest', 'mxbai-embed-large:latest', 'openai/deepseek-chat', 'openai/deepseek-reasoner']
[2026-04-29 12:47:48.933] [Core] [INFO] [provider.manager:572]: 载入 openai_chat_completion(deepseek-v4-flash) 服务提供商 ...
[2026-04-29 12:47:48.934] [Core] [ERRO] [v4.23.6] [provider.manager:709]: 实例化 openai_chat_completion(deepseek-v4-flash) 提供商适配器失败:Invalid http_client argument; Expected an instance of httpx.AsyncClient but got <class 'httpx.AsyncClient'>
[2026-04-29 12:47:49.530] [Core] [INFO] [provider.manager:726]: providers in user's config: ['deepseek-v4-flash', 'openai/deepseek-v4-pro', 'ollama/moondream:latest', 'mxbai-embed-large:latest', 'openai/deepseek-chat', 'openai/deepseek-reasoner']
[2026-04-29 12:47:50.159] [Core] [INFO] [provider.manager:572]: 载入 openai_chat_completion(openai/deepseek-v4-pro) 服务提供商 ...
[2026-04-29 12:47:50.160] [Core] [ERRO] [v4.23.6] [provider.manager:709]: 实例化 openai_chat_completion(openai/deepseek-v4-pro) 提供商适配器失败:Invalid http_client argument; Expected an instance of httpx.AsyncClient but got <class 'httpx.AsyncClient'>
[2026-04-29 12:48:00.080] [Core] [INFO] [provider.manager:726]: providers in user's config: ['deepseek-v4-flash', 'openai/deepseek-v4-pro', 'ollama/moondream:latest', 'mxbai-embed-large:latest', 'openai/deepseek-chat', 'openai/deepseek-reasoner']
[2026-04-29 12:48:00.499] [Core] [INFO] [provider.manager:572]: 载入 openai_chat_completion(ollama/moondream:latest) 服务提供商 ...
[2026-04-29 12:48:00.500] [Core] [ERRO] [v4.23.6] [provider.manager:709]: 实例化 openai_chat_completion(ollama/moondream:latest) 提供商适配器失败:Invalid http_client argument; Expected an instance of httpx.AsyncClient but got <class 'httpx.AsyncClient'>
[2026-04-29 12:48:00.978] [Core] [INFO] [routes.config:743]: API call: /config/provider/check_one id=ollama/moondream:latest
[2026-04-29 12:48:00.979] [Core] [WARN] [v4.23.6] [routes.config:749]: Provider with id 'ollama/moondream:latest' not found in provider_manager.
[2026-04-29 12:48:05.108] [Core] [INFO] [routes.config:743]: API call: /config/provider/check_one id=mxbai-embed-large:latest
[2026-04-29 12:48:06.684] [Core] [INFO] [routes.config:706]: Provider mxbai-embed-large:latest (ID: mxbai-embed-large:latest) is available.
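The error in the log is notable: the provider rejects the `http_client` argument for not being an `httpx.AsyncClient`, yet reports that the object it received is `<class 'httpx.AsyncClient'>`. This pattern usually means two separate copies of the httpx module are loaded (for example, one bundled with the desktop build and one in site-packages), so the `isinstance` check compares two distinct class objects that merely share a name. A minimal sketch of that failure mode, using hypothetical stand-in classes rather than AstrBot's actual code:

```python
# Two classes with the same name are still distinct types: isinstance()
# compares class identity, not the class name. Loading a module twice
# (e.g. a vendored copy plus the site-packages copy) produces exactly
# this situation for httpx.AsyncClient.
ClientA = type("AsyncClient", (), {})  # stands in for the first httpx copy
ClientB = type("AsyncClient", (), {})  # stands in for the second copy

client = ClientA()
print(type(client).__name__)        # AsyncClient
print(isinstance(client, ClientB))  # False: different class object, same name
```

If this is the cause, comparing `httpx.__file__` as seen by the SDK and by AstrBot's provider manager (or reinstalling so only one httpx copy is importable) should confirm it; this is an assumption based on the log wording, not a verified diagnosis.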

Are you willing to submit a PR?

  • Yes!



    Labels

    area:provider (The bug / feature is about AI Provider, Models, LLM Agent, LLM Agent Runner.)
    bug (Something isn't working.)
