
[Bug] deepseek-v4-flash thinking mode returns 400 on multi-turn conversations: The reasoning_content in the thinking mode must be passed back to the API. #7826

@LWDJD

Description

What happened

After updating to v4.23.5, requests using the model ID "deepseek-v4-flash" return a 400 error. Downgrading to the old model ID "deepseek-reasoner" works normally.

How to reproduce

On v4.23.5, add a DeepSeek model provider in OpenAI format and select the model ID "deepseek-v4-flash". I added {"reasoning_effort":"max"} to the custom request body parameters, though this custom parameter probably has no effect on the bug.

AstrBot version, deployment method (e.g., Windows Docker Desktop deployment), provider used, and messaging platform used

v4.23.5
Linux Docker
DeepSeek(OpenAI API)
QQ Bot (official)

OS

Linux

Logs

[2026-04-27 07:18:06.512] [Core] [WARN] [v4.23.5] [runners.tool_loop_agent_runner:555]: Chat Model deepseek/deepseek-v4-flash request error: Error code: 400 - {'error': {'message': 'The reasoning_content in the thinking mode must be passed back to the API.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
Traceback (most recent call last):
  File "/AstrBot/astrbot/core/agent/runners/tool_loop_agent_runner.py", line 510, in _iter_llm_responses_with_fallback
    async for attempt in retrying:
  File "/usr/local/lib/python3.12/site-packages/tenacity/asyncio/__init__.py", line 170, in __anext__
    do = await self.iter(retry_state=self._retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/asyncio/__init__.py", line 157, in iter
    result = await action(retry_state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/_utils.py", line 111, in inner
    return call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/__init__.py", line 393, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/AstrBot/astrbot/core/agent/runners/tool_loop_agent_runner.py", line 514, in _iter_llm_responses_with_fallback
    async for resp in self._iter_llm_responses(
  File "/AstrBot/astrbot/core/agent/runners/tool_loop_agent_runner.py", line 477, in _iter_llm_responses
    yield await self.provider.text_chat(**payload)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/AstrBot/astrbot/core/provider/sources/openai_source.py", line 1165, in text_chat
    ) = await self._handle_api_error(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/AstrBot/astrbot/core/provider/sources/openai_source.py", line 1111, in _handle_api_error
    raise e
  File "/AstrBot/astrbot/core/provider/sources/openai_source.py", line 1153, in text_chat
    llm_response = await self._query(payloads, func_tool)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/AstrBot/astrbot/core/provider/sources/openai_source.py", line 572, in _query
    completion = await self.client.chat.completions.create(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2700, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1884, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1669, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'The reasoning_content in the thinking mode must be passed back to the API.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
[2026-04-27 07:18:06.532] [Core] [WARN] [v4.23.5] [runners.tool_loop_agent_runner:492]: Switched from deepseek/deepseek-v4-flash to fallback chat provider: deepseek/deepseek-reasoner
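For context (not the confirmed AstrBot fix): the error text suggests that this model's thinking mode expects the assistant's `reasoning_content` from earlier turns to be echoed back in subsequent requests, rather than stripped from history. A minimal sketch of building such a multi-turn payload; the field placement on the assistant message is an assumption based only on the error message, and `build_messages`/the history layout are hypothetical helpers for illustration:

```python
# Sketch: construct an OpenAI-format message list that keeps reasoning_content
# on assistant turns instead of dropping it. The "reasoning_content" field name
# comes from the 400 error message; the exact shape the API expects is assumed.

def build_messages(history):
    """Convert stored turns into chat messages, preserving reasoning_content."""
    messages = []
    for turn in history:
        msg = {"role": turn["role"], "content": turn["content"]}
        # Echo the model's prior thinking back only on assistant turns.
        if turn["role"] == "assistant" and turn.get("reasoning_content"):
            msg["reasoning_content"] = turn["reasoning_content"]
        messages.append(msg)
    return messages

history = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!",
     "reasoning_content": "Greeting; reply briefly."},
    {"role": "user", "content": "What is 2+2?"},
]
payload = {"model": "deepseek-v4-flash", "messages": build_messages(history)}
```

The resulting `payload` would then be passed to `client.chat.completions.create(**payload)`; note that "deepseek-reasoner" apparently handles missing `reasoning_content` differently, which would explain why the fallback provider succeeds.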

Are you willing to submit a PR?

  • Yes!

Metadata


    Labels

    • area:provider: The bug / feature is about AI Provider, Models, LLM Agent, LLM Agent Runner.
    • bug: Something isn't working
