Conversation

@lizhenyun01 lizhenyun01 (Collaborator) commented Nov 19, 2025

Motivation

Add a flash_mask_attention attention backend. It takes effect only during the prefill stage and is enabled via export FD_ATTENTION_BACKEND=FLASH_MASK_ATTN.
A native unit test for the gqa_rope_write_cache operator is missing; we hope it can be added as a proposed follow-up task or supported later when resources allow.
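For context, here is a minimal sketch of how this kind of env-var-driven backend selection typically works. It is illustrative only, not FastDeploy's actual dispatch code: the names `resolve_attention_backend`, `_ATTENTION_BACKENDS`, `FlashMaskAttnBackend`, and the `APPEND_ATTN` default are assumptions; only the variable `FD_ATTENTION_BACKEND` and the value `FLASH_MASK_ATTN` come from this PR.

```python
import os

# Hypothetical registry mapping env-var values to backend classes.
# Only FLASH_MASK_ATTN is introduced by this PR; the rest are placeholders.
_ATTENTION_BACKENDS = {
    "FLASH_MASK_ATTN": "FlashMaskAttnBackend",  # prefill-only backend from this PR
    "APPEND_ATTN": "AppendAttnBackend",         # assumed default backend
}

def resolve_attention_backend() -> str:
    """Return the backend selected via FD_ATTENTION_BACKEND, else a default."""
    name = os.getenv("FD_ATTENTION_BACKEND", "APPEND_ATTN")
    try:
        return _ATTENTION_BACKENDS[name]
    except KeyError:
        raise ValueError(f"Unknown attention backend: {name!r}") from None

if __name__ == "__main__":
    os.environ["FD_ATTENTION_BACKEND"] = "FLASH_MASK_ATTN"
    print(resolve_attention_backend())  # FlashMaskAttnBackend
```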

Modifications

Usage or Command
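(The PR body leaves this section empty; as noted in the Motivation, the backend is enabled through an environment variable. Below is a minimal sketch of setting it from Python before engine initialization; the shell equivalent is the `export` line quoted above.)

```python
import os

# Equivalent to `export FD_ATTENTION_BACKEND=FLASH_MASK_ATTN` in the shell.
# Must be set before the FastDeploy engine is created; per the Motivation,
# the backend takes effect only during the prefill stage.
os.environ["FD_ATTENTION_BACKEND"] = "FLASH_MASK_ATTN"
```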

Accuracy Tests

Checklist

  • Add at least one tag in the PR title.
    • Tag list: [FDConfig], [APIServer], [Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]
    • You can add new tags based on the PR content, but the semantics must be clear.
  • Format your code and run pre-commit before committing.
  • Add unit tests. If no unit tests are added, please state the reason in this PR.
  • Provide accuracy results.
  • If the current PR targets a release branch, make sure it has first been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

paddle-bot bot commented Nov 19, 2025

Thanks for your contribution!

@lizhenyun01 lizhenyun01 changed the title [Feature] suppert flash_mask_attention backend [Feature] support flash_mask_attention backend Nov 20, 2025
codecov-commenter commented Nov 21, 2025

Codecov Report

❌ Patch coverage is 38.01653% with 75 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (develop@cb56d46). Learn more about missing BASE report.

Files with missing lines                                Patch %   Lines
...ecutor/layers/attention/flash_mask_attn_backend.py   36.11%    67 Missing and 2 partials ⚠️
...cutor/layers/attention/ops/flash_mask_attention.py   57.14%    3 Missing ⚠️
fastdeploy/platforms/cuda.py                             0.00%    2 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             develop    #5134   +/-   ##
==========================================
  Coverage           ?   59.86%           
==========================================
  Files              ?      319           
  Lines              ?    38899           
  Branches           ?     5857           
==========================================
  Hits               ?    23288           
  Misses             ?    13775           
  Partials           ?     1836           
Flag Coverage Δ
GPU 59.86% <38.01%> (?)

Flags with carried forward coverage won't be shown.

@carryyu carryyu (Collaborator) left a comment

LGTM

@yangjianfengo1 (Contributor) commented:
LGTM

@qingqing01 qingqing01 merged commit aba4fc6 into PaddlePaddle:develop Nov 28, 2025
13 of 18 checks passed
qingqing01 pushed a commit that referenced this pull request Nov 28, 2025
…5256)

* [Feature] suppert flash_mask_attention backend

* fix unittest

* clean code
@lizhenyun01 lizhenyun01 deleted the fa3 branch December 25, 2025 08:51