
[Coloattention] update coloattention tests for checking outputs and gradients #4389

Merged
kurisusnowdeng merged 1 commit into hpcaitech:main from flybird11111:update-coloattention
Aug 9, 2023

Conversation

@flybird11111
Contributor

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

Summarize your work here.
If you have any plots/diagrams/screenshots/tables, please attach them here.

Update the ColoAttention tests to check outputs and gradients.
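The checking pattern this PR adds can be sketched as follows. This is a hedged, illustrative NumPy example, not the actual test code: the real tests in tests/test_utils/test_flash_attention.py presumably compare PyTorch tensors from the ColoAttention kernel against a reference attention. Here, two mathematically equivalent attention implementations (stable vs. naive softmax) stand in for the kernel and the reference, and both forward outputs and finite-difference gradients are compared within a tolerance.

```python
import numpy as np

def attention_ref(q, k, v):
    # Reference scaled dot-product attention: softmax(QK^T / sqrt(d)) V,
    # computed with a max-subtracted (numerically stable) softmax.
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return (e / e.sum(axis=-1, keepdims=True)) @ v

def attention_under_test(q, k, v):
    # Same attention with a plain (non-stabilized) softmax, standing in
    # for the optimized kernel whose outputs the tests validate.
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)
    w = np.exp(scores)
    return (w / w.sum(axis=-1, keepdims=True)) @ v

def numerical_grad(f, x, eps=1e-5):
    # Central finite differences of the scalar loss sum(f()) w.r.t. x,
    # mutating x in place and restoring each entry afterwards.
    g = np.zeros_like(x)
    flat, gflat = x.reshape(-1), g.reshape(-1)
    for i in range(flat.size):
        orig = flat[i]
        flat[i] = orig + eps
        hi = f().sum()
        flat[i] = orig - eps
        lo = f().sum()
        flat[i] = orig
        gflat[i] = (hi - lo) / (2 * eps)
    return g

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((2, 4, 8)) for _ in range(3))

# Forward outputs must agree within tolerance.
ref_out = attention_ref(q, k, v)
test_out = attention_under_test(q, k, v)
assert np.allclose(test_out, ref_out, rtol=1e-6, atol=1e-8)

# Gradients w.r.t. q (via finite differences) must also agree.
g_ref = numerical_grad(lambda: attention_ref(q, k, v), q)
g_test = numerical_grad(lambda: attention_under_test(q, k, v), q)
assert np.allclose(g_test, g_ref, rtol=1e-4, atol=1e-6)
```

The tolerances above are placeholders; in practice they depend on the dtype and the kernel under test (fp16/bf16 paths need much looser bounds than fp32).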

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

Comment thread tests/test_utils/test_flash_attention.py Outdated
@flybird11111 force-pushed the update-coloattention branch from ec0d302 to 240f064 on August 8, 2023 09:22
@github-actions
Contributor

github-actions Bot commented Aug 8, 2023

The code coverage for the changed files is 100%.

Name                                       Stmts   Miss  Cover
--------------------------------------------------------------
tests/test_utils/test_flash_attention.py     125      0   100%
--------------------------------------------------------------
TOTAL                                        125      0   100%

@flybird11111 force-pushed the update-coloattention branch from 35469f6 to 27075c4 on August 8, 2023 11:01
@github-actions
Contributor

github-actions Bot commented Aug 8, 2023

The code coverage for the changed files is 100%.

Name                                       Stmts   Miss  Cover
--------------------------------------------------------------
tests/test_utils/test_flash_attention.py     124      0   100%
--------------------------------------------------------------
TOTAL                                        124      0   100%

Comment thread requirements/requirements-test.txt Outdated
Comment thread requirements/requirements.txt Outdated
Comment thread tests/test_utils/test_flash_attention.py Outdated
@flybird11111 force-pushed the update-coloattention branch 2 times, most recently from 3d4265b to 1f1af9e on August 9, 2023 04:19
[shardformer] coloattention support flash attention 2

[coloattention] update coloattention tests of checking outputs and gradients

[coloattention] update coloattention tests of checking outputs and gradients, fix

[coloattention] fix
@flybird11111 force-pushed the update-coloattention branch from 1f1af9e to 4692b54 on August 9, 2023 05:37
@github-actions
Contributor

github-actions Bot commented Aug 9, 2023

The code coverage for the changed files is 100%.

Name                                       Stmts   Miss  Cover
--------------------------------------------------------------
tests/test_utils/test_flash_attention.py     120      0   100%
--------------------------------------------------------------
TOTAL                                        120      0   100%

1 similar comment

@kurisusnowdeng kurisusnowdeng merged commit 458ae33 into hpcaitech:main Aug 9, 2023
jamesthesnake added a commit to jamesthesnake/ColossalAI that referenced this pull request Aug 10, 2023
[kernel] updated unittests for coloattention (hpcaitech#4389)
@flybird11111 flybird11111 deleted the update-coloattention branch April 11, 2024 03:08