
Fix layout convert caching#5515

Merged
Edenzzzz merged 2 commits into hpcaitech:main from Edenzzzz:fix_layout_convert
Mar 26, 2024

Conversation

Contributor

@Edenzzzz Edenzzzz commented Mar 26, 2024

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

📝 What does this PR do?

The old layout caching only checks whether the process group still exists, regardless of the final sharded shape. As a result, calling a sharding function on tensors with different shapes incorrectly returns the same cached sharded shape.
This PR adds a sharded-shape check to the cache key.

If you have any plots/diagrams/screenshots/tables, please attach them here.

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@Edenzzzz Edenzzzz requested a review from a team as a code owner March 26, 2024 11:03
@Edenzzzz Edenzzzz force-pushed the fix_layout_convert branch from 233642d to 61da3fb on March 26, 2024 11:06
@Edenzzzz Edenzzzz merged commit 9a3321e into hpcaitech:main Mar 26, 2024
@Edenzzzz Edenzzzz deleted the fix_layout_convert branch April 15, 2024 07:51

Labels

None yet

Projects

None yet


2 participants