
[autoparallel] Patch meta information of torch.where#2822

Merged
YuliangLiu0306 merged 63 commits into hpcaitech:main from Cypher30:feature/where_meta_info
Feb 22, 2023

Conversation

@Cypher30
Contributor

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234
Resolved #2800

📝 What does this PR do?

In this PR, I patch the meta information of torch.where.
The test is skipped on CI, so I attach the test results on torch 1.12.0 here:
[Screenshot: test results on torch 1.12.0, 2023-02-19 15:53]
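
For context, torch.where(condition, x, y) selects elementwise between x and y, and all three inputs are broadcast together, so the meta information a profiler needs (output shape and dtype) follows standard broadcasting rules. The sketch below illustrates that behavior in plain PyTorch; it is not the registry code added by this PR (which lives in colossalai/auto_parallel/meta_profiler/meta_registry/where.py), only the semantics being modeled.

import torch

# torch.where(cond, x, y) picks elements from x where cond is True and
# from y elsewhere; the three inputs are broadcast together, so the
# output shape and dtype are fully determined by the input shapes.
cond = torch.tensor([[True], [False], [True], [True]])  # shape (4, 1)
x = torch.ones(4, 8)                                    # shape (4, 8)
y = torch.zeros(1, 8)                                   # shape (1, 8)

out = torch.where(cond, x, y)
print(out.shape)  # torch.Size([4, 8])
print(out.dtype)  # torch.float32

On PyTorch builds where torch.where has a meta kernel, the same shape can be inferred on the "meta" device without allocating real memory, which is the kind of information a meta profiler records.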

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

Cypher30 and others added 30 commits July 14, 2022 16:07
@Cypher30 Cypher30 added Run Build and Test auto-parallel related to the auto-parallel feature labels Feb 19, 2023
@github-actions
Contributor

Your pre-commit check failed. Follow the steps below to run pre-commit on your files for code style consistency.

  1. Install pre-commit via "pip install pre-commit"
  2. Install the pre-commit hooks via "pre-commit install"
  3. Run pre-commit on the files with format errors via "pre-commit run --files path", replacing "path" with the actual file path
  4. Commit and push to your branch

View your job at https://github.com/hpcaitech/ColossalAI/actions/runs/4222899572.
Read our "CONTRIBUTING.md" for more details on the code style.

1 similar comment

@github-actions
Contributor

The code coverage for the changed files is 42%.

Complete report:
Name                                                                              Stmts   Miss  Cover
-----------------------------------------------------------------------------------------------------
colossalai/auto_parallel/meta_profiler/meta_registry/__init__.py                      9      0   100%
colossalai/auto_parallel/meta_profiler/meta_registry/where.py                        25     16    36%
tests/test_auto_parallel/test_tensor_shard/test_metainfo/test_where_metainfo.py      51     33    35%
-----------------------------------------------------------------------------------------------------
TOTAL                                                                                85     49    42%

@YuliangLiu0306 YuliangLiu0306 merged commit c7764d3 into hpcaitech:main Feb 22, 2023

Labels

auto-parallel related to the auto-parallel feature

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[FEATURE] Patch meta information of torch.where()

2 participants