[nfc] fix typo colossalai/pipeline tensor nn #3899

Merged
FrankLeeeee merged 19 commits into hpcaitech:main from digger-yu:patch1 on Jun 6, 2023
Conversation

Contributor

@digger-yu digger-yu commented Jun 5, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Not involved

📝 What does this PR do?

Fix typos in the colossalai/pipeline, colossalai/tensor, and colossalai/nn modules.
Modified files:
modified: colossalai/nn/optimizer/cpu_adam.py
modified: colossalai/nn/optimizer/hybrid_adam.py
modified: colossalai/pipeline/pipelinable.py
modified: colossalai/pipeline/rpc/_pipeline_base.py
modified: colossalai/pipeline/rpc/_pipeline_schedule.py
modified: colossalai/pipeline/utils.py
modified: colossalai/tensor/d_tensor/comm_spec.py
modified: colossalai/tensor/d_tensor/sharding_spec.py
modified: colossalai/tensor/param_op_hook.py
modified: colossalai/tensor/process_group.py
modified: colossalai/tensor/shape_consistency.py
modified: colossalai/tensor/sharding_spec.py
modified: colossalai/tensor/utils.py

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@FrankLeeeee FrankLeeeee merged commit 0e484e6 into hpcaitech:main Jun 6, 2023
@digger-yu digger-yu deleted the patch1 branch June 6, 2023 06:16
FrankLeeeee pushed a commit to FrankLeeeee/ColossalAI that referenced this pull request Jun 7, 2023
* fix typo colossalai/autochunk auto_parallel amp

* fix typo colossalai/auto_parallel nn utils etc.

* fix typo colossalai/auto_parallel autochunk fx/passes  etc.

* fix typo docs/

* change placememt_policy to placement_policy in docs/ and examples/

* fix typo colossalai/ applications/

* fix typo colossalai/cli fx kernel

* fix typo colossalai/nn

* revert change warmuped

* fix typo colossalai/pipeline tensor nn
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment