[docs] change placememt_policy to placement_policy #3829

Merged
FrankLeeeee merged 10 commits into hpcaitech:main from digger-yu:patch1 on May 24, 2023
Conversation

@digger-yu (Contributor) commented May 24, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Not involved

📝 What does this PR do?

"change placememt_policy to placement_policy in docs/ and examples/
detail info
modified: docs/source/en/advanced_tutorials/parallelize_your_training_like_Megatron.md
modified: docs/source/en/features/zero_with_chunk.md
modified: docs/source/zh-Hans/advanced_tutorials/parallelize_your_training_like_Megatron.md
modified: docs/source/zh-Hans/features/zero_with_chunk.md
modified: examples/images/dreambooth/train_dreambooth_colossalai.py
modified: examples/images/dreambooth/train_dreambooth_colossalai_lora.py
modified: examples/language/palm/train.py
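
For context, here is a minimal sketch of the kind of call site affected by the rename. The helper function and its arguments below are illustrative assumptions modeled on the dreambooth example scripts, not a verbatim copy of the modified files:

```python
import torch
from colossalai.nn.parallel import GeminiDDP
from colossalai.utils import get_current_device

# Hypothetical helper: before this PR, docs and examples spelled the keyword
# `placememt_policy`; after this PR they consistently use `placement_policy`.
def wrap_with_gemini(model: torch.nn.Module, placement_policy: str = "auto"):
    # Gemini ZeRO DDP places parameters/chunks according to the chosen policy
    # (e.g. "cpu", "cuda", or "auto").
    return GeminiDDP(
        model,
        device=get_current_device(),
        placement_policy=placement_policy,
        pin_memory=True,
    )
```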

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@FrankLeeeee FrankLeeeee merged commit 518b31c into hpcaitech:main May 24, 2023
@digger-yu digger-yu deleted the patch1 branch May 24, 2023 07:03
FrankLeeeee pushed a commit to FrankLeeeee/ColossalAI that referenced this pull request Jun 7, 2023
* fix typo colossalai/autochunk auto_parallel amp

* fix typo colossalai/auto_parallel nn utils etc.

* fix typo colossalai/auto_parallel autochunk fx/passes  etc.

* fix typo docs/

* change placememt_policy to placement_policy in docs/ and examples/