🐛 Describe the bug
Llama2 training with HybridParallelPlugin fails when pp_size > 1.
We modified the Llama training example to use the HybridParallelPlugin but encountered the following error:
`hidden_states` is `None` at `colossalai/shardformer/modeling/llama.py:61` in `llama_model_forward`.

Environment
torch 1.13.1+cu117, transformers 4.32.0, ColossalAI release v0.3.2