🐛 Describe the bug
Running inference with a BLOOM model fails to load the actor checkpoint because of parameter size mismatches:

```
python inference.py --model_path ./actor_checkpoint_prompts.pt --pretrain bloom-560m --model bloom
```

```
size mismatch for transformer.ln_f.weight: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for transformer.ln_f.bias: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for lm_head.weight: copying a param with shape torch.Size([50257, 768]) from checkpoint, the shape in current model is torch.Size([250880, 1024]).
```
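The shapes reported from the checkpoint (hidden size 768, vocab size 50257) look like GPT-2's dimensions rather than bloom-560m's (hidden size 1024, vocab size 250880), which suggests the checkpoint was saved from a run that used a different `--pretrain`/`--model`. A minimal diagnostic sketch (the helper and reference table below are hypothetical, not part of the repo) for checking which base model a checkpoint's shapes match:

```python
# Hypothetical diagnostic: compare a checkpoint's parameter shapes
# against known base-model shapes to see which pretrain it came from.

def match_base_model(param_shapes, known_models):
    """Return the names of base models whose reference shapes all match."""
    matches = []
    for name, ref in known_models.items():
        if all(param_shapes.get(p) == s for p, s in ref.items()):
            matches.append(name)
    return matches

# Reference shapes taken from the error message above:
KNOWN = {
    "gpt2 (hidden 768, vocab 50257)": {
        "transformer.ln_f.weight": (768,),
        "lm_head.weight": (50257, 768),
    },
    "bloom-560m (hidden 1024, vocab 250880)": {
        "transformer.ln_f.weight": (1024,),
        "lm_head.weight": (250880, 1024),
    },
}

# In practice, build param_shapes from the checkpoint, e.g.:
#   import torch
#   state = torch.load("actor_checkpoint_prompts.pt", map_location="cpu")
#   param_shapes = {k: tuple(v.shape) for k, v in state.items()}
# Here we use the shapes from the traceback directly:
param_shapes = {
    "transformer.ln_f.weight": (768,),
    "lm_head.weight": (50257, 768),
}
print(match_base_model(param_shapes, KNOWN))
# → ['gpt2 (hidden 768, vocab 50257)']
```

If the checkpoint matches GPT-2, re-running inference with `--pretrain gpt2 --model gpt2` (or retraining the actor with the BLOOM pretrain) should resolve the mismatch.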
Environment
No response