🐛 Describe the bug
LlamaRM is not a HuggingFace transformer module but a LoRAModule, while LlamaRM.model is the underlying HuggingFace transformer model. As a result, LlamaRM has no "resize_token_embeddings" method, but LlamaRM.model does. When using Llama to train an RM model, this raises: AttributeError: 'LlamaRM' object has no attribute 'resize_token_embeddings'
The error occurs at line 68 of coati/utils/tokenizer_utils.py.
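A minimal sketch of a possible workaround, assuming the resize call at line 68 receives the wrapper object. The helper name resize_embeddings is hypothetical and not the actual function in tokenizer_utils.py; the idea is simply to unwrap the LoRA module before calling resize_token_embeddings:

```python
from transformers import PreTrainedModel


def resize_embeddings(model, tokenizer) -> None:
    """Resize token embeddings on the underlying HuggingFace model.

    LoRA wrappers such as LlamaRM expose the HuggingFace model via `.model`,
    so unwrap first when the object itself is not a PreTrainedModel.
    (Hypothetical helper for illustration only.)
    """
    hf_model = model if isinstance(model, PreTrainedModel) else model.model
    hf_model.resize_token_embeddings(len(tokenizer))
```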
Environment
No response