
[BUG]: ChatGPT inference still gives error #3192

@sdvies

Description

🐛 Describe the bug

(Referring to bug #3061.)

I've regenerated actor_checkpoint_prompts.pt with the new version of the train_prompts.py code, and I'm still getting the same error:

RuntimeError: Error(s) in loading state_dict for BloomForCausalLM:
	Unexpected key(s) in state_dict: "transformer.h.2.input_layernorm.weight", "transformer.h.2.input_layernorm.bias", "transformer.h.2.self_attention.query_key_value.weight", "transformer.h.2.self_attention.query_key_value.bias", "transformer.h.2.self_attention.query_key_value.lora_A", "transformer.h.2.self_attention.query_key_value.lora_B" ...
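For what it's worth, the lora_A / lora_B entries among the unexpected keys suggest the checkpoint was saved from the LoRA-wrapped actor, while a plain BloomForCausalLM.load_state_dict (with the default strict=True) only accepts base-model keys. A minimal sketch of one possible workaround, filtering the LoRA-specific entries out of the state dict before loading (the helper name is mine, and the keys below are illustrative, modeled on the error message; this alone may not resolve the other unexpected keys):

```python
def strip_lora_keys(state_dict):
    """Return a copy of state_dict without LoRA-specific entries
    (keys ending in lora_A / lora_B), so the remaining keys can be
    loaded into a plain, non-LoRA model.  Caveat: dropping these keys
    discards the low-rank updates unless they were merged into the
    base weights before saving."""
    return {
        k: v for k, v in state_dict.items()
        if not k.endswith(("lora_A", "lora_B"))
    }

# Illustrative keys, modeled on the error message above:
ckpt = {
    "transformer.h.2.input_layernorm.weight": 0,
    "transformer.h.2.self_attention.query_key_value.weight": 0,
    "transformer.h.2.self_attention.query_key_value.lora_A": 0,
    "transformer.h.2.self_attention.query_key_value.lora_B": 0,
}
clean = strip_lora_keys(ckpt)  # only the two non-LoRA keys remain
```

Alternatively, model.load_state_dict(state_dict, strict=False) will ignore the extra keys, but either way the LoRA updates are silently lost unless they are merged into the base weights first.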

Thank you!

Environment

No response

Labels

bug (Something isn't working)
