
added attn and mlp bias #83

Closed
JRosenkranz wants to merge 3 commits into IBM:main from JRosenkranz:added_attn_mlp_bias

Conversation

@JRosenkranz
Collaborator

Motivation

The Calico models set the MLP and attention bias to true, but these values were hard-coded to false in the flash and paged Llama implementations. This change uses the config params introduced in huggingface/transformers#30031 to set those values properly.

Modifications

  • added `attention_bias` and `mlp_bias` to the config for the Flash and Paged Llama implementations (default is `False`)
  • set the bias in the attention and MLP layers to the config value (see the sketch below)
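
A minimal sketch of the intended wiring, assuming a Llama-style config object with `hidden_size` and `intermediate_size`. The class and attribute names here are illustrative, not the exact TGI code; `getattr` with a `False` default keeps older configs that lack these fields loading as before:

```python
import torch.nn as nn
import torch.nn.functional as F

class LlamaMLP(nn.Module):
    def __init__(self, config):
        super().__init__()
        # Bias comes from the config instead of being hard-coded to False.
        bias = getattr(config, "mlp_bias", False)
        self.gate_proj = nn.Linear(config.hidden_size, config.intermediate_size, bias=bias)
        self.up_proj = nn.Linear(config.hidden_size, config.intermediate_size, bias=bias)
        self.down_proj = nn.Linear(config.intermediate_size, config.hidden_size, bias=bias)

    def forward(self, x):
        # Standard Llama MLP: down(silu(gate(x)) * up(x))
        return self.down_proj(F.silu(self.gate_proj(x)) * self.up_proj(x))

class LlamaAttentionProjections(nn.Module):
    """Projection layers only, to show where attention_bias applies
    (head splitting and the attention kernel itself are omitted)."""
    def __init__(self, config):
        super().__init__()
        bias = getattr(config, "attention_bias", False)
        self.q_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=bias)
        self.k_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=bias)
        self.v_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=bias)
        self.o_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=bias)
```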

Result

Models whose checkpoints contain attention and MLP biases should now load properly; a quick way to check is sketched below.
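
As a hypothetical smoke test against the sketch above (the field values are illustrative of a Calico-style config, not taken from a real checkpoint):

```python
from types import SimpleNamespace

config = SimpleNamespace(
    hidden_size=4096,
    intermediate_size=11008,
    attention_bias=True,  # Calico-style models set these to True
    mlp_bias=True,
)

mlp = LlamaMLP(config)
attn = LlamaAttentionProjections(config)
# Bias tensors now exist, so checkpoint bias weights have somewhere to load into.
assert mlp.gate_proj.bias is not None
assert attn.q_proj.bias is not None
```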

Related Issues

NA

@JRosenkranz closed this May 6, 2024
Xaenalt pushed a commit to Xaenalt/text-generation-inference that referenced this pull request on Aug 1, 2024: [pull] main from IBM:main