[WIP][TF] Fix t5 embeddings #14329
patrickvonplaten wants to merge 3 commits into huggingface:master
Conversation
| ) | ||
|
|
||
| if old_lm_head_decoder is not None and not is_input_output_equals: | ||
| if old_lm_head_decoder is not None and (not is_input_output_equals or not self.config.tie_word_embeddings): |
I don't think `is_input_output_equals` is enough to decide whether the input and output embeddings are tied or not -> it's mostly the `tie_word_embeddings` attribute that controls that
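To illustrate the point, here is a hypothetical, pure-Python sketch of the decision (not the actual transformers code; `DummyConfig` and `needs_separate_resize` are invented names): matching shapes alone don't prove the weights are tied, so the config flag must also be consulted.

```python
# Hypothetical sketch of the tying decision discussed above -- not the real
# transformers implementation. Matching input/output shapes do not prove the
# weights are tied; `config.tie_word_embeddings` is what controls tying.

class DummyConfig:
    """Stand-in for a transformers PretrainedConfig (invented for this sketch)."""
    def __init__(self, tie_word_embeddings=True):
        self.tie_word_embeddings = tie_word_embeddings

def needs_separate_resize(old_lm_head_decoder, is_input_output_equals, config):
    """True when the decoder lm head has to be resized on its own."""
    if old_lm_head_decoder is None:
        return False
    # Resize separately if shapes differ OR the config says weights are untied.
    return (not is_input_output_equals) or (not config.tie_word_embeddings)
```

With this, an untied config triggers a separate resize even when the shapes happen to match.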
```diff
         if old_lm_head_decoder is not None and (not is_input_output_equals or not self.config.tie_word_embeddings):
             old_embedding_dim = shape_list(old_lm_head_decoder)[1]
             decoder_mask, current_decoder = init_copy_embeddings(old_lm_head_decoder, new_num_tokens)
+            name = old_lm_head_decoder.name.split(":")[0] if not tf.executing_eagerly() else None
```
Can't pass `name` in eager mode, so it's disabled there
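A small pure-Python mimic of that guard (`variable_base_name` is an invented helper; the `"kernel:0"`-style names follow TF's graph-mode variable naming):

```python
def variable_base_name(full_name, eager):
    """Mimic of the guard in the diff: in graph mode, strip the ':<index>'
    suffix TF appends to variable names (e.g. 'lm_head/kernel:0'); in eager
    mode no explicit name is passed, so return None instead."""
    return full_name.split(":")[0] if not eager else None
```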
```diff
         return embeds

         # if embedding_layer is already a `tf.Tensor` simply output it
         if isinstance(embedding_layer, tf.Tensor):
```
`embedding_layer` can be a tensor for T5 I think -> so we just return it here?
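A toy illustration of that early return (all names invented; a plain list stands in for `tf.Tensor` so the sketch runs without TensorFlow):

```python
class FakeEmbeddingLayer:
    """Stand-in for a Keras embedding layer exposing its weight matrix."""
    def __init__(self, weight):
        self.weight = weight

def get_embedding_weight(embedding_layer):
    # In the real code the check is `isinstance(embedding_layer, tf.Tensor)`:
    # if we were already handed the raw weight tensor, return it directly
    # instead of looking for a `.weight` attribute.
    if isinstance(embedding_layer, list):
        return embedding_layer
    return embedding_layer.weight
```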
…into fix_tf_mt5_resize_word_embeddings
sgugger left a comment
Thanks for fixing! Is the method `value` or `values`? Both are used in the diff.
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Superseded by: #15567
What does this PR do?
This PR is a first attempt to fix #13839. In short, it lets T5 models whose input and output embeddings are not tied resize their embeddings.
Overall, this whole TF embedding-resizing logic is incredibly complex and hard to read... IMO we should do a bigger refactor here.
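For context, resizing an embedding matrix boils down to building a matrix with the new vocabulary size and copying the overlapping rows over, which is roughly what `init_copy_embeddings` does in the diff. A minimal NumPy sketch, not the transformers implementation (`resize_embeddings` is an invented helper):

```python
import numpy as np

def resize_embeddings(old_weights, new_num_tokens, seed=0):
    """Copy the overlapping rows of the old embedding matrix into a new one
    and randomly initialize any extra rows (hypothetical helper)."""
    old_num_tokens, dim = old_weights.shape
    rng = np.random.default_rng(seed)
    # New rows get a small random init; overlapping rows are copied verbatim.
    new_weights = rng.normal(scale=0.02, size=(new_num_tokens, dim))
    n_copy = min(old_num_tokens, new_num_tokens)
    new_weights[:n_copy] = old_weights[:n_copy]
    return new_weights
```

When the embeddings are untied, the input embedding and the lm head each need this treatment separately, which is what the diff above enables.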
TODO:
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.