
FIX Prefix tuning after transformers PR 38635 (#2662)

Merged
BenjaminBossan merged 2 commits into huggingface:main from
BenjaminBossan:fix-prefix-tuning-after-transformers-38635
Jul 22, 2025

Conversation


@BenjaminBossan BenjaminBossan commented Jul 22, 2025

Due to huggingface/transformers#38635, several tests involving prefix tuning broke.

This PR fixes this by resolving two issues:

  1. The _supports_cache_class attribute was removed; we can now assume that it is True if the attribute does not exist.

  2. We had special handling of past_key_values for GPTBigCodeForCausalLM which is no longer required (nor valid) after that PR, so it is now removed depending on the installed transformers version.
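Point 1 amounts to treating a missing attribute as True. A minimal sketch of that check (the helper name and demo classes are hypothetical, not peft's actual code):

```python
# Hypothetical helper sketching point 1: treat a missing
# `_supports_cache_class` attribute as cache-class support.
def supports_cache_class(model) -> bool:
    # Pre-38635 transformers models set this attribute explicitly;
    # after the attribute's removal, absence means "supported".
    return getattr(model, "_supports_cache_class", True)


class OldStyleModel:
    _supports_cache_class = False  # pre-38635 model that opts out


class NewStyleModel:
    pass  # post-38635 model: the attribute no longer exists
```

With this, `supports_cache_class(NewStyleModel())` returns True even though the class never defines the attribute.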

The failing CI run:

https://github.com/huggingface/peft/actions/runs/16417140904/job/46385751329
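Point 2 requires branching on the installed transformers version. The gate could look roughly like this; the helper names and the "4.54.0" cutoff are assumptions for illustration, not peft's actual code:

```python
# Sketch of point 2: only apply the GPTBigCodeForCausalLM
# past_key_values workaround on transformers releases that
# predate the change from PR 38635.
def parse_version(v: str) -> tuple:
    # Minimal parser for plain "X.Y.Z" version strings.
    return tuple(int(part) for part in v.split(".")[:3])


# Hypothetical first release containing the PR.
PR_38635_CUTOFF = parse_version("4.54.0")


def needs_gpt_bigcode_workaround(transformers_version: str) -> bool:
    # Older releases still require (and accept) the special
    # past_key_values handling; newer ones reject it.
    return parse_version(transformers_version) < PR_38635_CUTOFF
```

In real code one would compare `transformers.__version__` with a proper version parser such as `packaging.version` rather than this minimal tuple comparison.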
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Member

@zucchini-nlp zucchini-nlp left a comment


Nicee

Collaborator

@githubnemo githubnemo left a comment


Thanks :)

@BenjaminBossan BenjaminBossan merged commit e779245 into huggingface:main Jul 22, 2025
12 of 14 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-prefix-tuning-after-transformers-38635 branch July 22, 2025 11:59
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Jul 28, 2025
cyyever pushed a commit to cyyever/peft that referenced this pull request Sep 4, 2025


4 participants