Nit about model_accepts_loss_kwargs for loss #35113
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
```python
import inspect

from transformers import Wav2Vec2Model

sig = inspect.signature(Wav2Vec2Model.forward)
any(p.kind == inspect.Parameter.VAR_KEYWORD for p in sig.parameters.values())
```

This returns `False`. But in the torch example, `unwrapped_model.forward` seems to have `loss_kwargs`.
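The probe above can be reproduced without transformers at all. A minimal stdlib-only sketch (the two `forward` signatures below are made up for illustration; only the `inspect.Parameter.VAR_KEYWORD` check mirrors the snippet):

```python
import inspect


def forward_no_kwargs(input_values, attention_mask=None):
    """A forward signature without **kwargs, like the Wav2Vec2Model case above."""


def forward_with_kwargs(input_ids, labels=None, **loss_kwargs):
    """A forward signature that can swallow extra loss kwargs."""


def accepts_var_keyword(fn):
    # True if the callable declares a **kwargs parameter (VAR_KEYWORD kind)
    return any(
        p.kind == inspect.Parameter.VAR_KEYWORD
        for p in inspect.signature(fn).parameters.values()
    )


print(accepts_var_keyword(forward_no_kwargs))    # False
print(accepts_var_keyword(forward_with_kwargs))  # True
```

So whether `loss_kwargs` can be forwarded depends entirely on whether the model's `forward` declares `**kwargs`, which is exactly what the check in the snippet tests.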
This PR still confuses me, as I mentioned in the previous PR.
Hey!
No, even if the user has a self-defined loss function, if the
@ArthurZucker I disagree with 2: it was explicitly designed that way to catch the case where the model doesn't have this figured out yet, so the user can/should manually do it themselves instead.
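The fallback being defended could be sketched roughly like this (all names here are hypothetical; the real Trainer logic differs):

```python
import inspect


class DummyModel:
    def forward(self, input_ids, labels=None):
        # No **kwargs: this model has not adopted loss kwargs yet
        return sum(input_ids)


def accepts_loss_kwargs(model):
    # The probe from the discussion: does forward declare **kwargs?
    return any(
        p.kind == inspect.Parameter.VAR_KEYWORD
        for p in inspect.signature(model.forward).parameters.values()
    )


def training_loss(model, batch, user_loss_func=None):
    """Hypothetical dispatch: when the model cannot accept loss kwargs,
    fall back to the user-supplied loss function instead of failing."""
    outputs = model.forward(**batch)
    if not accepts_loss_kwargs(model) and user_loss_func is not None:
        return user_loss_func(outputs, batch.get("labels"))
    return outputs


loss = training_loss(DummyModel(), {"input_ids": [1, 2, 3]}, lambda o, l: o * 2)
print(loss)  # 12: forward sums to 6, then the user loss doubles it
```

The point of the design, on this reading, is that the signature check makes the "model not ready yet" case detectable, so the user path stays available.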
What does this PR do?
There was a typo in #34915; since the tests were passing, I did not pay attention to it. Thanks to @techkang for reporting it.