Fix pipeline tests - don't set torch_dtype on non-torch pipelines #31809
amyeroberts wants to merge 1 commit into huggingface:main from
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hi @amyeroberts Thanks for this.

But if we look at the definition of the `torch_dtype` property:

```python
@property
def torch_dtype(self) -> Optional["torch.dtype"]:
    """
    Torch dtype of the model (if it's Pytorch model), `None` otherwise.
    """
    return getattr(self.model, "dtype", None)
```

we are not actually testing properly with fp32/fp16 as we expect. (I have to verify to make sure.) This is important to address, either in this PR or another.
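For context, a minimal sketch of what that means in practice (the tiny checkpoint name is an assumption for illustration, not part of the PR): because `torch_dtype` just reads the model's dtype, it only tells us what dtype the model actually ended up in, so if the requested dtype is never forwarded the pipeline silently stays in float32.

```python
import torch
from transformers import pipeline

# Assumed tiny checkpoint, for illustration only.
pipe = pipeline(
    "text-classification",
    model="hf-internal-testing/tiny-random-bert",
    torch_dtype=torch.float16,
)

# Pipeline.torch_dtype is just getattr(self.model, "dtype", None), so it reports
# whatever dtype the model was actually loaded in, not what the test requested.
print(pipe.torch_dtype)  # torch.float16 only if the requested dtype was forwarded
print(pipe.model.dtype)  # same value; the property is a passthrough
```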
```diff
     supported_frameworks = ["pt", "tf"]

-    def run_task_tests(self, task, torch_dtype="float32"):
+    def run_task_tests(self, task, torch_dtype=None):
```
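To make the intent of the `None` default concrete, here is a minimal sketch (the helper name `build_pipeline_kwargs` is hypothetical, not code from this PR) of only forwarding `torch_dtype` to PyTorch-backed pipelines:

```python
def build_pipeline_kwargs(framework: str, torch_dtype=None) -> dict:
    # Hypothetical helper: torch_dtype only makes sense for PyTorch-backed
    # pipelines, so TensorFlow ("tf") pipelines never receive it.
    kwargs = {}
    if framework == "pt" and torch_dtype is not None:
        kwargs["torch_dtype"] = torch_dtype
    return kwargs


# With the new default of None, calling run_task_tests without an explicit
# dtype adds nothing for either framework:
assert build_pipeline_kwargs("tf", None) == {}
assert build_pipeline_kwargs("pt", "float16") == {"torch_dtype": "float16"}
```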
Yes, the changes (themselves) make more sense. But see my overall comments.
I confirm the 2nd point in the above comment: we are not testing the torch_dtype properly. Inspecting the pipeline in the debugger shows:

```
(Pdb) pipe.model.dtype
torch.float32
```
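A hedged sketch of the kind of explicit assertion that would catch this silently-wrong dtype (the helper name and its use in the test suite are assumptions, not part of this PR):

```python
def assert_model_dtype(pipe, requested):
    # Hypothetical check: skip when no dtype was requested or the pipeline is
    # not PyTorch-backed; otherwise fail if the model is not in that dtype.
    if requested is None or pipe.framework != "pt":
        return
    actual = getattr(pipe.model, "dtype", None)
    assert actual == requested, f"pipeline model is in {actual}, expected {requested}"
```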
What does this PR do?
Fixes errors introduced by the merge of #31342
Example failing run: https://app.circleci.com/pipelines/github/huggingface/transformers/97170/workflows/fd23720b-5884-42d5-8877-e69afcdeda34/jobs/1285703