
Add new Resampling enum to Pillow Image #8429

Closed
dynobo wants to merge 1 commit into python:master from dynobo:master

Conversation

@dynobo

@dynobo dynobo commented Jul 28, 2022

Should solve #8366.

(My first little PR here 🙈 )

@dynobo
Author

dynobo commented Jul 28, 2022

Oh, never mind. #8419 was faster and is more complete...

@dynobo dynobo closed this Jul 28, 2022
@github-actions
Contributor

Diff from mypy_primer, showing the effect of this PR on open source code:

vision (https://github.com/pytorch/vision)
- torchvision/transforms/_pil_constants.py:7: error: Module has no attribute "Resampling"  [attr-defined]
- torchvision/transforms/_pil_constants.py:8: error: Module has no attribute "Resampling"  [attr-defined]
- torchvision/transforms/_pil_constants.py:9: error: Module has no attribute "Resampling"  [attr-defined]
- torchvision/transforms/_pil_constants.py:10: error: Module has no attribute "Resampling"  [attr-defined]
+ torchvision/transforms/_pil_constants.py:17: error: Incompatible types in assignment (expression has type "Literal[3]", variable has type "Resampling")  [assignment]
+ torchvision/transforms/_pil_constants.py:18: error: Incompatible types in assignment (expression has type "Literal[2]", variable has type "Resampling")  [assignment]
+ torchvision/transforms/_pil_constants.py:19: error: Incompatible types in assignment (expression has type "Literal[0]", variable has type "Resampling")  [assignment]
+ torchvision/transforms/_pil_constants.py:20: error: Incompatible types in assignment (expression has type "Literal[2]", variable has type "Resampling")  [assignment]
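For context on the diff above: Pillow 9.1 introduced `PIL.Image.Resampling` as an `IntEnum` whose members carry the values of the old module-level constants, which is what the stub change models. A minimal sketch (the member names and values mirror Pillow's actual enum; the class here is a standalone stand-in, not the real `PIL` module):

```python
from enum import IntEnum


# Stand-in for PIL.Image.Resampling (Pillow 9.1+); values match the
# legacy module-level constants such as Image.NEAREST == 0.
class Resampling(IntEnum):
    NEAREST = 0
    LANCZOS = 1
    BILINEAR = 2
    BICUBIC = 3
    BOX = 4
    HAMMING = 5


# Because it is an IntEnum, members still compare equal to plain ints,
# so runtime code like torchvision's `BICUBIC = Image.BICUBIC` keeps
# working; mypy, however, now sees the right-hand side as type
# `Resampling` rather than a bare int literal, which produces the
# "Incompatible types in assignment" diagnostics shown in the diff.
print(Resampling.BICUBIC == 3)  # True
print(int(Resampling.BILINEAR))  # 2
```

This also explains the shape of the mypy_primer output: the old `attr-defined` errors disappear because `Resampling` now exists in the stub, and new `assignment` errors appear where downstream code annotated a variable as `Resampling` but assigned a plain integer literal.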

@hauntsaninja
Collaborator

Just a little too slow, but thank you for contributing! :-)
