
Fix is_torch_neuroncore_available#41105

Draft
cyyever wants to merge 5 commits into huggingface:main from cyyever:is_torch_neuroncore_available

Conversation

@cyyever
Contributor

@cyyever cyyever commented Sep 23, 2025

What does this PR do?

The argument is ignored.

@cyyever cyyever changed the title Use check_device of is_torch_neuroncore_available Fix is_torch_neuroncore_available Sep 23, 2025
@Rocketknight1
Member

Hmm, I'm not sure about this one! If you read the source for is_torch_xla_available, check_is_gpu overrides check_is_tpu, so there's no point setting both to True.
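To illustrate the override being discussed, here is a minimal sketch of the branching order described above. This is not the actual transformers source; the function name, the `device_type` parameter (standing in for a runtime device-type query such as torch_xla's), and the accepted device strings are assumptions for illustration only.

```python
# Hedged sketch: check_is_gpu is tested first, so when both flags are
# True the check_is_tpu branch is never reached -- i.e. check_is_gpu
# "overrides" check_is_tpu, as noted in the comment above.
def is_torch_xla_available_sketch(device_type, check_is_tpu=False, check_is_gpu=False):
    # device_type stands in for the XLA runtime's reported device kind
    if check_is_gpu:
        return device_type in ("GPU", "CUDA")
    elif check_is_tpu:  # only reached when check_is_gpu is False
        return device_type == "TPU"
    return True  # no device-kind check requested
```

With both flags set, a TPU device fails the GPU check and the TPU branch never runs, which is why setting both to True is pointless.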

@cyyever
Contributor Author

cyyever commented Sep 24, 2025

@Rocketknight1 OK, let's just set check_is_gpu.

@Rocketknight1
Member

I'm still uncertain, though. I don't know much about NeuronCore, so I don't know whether those devices type-check as GPUs, TPUs, or something else. That makes me a bit nervous about modifying this function!

@cyyever
Contributor Author

cyyever commented Sep 25, 2025

@Rocketknight1 Shall we delay it?
