Native support of Windows #16124
Replies: 3 comments
Sure. CUDA support for Windows (via the AOTI backend) is actively being worked on. @JacobSzwejbka or @Gasoonjia might be able to provide more details. For Vulkan, we don't currently cover it in CI but it might be buildable. We can likely look at adding CI coverage and closing any gaps. CC @SS-JIA |
There is experimental support for some LLM architectures (Whisper, Voxtral, Gemma) using the CUDA backend. I'm writing a README for the Windows path right now (it's really similar to Linux, though). The CUDA backend currently does not support partitioning, so it requires your model to be totally lowerable to AOTI (basically, don't include any ops that AOTI doesn't know how to handle), which makes it really fragile to use. We are actively working on fixing this.
@larryliu0820 has a WIP PR trying out our Vulkan delegate on desktop, though I think he was working with Linux rather than Windows. We also aren't expecting perf to be very good right now, since historically all dev work on our Vulkan backend has been focused on mobile GPUs.
Can the Vulkan or CUDA backends be used on Windows now (without WSL)? If not, when will they be supported?