Multi modal (Just libraries) #1220
It looks like llama.cpp no longer offers …

Test run for CUDA 12.4 only started here: https://github.com/martindevans/LLamaSharp/actions/runs/16153517357. I'll PR that if it passes, and then hopefully this PR should compile too.

I've created a pull request to this branch (see here). It switches over to the CUDA 12.4 build.

Switched to CUDA 12.4 build only

I'm going to merge this and kick off a test build.

@martindevans, I will be working on the changes over the next few days.

The test run (https://github.com/SciSharp/LLamaSharp/actions/runs/16231321874) mostly passed; there was just a minor issue with file naming at the end. So we've got a working CUDA build for all your bits whenever they're ready :)

These are the changes to build and copy the right dynamic libraries, as a first step before introducing the code changes.

@martindevans, CUDA compilation on Windows doesn't work. If I understand the problem correctly, it requires CUDA 12.4. But I'm not sure, and I cannot test it on Windows.
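
The two comments above describe the first step: building llama.cpp with the CUDA backend and copying the resulting dynamic libraries. Below is a minimal sketch of what that step might look like, assuming llama.cpp's CMake build, a CUDA 12.4 toolkit on the Windows runner, and a hypothetical destination folder; the exact flags, file names, and layout are assumptions for illustration, not the commands this PR actually uses.

```bash
# Hedged sketch only: the flags, file names and destination layout below are
# assumptions for illustration, not the exact commands from this PR's CI.

# Configure llama.cpp as shared libraries with the CUDA backend enabled.
# GGML_CUDA is the CUDA switch in recent llama.cpp versions; on Windows it
# needs a matching CUDA toolkit (12.4 here) installed and visible to CMake.
cmake -B build -S llama.cpp \
      -DBUILD_SHARED_LIBS=ON \
      -DGGML_CUDA=ON

cmake --build build --config Release -j

# Copy the dynamic libraries the managed wrapper needs. Which files exist
# (llama, ggml, and the multimodal mtmd library this PR is about) and the
# runtimes/<rid>/native destination vary by llama.cpp version and packaging,
# so treat these names as placeholders.
DEST=LLama/runtimes/win-x64/native/cuda12
mkdir -p "$DEST"
cp build/bin/Release/llama.dll "$DEST"
cp build/bin/Release/ggml*.dll "$DEST"
cp build/bin/Release/mtmd.dll  "$DEST"
```

Nothing changes on the managed side at this stage; the aim is only to get the right native binaries built and copied into the place the packaging expects.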