
releases : update ROCM, add gfx1200, gfx1201, gfx1151 #15972

Merged
slaren merged 4 commits into master from sl/gfx-1200-release
Sep 14, 2025

Conversation

@slaren (Member) commented Sep 13, 2025

Fixes #15106

@github-actions bot added the devops (improvements to build systems and github actions) label Sep 13, 2025
@slaren slaren merged commit 9ecb884 into master Sep 14, 2025
3 checks passed
@slaren slaren deleted the sl/gfx-1200-release branch September 14, 2025 09:22
@lcy0321 (Contributor) commented Sep 14, 2025

Hi experts,

This might be a noob question, but I believe this PR changes the HIP SDK to use ROCm 6.4.2 (as referenced here: https://www.amd.com/en/developer/resources/rocm-hub/hip-sdk.html).

However, we are still using the rocm-6.2.4 branch of rocWMMA:

git clone https://github.com/rocm/rocwmma --branch rocm-6.2.4 --depth 1

And we’re using rocm-6.1- as the cache key:

key: rocm-6.1-${{ runner.os }}-v1
restore-keys: |
rocm-6.1-${{ runner.os }}-

Is this expected?
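For context, a hedged sketch of how the cache key could be bumped so that the new SDK is actually installed rather than restored from a stale cache. The key names, version suffix, and cache path below are illustrative assumptions, not necessarily what the eventual fix uses:

```yaml
# Hypothetical actions/cache step: bump the key from rocm-6.1 to match
# the ROCm version the workflow now installs, so a cache miss forces a
# fresh SDK install. The path and "6.4" suffix are assumptions.
- name: Cache ROCm installation
  uses: actions/cache@v4
  with:
    path: C:\Program Files\AMD\ROCm
    key: rocm-6.4-${{ runner.os }}-v1
    restore-keys: |
      rocm-6.4-${{ runner.os }}-
```

Because `restore-keys` matches by prefix, leaving the old `rocm-6.1-` prefix in place would keep resurrecting the outdated cached install; changing the version in both `key` and `restore-keys` avoids that.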

@CISC (Member) commented Sep 14, 2025

By the way, I don't think the new SDK was actually installed, since the workflow restored the cached installation (I've deleted the cache on master now).
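Deleting a stale Actions cache can be done from the repository's Actions page or, as a sketch, with the `gh` CLI. The cache key below is illustrative; use the keys that `gh cache list` actually reports:

```shell
# List the caches on the repository, then delete the stale ROCm entry
# by key. The key shown here is an illustrative example.
gh cache list --repo ggml-org/llama.cpp
gh cache delete "rocm-6.1-Windows-v1" --repo ggml-org/llama.cpp
```

These commands require an authenticated `gh` session with write access to the repository, so they are not runnable as-is outside that context.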

@lcy0321 (Contributor) commented Sep 14, 2025

Hi @CISC,
I've created a PR to fix the cache keys: #15984.
Could you please take a look?

blime4 referenced this pull request in blime4/llama.cpp Feb 5, 2026
* releases : update ROCM, add gfx1200, gfx1201, gfx1151

* releases : set target to 13.3 for macos-x64

* add hipblaslt.dll to release

* add hipblaslt/library to release
Seunghhon pushed a commit to Seunghhon/llama.cpp that referenced this pull request Apr 26, 2026
* releases : update ROCM, add gfx1200, gfx1201, gfx1151

* releases : set target to 13.3 for macos-x64

* add hipblaslt.dll to release

* add hipblaslt/library to release



Development

Successfully merging this pull request may close these issues.

9070XT can't not use amd llama.cpp

5 participants