
Move AbstractGPUArrayStyle to GPUArraysCore? #417

Merged
maleadt merged 4 commits into JuliaGPU:master from mcabbott:style on Jul 25, 2022

Conversation

@mcabbott
Contributor

RFC, I guess. But it would be useful to be able to dispatch on this, as for example here:

https://github.com/FluxML/Zygote.jl/blob/master/src/lib/broadcast.jl#L270
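For illustration, a minimal sketch of the kind of dispatch this would enable (not Zygote's actual code; the method names here are hypothetical): with AbstractGPUArrayStyle living in GPUArraysCore, a lightweight package can specialize on GPU broadcasts without depending on the full GPUArrays.

```julia
using Base.Broadcast: Broadcasted
using GPUArraysCore: AbstractGPUArrayStyle

# Hypothetical helper: pick a code path based on the broadcast style.
# Any GPU backend's style subtypes AbstractGPUArrayStyle, so one method
# covers CUDA, Metal, etc., with only a GPUArraysCore dependency.
handle(::Broadcasted{<:AbstractGPUArrayStyle}) = :gpu_path
handle(::Broadcasted) = :generic_path
```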

What order to tag things in may need careful thought. Does the main package need an upper bound on the core package in order not to create a clash, or can that be avoided somehow?

@maleadt
Member

maleadt commented Jul 15, 2022

> What order to tag things in may need careful thought. Does the main package need an upper bound on the core package in order not to create a clash, or can that be avoided somehow?

We're using @reexport, so I guess that could result in clashes. I wonder whether we should instead import the Core package and re-export its names manually, to avoid this in the future.
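A manual re-export might look like the following sketch (the exact list of names is illustrative, not the package's real export list):

```julia
module GPUArrays

# Import specific names from the Core package and re-export them explicitly,
# rather than `@reexport using GPUArraysCore`. Each name then has a single
# owning module, so a definition later moved into GPUArraysCore cannot
# clash with a leftover one here.
import GPUArraysCore: AbstractGPUArray, AbstractGPUArrayStyle
export AbstractGPUArray, AbstractGPUArrayStyle

end
```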

Alternatively, we could use strict version requirements in GPUArrays (i.e. GPUArraysCore = "=0.2"). Maybe that's a simpler solution.
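In GPUArrays' Project.toml, such a strict requirement would be a compat entry like this (a sketch, with the version number under discussion):

```toml
[compat]
# Pin GPUArraysCore to exactly one version, so an older GPUArrays can
# never be loaded alongside a newer Core that defines the same names.
GPUArraysCore = "=0.2"
```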

EDIT: you could try this on CI by splitting the move into two commits (keeping duplicate definitions in the first commit, and removing the old one only in the second). We can then squash-merge, but CI would test the scenario where an upgraded Core package and an old GPUArrays both provide the definition.

@mcabbott
Contributor Author

mcabbott commented Jul 15, 2022

That's a good idea. 98530a1 has the definition in both packages, and we can see what CI thinks.

I see no complaints. But, with this change, the following seems pretty odd to me:

julia> using Metal, GPUArraysCore

julia> @which Metal.AbstractGPUArrayStyle
GPUArrays

julia> @which GPUArraysCore.AbstractGPUArrayStyle
GPUArraysCore

julia> Metal.AbstractGPUArrayStyle === GPUArraysCore.AbstractGPUArrayStyle
false

It means that a package which uses GPUArraysCore.AbstractGPUArrayStyle won't work correctly if CUDA loads the old GPUArrays. I guess you can argue that ]up will never leave you in this state. But it seems better if the state is forbidden by Pkg.

@maleadt
Member

maleadt commented Jul 18, 2022

Yeah, that's not great. I think the easiest solution is to lock GPUArrays to GPUArraysCore (i.e. edit the registry to have =0.1 and specify =0.2 now). What do you think?

@mcabbott
Contributor Author

mcabbott commented Jul 18, 2022

Yes, I guess that every GPUArrays version demanding exactly one GPUArraysCore version sounds safest.

The only downside I can see is that, if you want to make some small change to GPUArraysCore, you will need to tag a new version of the main package before anyone can use it. But presumably there will be few such changes. And most PRs to GPUArrays won't alter GPUArraysCore at all.

This change can be GPUArraysCore 0.2 or 0.1.1. The argument for 0.2 is that it's not in fact a bugfix. The argument for 0.1.1 is that nobody who already depends on GPUArraysCore needs to update their package for the new version. There seem to be 7 direct dependents now: https://juliahub.com/ui/Packages/GPUArraysCore/qiYUe/0.1.0?page=2

The tests of GPUArrays here already load the in-repo Core version, not the registered one, which is good:

[0c68f7d7] GPUArrays v8.4.2 `~/work/GPUArrays.jl/GPUArrays.jl`
[46192b85] GPUArraysCore v0.1.1 `~/work/GPUArrays.jl/GPUArrays.jl/lib/GPUArraysCore`

@maleadt
Member

maleadt commented Jul 18, 2022

> There seem to be 7 direct dependents now: https://juliahub.com/ui/Packages/GPUArraysCore/qiYUe/0.1.0?page=2

That happened quickly! We better stick to 0.1.1 then.

@mcabbott
Contributor Author

Ok, 0.1.1 it is.

I made JuliaRegistries/General#64447 to add a version bound. I guess that should be merged first, then GPUArraysCore tagged, then GPUArrays tagged.

