KernelAbstractions 0.10 compat #592

Closed · wants to merge 4 commits
5 changes: 3 additions & 2 deletions .github/workflows/Test.yml

@@ -26,10 +26,11 @@ jobs:
       - uses: julia-actions/cache@v2
       - name: Develop subpackages
         run: |
-          julia --project -e "
+          julia --project -e '
             using Pkg
+            Pkg.develop("KernelAbstractions")
             Pkg.develop([PackageSpec(; name=basename(path), path) for path in ARGS])
-            " lib/GPUArraysCore lib/JLArrays
+            ' lib/GPUArraysCore
       - uses: julia-actions/julia-runtest@v1
         continue-on-error: ${{ matrix.version == 'nightly' }}
       - uses: julia-actions/julia-processcoverage@v1
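For context, the quoted script in that step expands to roughly the following (a sketch; the literal vector of paths stands in for the ARGS that the workflow passes on the command line):

```julia
using Pkg

# Develop KernelAbstractions itself so the tests run against its current
# default branch rather than the latest registered release.
Pkg.develop("KernelAbstractions")

# Develop each local subpackage passed as an argument
# (the workflow now passes only lib/GPUArraysCore).
for path in ["lib/GPUArraysCore"]
    Pkg.develop(PackageSpec(; name = basename(path), path))
end
```

Note the outer quote change from `"` to `'`: the inline script now contains a double-quoted string (`"KernelAbstractions"`), so the shell quoting had to switch to single quotes.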
2 changes: 1 addition & 1 deletion Project.toml

@@ -18,7 +18,7 @@ Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
 [compat]
 Adapt = "4.0"
 GPUArraysCore = "= 0.2.0"
-KernelAbstractions = "0.9.28"
+KernelAbstractions = "0.10"
 LLVM = "3.9, 4, 5, 6, 7, 8, 9"
 LinearAlgebra = "1"
 Printf = "1"

Member (inline comment on the KernelAbstractions compat entry):
You should be able to say "0.9.28, 0.10".

Member (PR author):
I made it "0.10" only for testing: it takes less attention to notice that the resolver errored than to verify that KA is actually resolving to 0.10.

I'll re-add the 0.9.28 entry once this is closer to being ready to merge.
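For reference, the compat entry the reviewer suggests is just a comma-separated list of version specifiers, which is standard Julia Pkg compat syntax:

```toml
[compat]
# Accepts 0.9.28 and any later 0.9.x release, as well as any 0.10.x release.
KernelAbstractions = "0.9.28, 0.10"
```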
4 changes: 2 additions & 2 deletions docs/src/interface.md

@@ -20,7 +20,7 @@ end
 ```
 
-This will allow your defined type (in this case `JLArray`) to use the GPUArrays interface where available.
+This will allow your defined type (in this case `CustomArray`) to use the GPUArrays interface where available.
 To be able to actually use the functionality that is defined for `AbstractGPUArray`s, you need to define the backend, like so:
 
 ```julia
@@ -29,7 +29,7 @@ struct CustomBackend <: KernelAbstractions.GPU
 KernelAbstractions.get_backend(a::CA) where CA <: CustomArray = CustomBackend()
 ```
 
-There are numerous examples of potential interfaces for GPUArrays, such as with [JLArrays](https://github.com/JuliaGPU/GPUArrays.jl/blob/master/lib/JLArrays/src/JLArrays.jl), [CuArrays](https://github.com/JuliaGPU/CUDA.jl/blob/master/src/gpuarrays.jl), and [ROCArrays](https://github.com/JuliaGPU/AMDGPU.jl/blob/master/src/gpuarrays.jl).
+There are numerous examples of potential interfaces for GPUArrays, such as with [CuArrays](https://github.com/JuliaGPU/CUDA.jl/blob/master/src/CUDAKernels.jl), [ROCArrays](https://github.com/JuliaGPU/AMDGPU.jl/blob/master/src/ROCKernels.jl), and [MtlArrays](https://github.com/JuliaGPU/Metal.jl/blob/master/src/MetalKernels.jl).
 
 ## Caching Allocator
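Putting the two documentation snippets together, a minimal end-to-end sketch of the interface looks like this (the `data` field and CPU-backed storage are illustrative assumptions, not part of the documented API, and a real implementation would also define `size`, `getindex`, and friends):

```julia
using KernelAbstractions
using GPUArrays: AbstractGPUArray

# Hypothetical array type; a real backend would wrap device memory.
struct CustomArray{T, N} <: AbstractGPUArray{T, N}
    data::Array{T, N}
end

# The backend type, as in the documentation above.
struct CustomBackend <: KernelAbstractions.GPU end

# Map the array type to its backend so KA knows where to launch kernels.
KernelAbstractions.get_backend(a::CA) where CA <: CustomArray = CustomBackend()
```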
2 changes: 1 addition & 1 deletion docs/src/testsuite.md

@@ -21,7 +21,7 @@ If you don't want to run the whole suite, you can also run parts of it:
 
 
 ```julia
-T = JLArray
+T = Array # As of KernelAbstractions v0.10, Array uses POCLBackend to run KA kernels
 GPUArrays.allowscalar(false) # fail tests when slow indexing path into Array type is used.
 
 TestSuite.test_gpuinterface(T) # interface functions like gpu_call, threadidx, etc
1 change: 0 additions & 1 deletion test/Project.toml

@@ -2,7 +2,6 @@
 Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
 Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
 Distributed = "8ba89e20-285c-5b6f-9357-94700520ee1b"
-JLArrays = "27aeb0d3-9eb9-45fb-866b-73c2ecf80fcb"
 KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
5 changes: 4 additions & 1 deletion test/runtests.jl

@@ -1,3 +1,6 @@
+using Pkg
+Pkg.develop("KernelAbstractions")
+
 using Distributed
 using Dates
 import REPL

@@ -47,7 +50,7 @@ include("setup.jl") # make sure everything is precompiled
 # choose tests
 const tests = []
 const test_runners = Dict()
-for AT in (JLArray, Array), name in keys(TestSuite.tests)
+for AT in (Array,), name in keys(TestSuite.tests)
     push!(tests, "$(AT)/$name")
     test_runners["$(AT)/$name"] = ()->TestSuite.tests[name](AT)
 end
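Each entry in `test_runners` is a zero-argument closure keyed by "ArrayType/testname", so an individual test can be run by key. A usage sketch (the "broadcasting" key is illustrative, assuming such an entry exists in `TestSuite.tests`):

```julia
# Look up and invoke a single runner:
runner = test_runners["Array/broadcasting"]
runner()  # equivalent to TestSuite.tests["broadcasting"](Array)
```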
5 changes: 2 additions & 3 deletions test/setup.jl

@@ -1,4 +1,4 @@
-using Distributed, Test, JLArrays
+using Distributed, Test
 
 include("testsuite.jl")

@@ -15,7 +15,7 @@ function runtests(f, name)
     # generate a temporary module to execute the tests in
     mod_name = Symbol("Test", rand(1:100), "Main_", replace(name, '/' => '_'))
     mod = @eval(Main, module $mod_name end)
-    @eval(mod, using Test, Random, JLArrays)
+    @eval(mod, using Test, Random)
 
     let id = myid()
         wait(@spawnat 1 print_testworker_started(name, id))

@@ -24,7 +24,6 @@ function runtests(f, name)
     ex = quote
         GC.gc(true)
         Random.seed!(1)
-        JLArrays.allowscalar(false)
 
         @timed @testset $"$name" begin
             $f()