GPU Minimal Working Example #197
Conversation
This honestly isn't as bad as I was expecting -- very nice work. I'll try to take a look at this in more depth at some point; quite busy at the minute, so unless it's blocking other work, I'll hold off for now.
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #197      +/-   ##
==========================================
- Coverage   97.98%   93.71%   -4.28%
==========================================
  Files          10       11       +1
  Lines         348      366      +18
==========================================
+ Hits          341      343       +2
- Misses          7       23      +16
```
Continue to review full report at Codecov.
* return lazy Fill arrays for mean functions from #197
* replace Diagonal(Fill(...)) with PDMats.ScalMat
* add tr_Cf_invΣy method for ScalMat & tests
* _cholesky for ScalMat
* update Manifests
* _symmetric(X::AbstractPDMat) = X
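As a quick illustration of the `Diagonal(Fill(...))` → `PDMats.ScalMat` replacement listed above: `ScalMat` represents a scaled identity without storing its entries. This is a minimal sketch, not the code from the linked change.

```julia
using PDMats, LinearAlgebra

# ScalMat(n, σ²) behaves like σ² * I on an n-dimensional space,
# without materialising a dense matrix or a Fill-backed Diagonal.
n, σ² = 4, 0.1
Σy = ScalMat(n, σ²)
Matrix(Σy) ≈ σ² * Matrix(I, n, n)   # true
```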
NOT INTENDED FOR MERGE
This PR is meant to give an idea of what work is needed to allow AbstractGPs to run on the GPU (although with no regard for performance).
The main changes required are:
In AbstractGPs
* `mean` functions need to return lazy Fill arrays for `ZeroMean` and `ConstMean` (should probably do this anyway?); a sketch of the idea is given after this list.
* Adding a `CuArray` and a `Diagonal{<:Real,<:FillArrays.Fill}` will produce an `Array`, not a `CuArray` (i.e. when computing `cov(fx) = cov(fx.f, fx.x) + fx.Σy`). I don't think there's an easy way to fix this in general, so either we don't use a lazy Fill array here - which is what I've implemented - or we define a custom `_add_broadcasted` function which overrides the default broadcasting logic.
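To make the first point concrete, here is a minimal sketch of what returning lazy Fill arrays could look like. The type and function names below are stand-ins, not AbstractGPs' actual internals.

```julia
using FillArrays

# Stand-in mean-function types playing the role of AbstractGPs' ZeroMean/ConstMean.
struct ZeroMean end
struct ConstMean{T<:Real}
    c::T
end

# Returning lazy Zeros/Fill vectors avoids materialising a CPU Array, so the mean
# composes with whatever array type (e.g. CuArray) holds the rest of the computation.
mean_vector(::ZeroMean, x::AbstractVector) = Zeros(length(x))
mean_vector(m::ConstMean, x::AbstractVector) = Fill(m.c, length(x))
```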
In KernelFunctions
* `Distances.jl` is not at all GPU compatible, so custom implementations of `pairwise`/`colwise` are needed (obviously, this should be done in `KernelFunctions`, but I've included it here to keep it in one place); a GPU-friendly sketch is given at the end of this description.
* Related PR: KernelFunctions.jl#299 (`Vector` fields and add vectors constructors).

See also JuliaGaussianProcesses/ApproximateGPs.jl#15
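For reference, a minimal sketch of GPU-friendly `pairwise`/`colwise` for the squared Euclidean distance, built only from reductions, broadcasting, and matrix multiplication (no `Distances.jl`). The function names and the columns-as-points layout are illustrative assumptions, not this PR's implementation.

```julia
# ‖x - y‖² = ‖x‖² + ‖y‖² - 2⟨x, y⟩, expressed with array-level operations only.
# Each column of X (d×n) and Y (d×m) is one input point.
function pairwise_sqeuclidean(X::AbstractMatrix, Y::AbstractMatrix)
    sqnorms_x = sum(abs2, X; dims=1)                           # 1×n
    sqnorms_y = sum(abs2, Y; dims=1)                           # 1×m
    return reshape(sqnorms_x, :, 1) .+ sqnorms_y .- 2 .* (X' * Y)  # n×m
end

# colwise: distances between matching columns of X and Y (both d×n).
colwise_sqeuclidean(X::AbstractMatrix, Y::AbstractMatrix) = vec(sum(abs2, X .- Y; dims=1))
```

With `CuArray` inputs the same code should stay on the device, since it only uses reductions, broadcasting, `reshape`, and matrix multiplication.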