Update to PartitionedArrays 0.4.0 #17
Hi, I finally got around to updating this. There is PartitionedArrays 0.5 now too, but I guess it is easier to upgrade to 0.4 first, so I started here: master...fe/parrays-0.4. I have a question about the new SplitMatrix type. To convert to the HYPRE data format I only need the own_own and own_ghost parts (i.e. all owned rows). Is there any functionality to loop over these two blocks, or do I need to handle them separately as done here: Lines 470 to 494 in d69e624
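For reference, this is roughly the data I need on each rank (a sketch; `A` stands for the local split matrix, and the block field names follow the reply below):

```julia
# All owned rows of the local matrix live in these two blocks:
Aoo = A.blocks.own_own    # owned rows x owned columns
Aoh = A.blocks.own_ghost  # owned rows x ghost columns
# HYPRE needs these rows with global column ids, so both blocks have to be
# visited and their column indices mapped back to the global numbering.
```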
Hi @fredrikekre, nice to see this happening! I would strongly suggest going directly to v0.5. The difference is not that big, and several bugs have been fixed in the latest version.

You will probably need to go block by block explicitly:

```julia
A::SplitMatrix
Aoo = A.blocks.own_own   # This is a standard sequential sparse matrix
Aoh = A.blocks.own_ghost # the same
```

The blocks are indexed with own/ghost ids depending on the case. You need to map them using the results of functions like these:

```julia
using PartitionedArrays
with_debug() do dist
    ranks = LinearIndices((4,)) |> dist
    rows = uniform_partition(ranks, (2,2), (8,8), (true,true))
    foreach(rows) do myrows
        @show own_to_global(myrows)
        @show ghost_to_global(myrows)
    end
end
```
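Putting the two pieces together, the column indices of the blocks can be mapped back to global ids with these functions. A minimal sketch (assuming `myrows`/`mycols` are the row/column partition entries on the current rank, and that the blocks are plain sequential sparse matrices as noted above):

```julia
using SparseArrays

# Sketch: translate block-local indices of the two owned-row blocks into global ids.
row_gid       = own_to_global(myrows)   # rows of both blocks are own rows
own_col_gid   = own_to_global(mycols)   # columns of the own_own block
ghost_col_gid = ghost_to_global(mycols) # columns of the own_ghost block

for (i, j, v) in zip(findnz(Aoo)...)
    gi, gj = row_gid[i], own_col_gid[j]   # global (row, col) of this nonzero
    # pass (gi, gj, v) to the external solver, e.g. HYPRE
end
for (i, j, v) in zip(findnz(Aoh)...)
    gi, gj = row_gid[i], ghost_col_gid[j] # global (row, col) of this nonzero
    # pass (gi, gj, v) to the external solver, e.g. HYPRE
end
```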
It can also be useful to check how I wrapped the PETSc KSP solvers here: https://github.com/fverdugo/PetscCall.jl/blob/main/src/ksp.jl
Great, thanks for the quick reply, see #18.
This patch upgrades the PartitionedArrays dependency from the 0.3.x release series to 0.5.x (closes #17). Also updates documentation build dependencies.
Hi there!
We released PartitionedArrays 0.4.x recently. I see that you are depending on the old version 0.3.x.
If you need some help with the update to 0.4.x, let me know. Unfortunately I don't have time to do the work myself, but I will be happy to assist if needed.
Now, the interaction with external distributed sparse solvers should be easier. For reference, this is how I am wrapping a KSP solver from PETSc (it works with any numbering of global/local ids):
https://github.com/fverdugo/PetscCall.jl/blob/a74322dedbdf12ea7fe736612b6adda963cd57fd/src/ksp.jl#L244
Cheers,
Francesc