Differences to GeometricFlux.jl? #2
Indeed, I wasn't expecting it already on day 1 though :)
The scope is the same, I just needed another name :) While geometric deep learning is broader than GNNs (you also have CNNs, deep learning on manifolds, and possibly other stuff I don't know about), in practice GeometricFlux.jl, and also Python's PyTorch Geometric and DGL, are only about GNNs. Here I'm trying to address some (in my opinion) major design issues in GeometricFlux that I couldn't fix there, since I couldn't find common ground with the author. For background, see … The main differences at the moment are the following (this list may evolve in the near future):
Having two GNN libraries in Julia instead of joining efforts is probably not ideal, but I really think these changes will benefit the ecosystem, so here we are. I hope there will be room for collaboration down the road.
Ok, that makes sense. Thanks for explaining. FYI, @Wimmerer is doing a lot of interesting work on sparse matrices, graphs, GraphBLAS, and eventually custom sparse codegen: mcabbott/Tullio.jl#114
How decoupled can this be? Can layers support an arbitrary AbstractSparse in the future? How are you implementing the core message passing operations? How are you storing features? Node features in a dense matrix, I imagine; what about edge features?
@Wimmerer Yes, they already can in principle, provided linear algebra, map, and broadcasting operations are supported by the sparse matrix type. I opened #19 to experiment with GBMatrix. Many tests are failing at the moment; GBMatrix doesn't seem ready as a drop-in replacement for SparseMatrixCSC, but I haven't looked into the details yet. Feel free to experiment on that branch!
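To sketch why the matrix type can stay abstract: a layer whose forward pass is written only in terms of generic matrix products and broadcasting never touches the concrete storage format. The function and variable names below (`toy_gcn_layer`, `Â`) are illustrative only, not part of either library's API.

```julia
using SparseArrays, LinearAlgebra

# A toy graph-conv forward pass using only generic linear algebra
# and broadcasting, so A can be dense, SparseMatrixCSC, or in
# principle any AbstractMatrix-like type such as a GBMatrix.
function toy_gcn_layer(W, A, X, σ = identity)
    d = vec(sum(A; dims = 2))        # node degrees (row sums)
    Â = Diagonal(inv.(d)) * A        # row-normalized adjacency
    σ.(W * X * Â')                   # aggregate neighbors, then transform
end

A = sparse([0 1 1; 1 0 0; 1 0 0])    # 3-node toy graph
X = rand(2, 3)                       # 2 features per node, one column each
W = rand(4, 2)
size(toy_gcn_layer(W, A, X))         # (4, 3)
```

Swapping `A` for another matrix type requires nothing from the layer, only that `sum`, `*`, and broadcasting work on that type, which is exactly the precondition stated above.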
The most generic message passing scheme is based on a … Many commonly used specific schemes can leverage algebraic operations on the adjacency matrix, though. An example … An equivalent forward pass for the …
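The code examples that originally accompanied this comment are lost in the excerpt above, so here is a generic illustration of the claimed equivalence (not either library's implementation): summing messages edge by edge gives the same aggregation as a single product with the adjacency matrix when the message is simply the source node's feature vector.

```julia
using SparseArrays

# A small directed graph as (source, target) edge lists.
s = [1, 1, 2, 3]
t = [2, 3, 1, 1]
n = 3
X = rand(2, n)                 # node features, one column per node

# Gather/scatter form: for each edge, gather the source feature
# and scatter-add it onto the target node.
out = zeros(2, n)
for (i, j) in zip(s, t)
    out[:, j] .+= X[:, i]      # message = source feature
end

# Algebraic form: the same aggregation as one sparse product.
# A[j, i] == 1 iff there is an edge i -> j.
A = sparse(t, s, 1.0, n, n)
out ≈ X * A'                   # true
```

The gather/scatter form handles arbitrary message functions; the algebraic form is available only for the special cases it covers, but then delegates all the work to sparse linear algebra.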
Node features are stored as …
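The concrete storage description is cut off above, so purely as an illustrative convention (common in Julia GNN code, not necessarily this library's exact layout): node features in a `num_features × num_nodes` matrix, and edge features in a `num_features × num_edges` matrix aligned with the edge list.

```julia
n, e = 4, 5                      # number of nodes and edges (toy sizes)
ndata = rand(Float32, 8, n)      # one column of node features per node
edata = rand(Float32, 3, e)      # one column of edge features per edge

ndata[:, 2]                      # features of node 2
edata[:, 5]                      # features of the 5th edge in the edge list
```

Column-per-node layout keeps each node's features contiguous in memory, which is what Julia's column-major arrays and Flux-style batched layers expect.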
Here's the inevitable question ;)
What are the differences (philosophical, implementation, etc.) between this and GeometricFlux?
Are you covering a smaller scope? I think graphs are a subset of geometric deep learning.