Layers support for HeteroGraphConv #311

Open
10 of 18 tasks
CarloLucibello opened this issue Jun 20, 2023 · 6 comments
@CarloLucibello
Member

CarloLucibello commented Jun 20, 2023

HeteroGraphConv layers are built out of standard graph conv layers, which are individually applied to the different relations.
The list of layers supporting integration with HeteroGraphConv should be extended:

  • AGNNConv
  • CGConv
  • ChebConv
  • EGNNConv
  • EdgeConv
  • GATConv
  • GATv2Conv
  • GatedGraphConv
  • GCNConv
  • GINConv
  • GMMConv
  • GraphConv
  • MEGNetConv
  • NNConv
  • ResGatedGraphConv
  • SAGEConv
  • SGConv
  • TransformerConv
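For context, a HeteroGraphConv wraps one inner conv layer per relation. A minimal, hypothetical usage sketch (node/edge-type names and feature sizes are made up for illustration, not taken from the issue):

```julia
using GraphNeuralNetworks

# Hypothetical bipartite heterograph: 10 nodes of type :A, 15 of type :B,
# 20 edges in each direction between them.
g = rand_bipartite_heterograph((10, 15), 20)

# One inner layer per relation; GraphConv already supports heterographs.
hconv = HeteroGraphConv(
    (:A, :to, :B) => GraphConv(64 => 32),
    (:B, :to, :A) => GraphConv(64 => 32))

x = (A = rand(Float32, 64, 10), B = rand(Float32, 64, 15))
y = hconv(g, x)   # NamedTuple with output features per node type
```

Extending a layer from the checklist above means making it usable as one of these inner per-relation layers.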
@codetalker7

Hi @CarloLucibello. Is this a documentation issue? I looked at the implementation in src/layers/heteroconv.jl, and it seems like the implementation takes care of all standard graph conv layers (in the sense that the layers field of HeteroGraphConv can be any GNNLayer, and the code should work for any layer type). Am I missing something?

@CarloLucibello
Member Author

No, it is actually an implementation issue that involves a tiny change to the forward pass of each layer. In the original PR (#300) it was done only for GraphConv.
One needs to relax the forward signature to take an AbstractGNNGraph instead of a GNNGraph, and then insert at the beginning the line

xj, xi = expand_srcdst(g, x)

since during the hetero message passing x is a tuple containing the src and dst features (relative to two different node types).
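Concretely, the per-layer change might look roughly like this (a sketch following the GraphConv change in #300; the rest of the forward pass is elided):

```julia
# Sketch of the signature relaxation described above, not a full implementation.
# Was: function (l::GraphConv)(g::GNNGraph, x)
function (l::GraphConv)(g::AbstractGNNGraph, x)
    check_num_nodes(g, x)
    # On a homogeneous GNNGraph this returns (x, x); during hetero message
    # passing, x is a tuple of (source, destination) features for the two
    # node types of the relation.
    xj, xi = expand_srcdst(g, x)
    # ... remainder of the forward pass uses xj (src) and xi (dst) ...
end
```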

@codetalker7

Hi @CarloLucibello, another small question: for layers like AGNNConv which involve self-loops, such layers will then have to operate on two types of edges, right? Because we'll have to add self-loops for nodes of the target type (this is currently done in the AGNNConv implementation: line 1018 of https://github.com/CarloLucibello/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl).

Also, motivated by this: it seems that adding self-loops to GNNHeteroGraphs is not supported. Would it be a good idea to add this functionality, so that we can add self-loops for a particular node type?

@CarloLucibello
Member Author

CarloLucibello commented Aug 27, 2023

When using something like AGNNConv inside a heterograph, the add_self_loops option should be set to false, since
it doesn't make sense to add self-loops in a relation (node1_t, edge_t, node2_t) unless the two node types are the same.
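In practice that might look like the following (a hedged sketch: the relation name is hypothetical, and the keyword is assumed to be spelled `add_self_loops` as in the homogeneous AGNNConv constructor):

```julia
# Disable self-loop insertion when the inner layer is applied to a relation
# between two different node types (hypothetical relation name).
hconv = HeteroGraphConv(
    (:user, :rates, :movie) => AGNNConv(add_self_loops = false))
```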

@AarSeBail
Contributor

Perhaps an additional HANConv layer would be helpful to add to this list. https://arxiv.org/abs/1903.07293

@AarSeBail
Contributor

AarSeBail commented Jan 7, 2024

Am I correct in thinking that these implementations for heterographs should be "type blind"?
