
Adds GATv2 layer #259

Merged
merged 4 commits into from
Feb 20, 2022

Conversation

@abieler (Contributor) commented Jan 2, 2022

IDK if it's worth introducing a new abstract type for the two GAT layers, as some of the functions are identical.

I could replicate the results from Figure 1 in the paper regarding the attention scores:

GATConv layer:
[figure: gat-attention-scores]

GATv2 layer:
[figure: gatv2-attention-scores]

Note that the difference here is that self-loops are required, which results in the first alpha score not being 0 for all queries.
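For reference, the difference between the two layers is only in where the nonlinearity sits in the attention score. The sketch below (a minimal NumPy illustration of the formulas from the GAT and GATv2 papers, not the Julia implementation in this PR; all weights are random placeholders) contrasts the two scoring functions and the softmax that produces the alpha coefficients plotted above:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # feature dimension, chosen arbitrarily for the sketch

# Placeholder learnable parameters.
W = rng.normal(size=(d, d))        # GAT: shared linear transform per node
a = rng.normal(size=2 * d)         # attention vector for both variants
W2 = rng.normal(size=(2 * d, 2 * d))  # GATv2: transform of the concatenation

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_score(h_i, h_j):
    # GAT: e(h_i, h_j) = LeakyReLU(a^T [W h_i || W h_j])
    # The nonlinearity is applied *after* the dot product with a,
    # which is what makes the attention "static".
    z = np.concatenate([W @ h_i, W @ h_j])
    return leaky_relu(a @ z)

def gatv2_score(h_i, h_j):
    # GATv2: e(h_i, h_j) = a^T LeakyReLU(W' [h_i || h_j])
    # Applying a *after* the nonlinearity yields "dynamic" attention.
    z = W2 @ np.concatenate([h_i, h_j])
    return a @ leaky_relu(z)

def alphas(score_fn, h_query, neighbors):
    # Softmax of the scores over a query's neighborhood (which,
    # as noted above, includes the self-loop) gives the alpha coefficients.
    e = np.array([score_fn(h_query, h_j) for h_j in neighbors])
    e = np.exp(e - e.max())
    return e / e.sum()
```

With random features, both score functions produce valid attention distributions; the qualitative difference shows up in which neighbor each query attends to, as in the plots above.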

@yuehhua (Member) commented Jan 4, 2022

Would you mind adding some tests?

@yuehhua (Member) commented Feb 20, 2022

@abieler Thank you for your contribution. I will merge this first and then fix the tests.

@yuehhua yuehhua merged commit abcb235 into FluxML:master Feb 20, 2022