
Inconsistencies with the paper #13

Open
razvanc92 opened this issue Jul 7, 2020 · 1 comment
Comments


razvanc92 commented Jul 7, 2020

Hello, firstly I would like to thank you for sharing the code. I was looking at the Spatial Attention component (line 56 in model.py) and I've noticed some differences from what is presented in the paper:

  1. I cannot find where you split the vertices into G partitions (and perform the intra/inter-group attention). As far as I can tell, the spatialAttention function performs only intra-group spatial attention, without any restrictions.
  2. After computing eq 7 (line 86 in model.py), the output is projected again using 2 FC layers, which are not described in the paper. What is the reason for this?
  3. Looking at eq 7, the input of function f3 is the previous hidden representation, whereas in your code you also use the static graph embeddings (e_{v,tj}).

Looking forward to your reply.
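To make point 1 concrete, here is a minimal NumPy sketch of what an intra-group restriction could look like. This is a hypothetical illustration, not code from model.py: the function name grouped_spatial_attention and the contiguous partitioning of vertices are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def grouped_spatial_attention(X, G):
    """Restrict attention to G contiguous vertex partitions.

    X: (N, D) hidden states for N vertices; assumes N % G == 0.
    Each vertex attends only to vertices in its own group, which is
    the intra-group restriction the issue asks about.
    """
    N, D = X.shape
    size = N // G
    out = np.empty_like(X)
    for g in range(G):
        Xg = X[g * size:(g + 1) * size]      # (size, D): one group
        scores = Xg @ Xg.T / np.sqrt(D)      # scaled dot-product, group-local
        out[g * size:(g + 1) * size] = softmax(scores) @ Xg
    return out
```

With G == N every group holds a single vertex, so each vertex attends only to itself and the output equals the input; with G == 1 this degenerates to the unrestricted attention the issue says the code actually implements.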


wdzhong commented Aug 4, 2020

  1. I have the same doubt. I couldn't fully understand the inter-group spatial attention described in the paper, so I tried to see how the code works, but I cannot find anything related. There is not even a G in the code.
  2. I think the FC layers are used to change the dimension of the attention result so that the new hidden states have the same dimension as the previous hidden states.
  3. From the code:

         def spatialAttention(X, STE, K, d, bn, bn_decay, is_training):

     here X is the hidden state and STE is the spatio-temporal embedding.
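Points 2 and 3 can be sketched together in plain NumPy. This is an illustrative sketch, not the repository's TensorFlow code: the function name, the concatenation of X with STE, and the randomly sampled weight matrices are all assumptions (in the real model the weights would be learned).

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_attention_sketch(X, STE, D):
    """Attention over vertices using hidden states plus embeddings.

    X, STE: (N, D_in) hidden states and spatio-temporal embeddings.
    Queries/keys/values are computed from concat(X, STE) -- point 3 --
    and the attended output passes through two FC layers so the new
    hidden states come out with dimension D -- point 2.
    """
    H = np.concatenate([X, STE], axis=-1)    # (N, 2 * D_in)
    Wq, Wk, Wv = (rng.standard_normal((H.shape[1], D)) for _ in range(3))
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    A = softmax(Q @ K.T / np.sqrt(D))        # (N, N) attention weights
    att = A @ V                              # (N, D) attended values
    W1, W2 = (rng.standard_normal((D, D)) for _ in range(2))
    return np.maximum(att @ W1, 0.0) @ W2    # two FC layers, ReLU between
```

The final two matrix multiplies play the role wdzhong describes: they project the attention result so the new hidden states match the dimension expected by the next layer.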
