Questions about the data #3
The drug indication vectors are just multi-hot vectors. Each dimension is an indication.
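As a hedged illustration of that answer (the helper name and the example indication IDs below are made up; the 1702-dimension vocabulary size is taken from the question later in this thread), a multi-hot indication vector can be built like this:

```python
import numpy as np

def multi_hot(indication_ids, num_indications):
    """Build a multi-hot vector: 1 in every dimension (indication)
    the drug is annotated with, 0 everywhere else."""
    vec = np.zeros(num_indications, dtype=np.float32)
    vec[list(indication_ids)] = 1.0
    return vec

# Hypothetical example: a drug annotated with indications 3, 7, and 42
# out of an assumed vocabulary of 1702 indications.
v = multi_hot([3, 7, 42], 1702)
print(v.shape)       # (1702,)
print(int(v.sum()))  # 3
```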
Thank you for your reply. I will try to calculate the feature vectors as you said. @matenure
Hi @bbjy, I am also facing the same dataset problem. Have you found any way to run Cora or any example dataset with this network? Looking forward to your reply. Thank you :)
@Abhinav43 Sorry, I failed to run this network.
@bbjy what was the error you were getting? I am facing an attention error:
I think there is an error here, or is it actually possible to concat with 0?
@Abhinav43 attentions = tf.concat(0, attention) is not concatenating with 0; it concatenates attention along the first dimension :)
Hi Ma, thanks for your reply. I went through your code and I am facing some issues: i) If I do it this way, it works:
but if I do it the way shown in the code, then I get an error:
ii) I got your point that tf.concat(0, attention) concatenates attention along the first dimension, but the tf.concat documentation says it should take the matrices to concatenate first, with the axis parameter given after them, so the syntax is:
So in tf.concat(0, attention), attention is a list, so where is the second matrix we are concatenating with? And why have you passed axis = 0 as the values argument of tf.concat? iii) Suppose hidden_1 = 64, hidden_2 = 16, output_dim = 4, adj shape: (222, 222), and input_dim = features.shape[1], where input_dim is the same as the feature matrix's first dim.
I am confused about how to calculate the shapes in the network flow. Can you show me the flow of the network with the shapes? Thank you, and I am sorry if I am troubling you a lot :) Waiting for your reply.
i) I guess some of the errors are due to a version update of TensorFlow. For example, tf.concat(0, attention) was the correct usage in a previous version, but I think the API has since changed (I have not used TensorFlow for a while; now I am mainly using PyTorch). ii) output_dim = input_dim; here input_dim is 222 instead of 582. The other shapes seem correct.
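For reference on the API change mentioned above: in TensorFlow before 1.0 the signature was tf.concat(concat_dim, values), while from 1.0 on it is tf.concat(values, axis). Either way, a list of matrices is joined along one axis, the same semantics as NumPy's concatenate (the attention shapes below are made up for illustration):

```python
import numpy as np

# A list of per-view attention score matrices (shapes are illustrative only).
attention = [np.ones((1, 4)), 2 * np.ones((1, 4)), 3 * np.ones((1, 4))]

# Old TF (< 1.0):   attentions = tf.concat(0, attention)
# New TF (>= 1.0):  attentions = tf.concat(attention, axis=0)
# Both join the list along the first dimension, like:
attentions = np.concatenate(attention, axis=0)
print(attentions.shape)  # (3, 4)
```

So 0 is the concatenation axis, not a value being concatenated; the list itself supplies all the matrices to join.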
Thank you for the reply :)
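The shape flow asked about above can be sketched as follows. This is only a sketch under assumptions: a stack of graph convolutions of the form A @ H @ W, with input_dim = output_dim = 222 as confirmed above; the weight names and the identity adjacency stand-in are hypothetical, not the repository's code:

```python
import numpy as np

num_nodes = 222
input_dim, hidden_1, hidden_2, output_dim = 222, 64, 16, 222

A = np.eye(num_nodes)                     # (222, 222) adjacency (identity as a stand-in)
X = np.random.rand(num_nodes, input_dim)  # (222, 222) node feature matrix

W1 = np.random.rand(input_dim, hidden_1)  # (222, 64)
W2 = np.random.rand(hidden_1, hidden_2)   # (64, 16)
W3 = np.random.rand(hidden_2, output_dim) # (16, 222)

H1 = A @ X @ W1    # (222, 64)
H2 = A @ H1 @ W2   # (222, 16)
out = A @ H2 @ W3  # (222, 222)
print(H1.shape, H2.shape, out.shape)
```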
Hello @matenure, I ran into an issue related to the dimensions too, and hence chose not to create a new issue; hope that is fine with you. The issue is this: the shape or dimensions used for the attention weights (and hence the size of the attention weights) is equal to the dimension of the labels, as seen in the AttSemiGraphEncoder class. But while computing the attention scores, matrix multiplication is performed with the individual adjacency matrices, which are square matrices where num_of_rows = num_of_cols = num_of_drugs. Just to illustrate: if the drug labels are dimension-6 vectors and the number of drugs is 1000, the attention weights are of dimension 6 right now, while they should be of dimension 1000. P.S. great paper by the way 👍
@SudhirGhandikota You are right, the dimensions of the attention weights are equal to the number of nodes/drugs. And in fact the code "self.output_dim = placeholders['labels'].get_shape().as_list()[1]" does this :) Because in our code we did not discriminate between the different types of DDIs, our label is a vector whose length is the number of nodes (each dimension indicates whether this drug connects to a target drug). I know it is not a very elegant way...
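A hedged sketch of the labeling scheme described above (the variable names here are assumptions): with N drugs, the label of drug i is the length-N multi-hot row of the DDI adjacency matrix, which is why output_dim equals the number of nodes:

```python
import numpy as np

num_drugs = 6
# Hypothetical symmetric DDI adjacency: 1 means the two drugs interact.
adj = np.zeros((num_drugs, num_drugs), dtype=np.float32)
adj[0, 2] = adj[2, 0] = 1.0
adj[0, 5] = adj[5, 0] = 1.0

labels = adj                  # label of drug i = adjacency row i
output_dim = labels.shape[1]  # equals num_drugs, i.e. the number of nodes
print(output_dim)  # 6
print(labels[0])   # [0. 0. 1. 0. 0. 1.]
```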
@matenure Thanks for the confirmation 👍 |
Is there a pytorch version available? |
@rojinsafavi Unfortunately we do not have a PyTorch version. |
Hi, thanks for your code.
For the multiview similarity in the paper, for example "Drug Indication" in "Multilabel Prediction of Specific DDI Types", could you please tell me how you embed a drug into the 1702-dimension embedding vector, as well as the others?
Looking forward to your reply. Thank you!