Hi, I am currently generating embeddings with the `g.embed` method using the following code block:

I then use, for example, `emb[0]` to get the embedding associated with node 1. Is this the correct interpretation of what `g.embed` produces? I am using these embeddings for a downstream classification task, where I join them with other features on an index key. Many thanks!
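For the join step described above, one way to avoid accidental mis-alignment is to key the embedding rows by node ID rather than by position before joining. The sketch below is illustrative only: the node IDs, feature names, and embedding values are made up, and the `emb` matrix stands in for whatever `g.embed` returns.

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the output of g.embed(): an embedding matrix
# whose i-th row belongs to the i-th node ID in the graph's node list.
node_ids = ["n1", "n2", "n3"]
emb = np.array([[0.1, 0.2],
                [0.3, 0.4],
                [0.5, 0.6]])

# Wrap the matrix in a DataFrame indexed by node ID, so the downstream
# join is done on the ID rather than on row position.
emb_df = pd.DataFrame(
    emb,
    index=node_ids,
    columns=[f"emb_{i}" for i in range(emb.shape[1])],
)

# Other per-node features, possibly stored in a different row order.
features = pd.DataFrame({"degree": [5, 2, 7]}, index=["n3", "n1", "n2"])

# Join on the node-ID index; any difference in row order no longer matters.
joined = emb_df.join(features)
```

With this layout, `joined.loc["n1"]` carries both the embedding columns and the `degree` feature for node `n1`, regardless of the order the feature table was stored in.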
Hi @vadhoob, thanks for your interest in using PecanPy! In this case, the index of `emb` will be consistent with the graph `g`, i.e., the first row in the `emb` matrix corresponds to the embedding vector of the first node in `g`.

The relevant code is as follows:

PecanPy/src/pecanpy/pecanpy.py, line 285 in a2b5f7d

where `w2v` is the trained word2vec object, and `w2v.wv` is the corresponding KeyedVectors. The KeyedVectors can be indexed by the corresponding words (i.e., node IDs in the graph). Thus, indexing with the node IDs `self.nodes` of the graph `g` via `w2v.wv[self.nodes]` produces an embedding matrix whose row order is consistent with the node order of the graph.
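The reordering that `w2v.wv[self.nodes]` performs can be demonstrated without training anything. The sketch below uses a plain dict as a stand-in for gensim's KeyedVectors (which similarly maps each word, here a node ID, to its vector); the node IDs and vector values are invented for illustration. Word2vec stores vectors in the order words were encountered during training, which generally differs from the graph's node order, so the explicit gather by node ID is what restores alignment.

```python
import numpy as np

# Stand-in for w2v.wv: a mapping from node ID (the "word") to its trained
# embedding vector, in whatever order training happened to produce.
trained = {
    "nodeC": np.array([0.3, 0.3]),
    "nodeA": np.array([0.1, 0.1]),
    "nodeB": np.array([0.2, 0.2]),
}

# Stand-in for self.nodes: node IDs in the graph's own order.
nodes = ["nodeA", "nodeB", "nodeC"]

# Equivalent of w2v.wv[self.nodes]: gather one row per node ID, so that
# row i of the resulting matrix belongs to the i-th node of the graph.
emb = np.stack([trained[n] for n in nodes])
```

After the gather, `emb[0]` is the vector for `nodes[0]` ("nodeA"), even though "nodeA" was not the first entry in the trained mapping.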