# Graph Collaborative Attention Network (GCAT)
GCAT link-prediction results:

| Dataset | H@1 (%) | H@10 (%) | MR | MRR |
|---|---|---|---|---|
| FB15K | 70.08 | 91.64 | 38 | 0.784 |
| FB15k-237 | 36.06 | 58.32 | 211 | 0.435 |
| WN18RR | 35.12 | 57.01 | 1974 | 0.430 |
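Here H@k is Hits@k in percent, MR the mean rank, and MRR the mean reciprocal rank of the correct entity in link prediction. For reference, a minimal sketch of how these metrics are computed from a list of per-triple ranks (the function and variable names are illustrative, not taken from this repo):

```python
# Minimal sketch: computing Hits@k, MR, and MRR from per-triple ranks.
# `ranks` holds the 1-based rank of the correct entity for each test triple.

def link_prediction_metrics(ranks):
    n = len(ranks)
    hits_at_1 = 100.0 * sum(r <= 1 for r in ranks) / n    # H@1, in percent
    hits_at_10 = 100.0 * sum(r <= 10 for r in ranks) / n  # H@10, in percent
    mr = sum(ranks) / n                                   # mean rank
    mrr = sum(1.0 / r for r in ranks) / n                 # mean reciprocal rank
    return hits_at_1, hits_at_10, mr, mrr

# Example: three test triples ranked 1, 4, and 50.
print(link_prediction_metrics([1, 4, 50]))  # (33.33..., 66.66..., 18.33..., 0.423...)
```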
Project structure:

```
Root
├── data
│   └── {dataset}                     // dataset folder
│       ├── train.txt
│       ├── test.txt
│       └── valid.txt
├── output
│   └── {dataset}                     // training results for each dataset
│       ├── WN18RR_cuda_gat_3599.pt   // "{dataset}_{device}_{model-name}_{last-epoch}"
│       └── WN18RR_cuda_result.txt
├── config.json                       // config for training
├── *.py                              // source code
└── README.md
```
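Each split file holds one triple per line. Below is a minimal sketch of reading such a file, assuming the tab-separated `head relation tail` layout common to FB15k/WN18-style benchmarks (the delimiter in this repo's copies may differ):

```python
# Minimal sketch: reading a split file such as data/WN18RR/train.txt.
# Assumes one tab-separated triple per line: head \t relation \t tail.

def read_triples(path):
    triples = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            head, relation, tail = line.strip().split("\t")
            triples.append((head, relation, tail))
    return triples

train = read_triples("./data/WN18RR/train.txt")
print(len(train), train[0])
```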
Public Colab: https://drive.google.com/file/d/1uVd_w6vE5C70rmgKLI7BvnhCWegXTMhk/view?usp=sharing

Using Google Colab with:
- Python >= 3.6
- PyTorch >= 1.x
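A quick sanity check that your environment meets these requirements (a hedged snippet, not part of the repo):

```python
# Check Python and PyTorch versions against the requirements above.
import sys
import torch

assert sys.version_info >= (3, 6), "Python >= 3.6 is required"
print("Python", sys.version.split()[0])
print("PyTorch", torch.__version__)            # expect 1.x or newer
print("CUDA available:", torch.cuda.is_available())
```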
Clone the repository:

```
git clone https://github.com/hmthanh/GCAT.git
```
All configuration is stored in the `config.json` file:
"dataset": "WN18RR", # Dataset
"data_folder": "./data",
"output_folder": "./output",
"save_gdrive": false, # Use Google Drive to save object
"drive_folder": "/content/drive/My Drive",
"cuda": false, # Use GPU to training
"epochs_gat": 1,
"epochs_conv": 1,
"weight_decay_gat": 5e-06,
"weight_decay_conv": 1e-05,
"pretrained_emb": false,
"embedding_size": 50,
"lr": 0.001,
"get_2hop": true,
"use_2hop": true,
"partial_2hop": false,
"batch_size_gat": 86835,
"valid_invalid_ratio_gat": 2,
"drop_GAT": 0.3,
"alpha": 0.2,
"entity_out_dim": [100, 200],
"nheads_GAT": [2, 2],
"margin": 5,
"batch_size_conv": 128,
"alpha_conv": 0.2,
"valid_invalid_ratio_conv": 40,
"out_channels": 500,
"drop_conv": 0.0
Because we use Google Colab, training is split into four steps. If you are training on a larger device, just run `python main.py`; otherwise run the steps below one by one (a sketch of such a runner follows the list).
- Step 1: Create corpus
  `python 1_create_corpus.py`
- Step 2: Train embeddings
  `python 2_training_encoder.py`
- Step 3: Train prediction
  `python 3_training_decoder.py`
- Step 4: Evaluation
  `python 4_evalution.py`
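For completeness, a minimal sketch of a runner that executes the four steps in order; the repo's actual main.py may wire the steps together differently:

```python
# Minimal sketch: run the four pipeline scripts in sequence.
import subprocess
import sys

STEPS = [
    "1_create_corpus.py",     # Step 1: build the corpus from data/{dataset}
    "2_training_encoder.py",  # Step 2: train the GAT encoder embeddings
    "3_training_decoder.py",  # Step 3: train the convolutional decoder
    "4_evalution.py",         # Step 4: evaluate link prediction
]

for script in STEPS:
    print(f"==> running {script}")
    subprocess.run([sys.executable, script], check=True)  # stop on failure
```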
Supported datasets:
- FB15k (Freebase)
- FB15k-237
- WN18 (WordNet)
- WN18RR
Email: [email protected] | [email protected]
- MIT license
- Copyright 2020 © Minh-Thanh Hoang.
GCAT was modified from the KBGAT repository (https://github.com/deepakn97/relationPrediction).