This repo provides the source code for our paper "TuneUp: A Simple Improved Training Strategy for Graph Neural Networks".
Despite advances in GNNs, their training often simply minimizes the loss across all graph nodes, ignoring that some nodes are much harder to predict than others. We present TuneUp, a curriculum learning approach for enhanced GNN training. It employs a two-stage process: the first stage trains the base GNN to perform well overall, and the second stage refines its predictions on hard-to-predict tail nodes. TuneUp is versatile: it is compatible with any GNN architecture and loss function, making it suitable for diverse prediction tasks.
Install the tuneupcommon package:
pip install -e .
Then go to the directory for your task of interest.

For node classification:
cd nodecls/

For link prediction:
cd linkpred/

For recommender systems, first preprocess the datasets, then enter the task directory:
cd dataset_preprocessing/recsys
python preprocess_lightgcn_datasets.py
cd ../../
cd recsys/
After you are in the directory for your task of interest, run all the experiments as follows:
seed=0
device=0
sh all_experiments.sh $seed $device
where you can vary the random seed from 0 to 4.
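
To sweep all five seeds on a single device, the same script can be driven by a simple loop. This is a sketch: it assumes all_experiments.sh accepts the seed and device arguments exactly as shown above.

```shell
# Sweep random seeds 0-4 on device 0 (sketch; assumes all_experiments.sh as above)
device=0
for seed in 0 1 2 3 4; do
    sh all_experiments.sh $seed $device
done
```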
We used the following Python packages for core development. We tested on Python 3.8.
pytorch 1.12.1+cu113
torch-geometric 2.2.0
tqdm 4.59.0