TuneUp

This repo provides the source code for our paper "TuneUp: A Simple Improved Training Strategy for Graph Neural Networks".

Motivation and Overview of TuneUp

Despite advances in GNN architectures, GNN training typically minimizes the loss uniformly over all nodes in the graph, ignoring that some nodes are much harder to predict than others. We present TuneUp, a curriculum-learning-based training strategy for GNNs. TuneUp trains a GNN in two stages: the first stage trains a base GNN to perform well overall, and the second stage further trains it to improve predictions on the difficult tail nodes (nodes with few neighbors). TuneUp is generic: it can be used with any GNN architecture and loss function, making it applicable to a wide range of prediction tasks.

Usage

Install the tuneupcommon package:

pip install -e .
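
For example, starting from a fresh clone (assuming the repository URL follows from the repository name snap-stanford/tuneup):

git clone https://github.com/snap-stanford/tuneup.git
cd tuneup
pip install -e .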

Go to the directory for each task:

Semi-supervised node classification

cd nodecls/

Link prediction

cd linkpred/

Recommender systems

cd dataset_preprocessing/recsys
python preprocess_lightgcn_datasets.py
cd ../../
cd recsys/

After you are in the directory for your task of interest, run all the experiments as follows:

seed=0
device=0
sh all_experiments.sh $seed $device

where you can vary the random seed from 0 to 4.
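
For example, a minimal shell loop that runs all five seeds on GPU 0, passing the same two arguments to all_experiments.sh as above:

device=0
for seed in 0 1 2 3 4; do
    sh all_experiments.sh $seed $device
done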

Installation

We used the following Python packages for core development. We tested on Python 3.8.

pytorch                   1.12.1+cu113
torch-geometric           2.2.0
tqdm                      4.59.0
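
A possible way to install these versions with pip (a sketch, not the verified setup: https://download.pytorch.org/whl/cu113 is the standard PyTorch wheel index for CUDA 11.3, and torch-geometric may additionally require companion packages such as torch-scatter and torch-sparse matched to your torch/CUDA versions):

pip install torch==1.12.1+cu113 --extra-index-url https://download.pytorch.org/whl/cu113
pip install torch-geometric==2.2.0
pip install tqdm==4.59.0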
