Repository for the paper "Incorporating Attribution Importance for Improving Faithfulness Metrics" to appear at ACL 2023.
We promise the repo will be tidied up shortly.
Install the necessary packages using the files conda_reqs.txt and pip_reqs.txt:
conda create --name flexi --file conda_reqs.txt
conda activate flexi
pip install -r pip_reqs.txt
python -m spacy download en_core_web_sm
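Optionally, you can confirm that the spaCy model from the last step was installed correctly with a one-liner such as:

python -c "import spacy; spacy.load('en_core_web_sm'); print('spaCy model OK')"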
You can run the Jupyter notebooks found under tasks/task_name/*.ipynb to generate a filtered, processed CSV file and a pickle file used for training the models.
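For example, a notebook can be executed non-interactively with jupyter nbconvert; the path below is only a placeholder and should be replaced with the actual notebook under tasks/task_name/:

# Placeholder path: use the actual notebook under tasks/<task_name>/
jupyter nbconvert --to notebook --execute --inplace tasks/task_name/preprocess.ipynb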
Train the models across three random seeds:

dataset="evinf"
data_dir="datasets/"
model_dir="trained_models/"
for seed in 5 10 15
do
python finetune_on_ful.py --dataset $dataset \
--model_dir $model_dir \
--data_dir $data_dir \
--seed $seed
done
To evaluate the trained models, add the --evaluate_models flag:

python finetune_on_ful.py --dataset $dataset \
--model_dir $model_dir \
--data_dir $data_dir \
--seed $seed \
--evaluate_models
To extract rationales from the trained models, run:

divergence="jsd"
extracted_rationale_dir="extracted_rationales/"
python extract_rationales.py --dataset $dataset \
--model_dir $model_dir \
--data_dir $data_dir \
--extracted_rationale_dir $extracted_rationale_dir \
--extract_double \
--divergence $divergence
The --extract_double flag is optional and is used to double the length of the extracted rationales.
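For standard-length rationales, the same command is simply run without the optional flag:

python extract_rationales.py --dataset $dataset \
                             --model_dir $model_dir \
                             --data_dir $data_dir \
                             --extracted_rationale_dir $extracted_rationale_dir \
                             --divergence $divergence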
To compute the faithfulness metrics, run:

extracted_rationale_dir="extracted_rationales/"
evaluation_dir="faithfulness_metrics/"
python evaluate_posthoc_zeroout.py --dataset $dataset \
--model_dir $model_dir \
--extracted_rationale_dir $extracted_rationale_dir \
--data_dir $data_dir \
--evaluation_dir $evaluation_dir \
--thresholder $thresh
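The steps above can be chained into a single script, sketched below. It re-uses only the commands already shown; note that $thresh is not defined anywhere above, so the thresh="topk" value here is only an assumed example and should be replaced with the thresholder your setup expects.

dataset="evinf"
data_dir="datasets/"
model_dir="trained_models/"
extracted_rationale_dir="extracted_rationales/"
evaluation_dir="faithfulness_metrics/"
divergence="jsd"
thresh="topk"  # assumed example value, replace with the thresholder you use

# 1. Fine-tune models across three seeds
for seed in 5 10 15
do
python finetune_on_ful.py --dataset $dataset \
                          --model_dir $model_dir \
                          --data_dir $data_dir \
                          --seed $seed
done

# 2. Evaluate the trained models
python finetune_on_ful.py --dataset $dataset \
                          --model_dir $model_dir \
                          --data_dir $data_dir \
                          --seed $seed \
                          --evaluate_models

# 3. Extract rationales
python extract_rationales.py --dataset $dataset \
                             --model_dir $model_dir \
                             --data_dir $data_dir \
                             --extracted_rationale_dir $extracted_rationale_dir \
                             --extract_double \
                             --divergence $divergence

# 4. Compute faithfulness metrics
python evaluate_posthoc_zeroout.py --dataset $dataset \
                                   --model_dir $model_dir \
                                   --extracted_rationale_dir $extracted_rationale_dir \
                                   --data_dir $data_dir \
                                   --evaluation_dir $evaluation_dir \
                                   --thresholder $thresh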