The code repository for the paper "Two Heads Are Better Than One: Exploiting Both Sequence and Graph Models in AMR-To-Text Generation".
cd pre-training
pip install -r requirements.txt
pip install amrlib

Change the path in get_data.sh, then run:
sh get_data.sh

The silver data will then be available in silver_data/training/.
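
amrlib is presumably what get_data.sh uses to produce the silver AMR annotations. Purely as a hypothetical sketch of its sentence-to-graph API (not the contents of get_data.sh), and assuming a parse (stog) model has already been installed as described in the amrlib documentation:

# Hypothetical illustration only; the real pipeline lives in get_data.sh.
python - <<'EOF'
import amrlib

# Requires an amrlib sentence-to-graph (stog) model to be installed beforehand.
stog = amrlib.load_stog_model()
graphs = stog.parse_sents(['The boy wants to go.'])
print(graphs[0])  # PENMAN-format AMR string
EOF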

cd fairseq
Change the path in preprocess.sh, then run:
sh preprocess.sh
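
preprocess.sh presumably wraps fairseq's data binarization. As a rough, hypothetical sketch of the kind of command it runs (language suffixes, paths, and the output directory are placeholders, not taken from the script):

# Hypothetical invocation; see preprocess.sh for the actual arguments.
fairseq-preprocess \
  --source-lang amr --target-lang text \
  --trainpref /path/to/data/train \
  --validpref /path/to/data/valid \
  --testpref /path/to/data/test \
  --destdir data-bin/amr2text \
  --workers 8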

cd fairseq
Change the path in pre-train.sh, then run:
sh pre-train.sh
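
pre-train.sh launches fairseq training, presumably over the silver data. A minimal, hypothetical sketch of such a command (architecture, hyperparameters, and paths below are placeholders, not the script's actual settings):

# Hypothetical invocation; see pre-train.sh for the actual arguments.
fairseq-train data-bin/amr2text \
  --arch transformer \
  --optimizer adam --adam-betas '(0.9, 0.98)' \
  --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
  --max-tokens 4096 \
  --save-dir checkpoints/pre-train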

cd fairseq
Change the path in fine-tune.sh, then run:
sh fine-tune.sh
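
fine-tune.sh presumably continues training from the pre-trained checkpoint. The usual fairseq pattern for this (again a hypothetical sketch with placeholder paths, not the script's actual arguments) is to restore the checkpoint and reset the optimizer state:

# Hypothetical invocation; see fine-tune.sh for the actual arguments.
fairseq-train data-bin/fine-tune-data \
  --arch transformer \
  --restore-file checkpoints/pre-train/checkpoint_best.pt \
  --reset-optimizer --reset-lr-scheduler --reset-dataloader --reset-meters \
  --optimizer adam --lr 1e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
  --max-tokens 4096 \
  --save-dir checkpoints/fine-tune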

cd fairseq
Change the path in train.sh, then run:
sh train.sh