Group project by Grace Wang, Nobline Yoo, Richard Zhu
Based significantly on code from Li and Liang (2021), "Prefix-Tuning: Optimizing Continuous Prompts for Generation". The authors also provide a CodaLab worksheet for reproducing their experiments.
Download and place the following additional folders/files into the indicated directories (sources in parentheses):

- /e2e-metrics/pycocoevalcap/ (from the /pycocoevalcap/ folder in this repository)
- /gpt2-medium-s3/ (installed automatically when dependencies are installed from the transformers folder, via "pip install transformers/" at the repository root /)
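For reference, a minimal shell sketch of the placement step, assuming you are at the repository root and the bundled /pycocoevalcap/ folder sits there (paths as listed above):

# Copy the bundled pycocoevalcap folder into e2e-metrics
# (assumes the repository root is the current directory)
mkdir -p e2e-metrics
cp -r pycocoevalcap e2e-metrics/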
To train the baseline, run
pip install transformers/ && python gpt2/train_e2e.py --preseqlen 10 --learning_rate 0.00005 --seed 22 --epoch 10 --notes earlystop > stdout_baseline_rgn.txt

To run the prefix initialization ablation, run
pip install transformers/ && python gpt2/train_e2e.py --preseqlen 10 --learning_rate 0.00005 --init_random table2text --seed 22 --epoch 10 --notes earlystop > stdout_ablation_rgn.txt
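Both commands redirect stdout to a log file. A minimal sketch for following training progress while a run is in flight (standard shell only, nothing repo-specific assumed):

# Stream the baseline log as training writes to it
tail -f stdout_baseline_rgn.txt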