
# NLP Final Project: Prefix Tuning

Group project by Grace Wang, Nobline Yoo, Richard Zhu

## Paper

Based significantly on code from Li and Liang 2021 ("Prefix-Tuning: Optimizing Continuous Prompts for Generation"). The authors also provide a CodaLab worksheet.

Download the following additional folders/files and place them into the indicated directories (sources in parentheses):

- `/e2e-metrics/pycocoevalcap/` (from `/pycocoevalcap/` in this repository)
- `/gpt2-medium-s3/` (installed automatically when installing the bundled dependencies with `pip install transformers/` from the repository root `/`)

To train the baseline, run:

```
pip install transformers/ && python gpt2/train_e2e.py --preseqlen 10 --learning_rate 0.00005 --seed 22 --epoch 10 --notes earlystop > stdout_baseline_rgn.txt
```
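Conceptually, the command above trains only a small prefix while the pretrained GPT-2 weights stay frozen. A minimal PyTorch sketch of that setup is below; the class and dimensions are hypothetical stand-ins (not the repository's actual code), mirroring Li and Liang's reparameterization of a `preseqlen`-long prefix into per-layer key/value vectors:

```python
import torch
import torch.nn as nn

class PrefixEncoder(nn.Module):
    """Hypothetical sketch: learnable prefix reparameterized through an MLP
    into per-layer key/value vectors, in the spirit of Li & Liang (2021)."""

    def __init__(self, preseqlen=10, n_layer=2, n_head=4, d_model=16, d_mid=8):
        super().__init__()
        self.preseqlen, self.n_layer = preseqlen, n_layer
        self.n_head, self.d_model = n_head, d_model
        self.wte = nn.Embedding(preseqlen, d_model)      # prefix "token" embeddings
        self.mlp = nn.Sequential(                        # reparameterization MLP
            nn.Linear(d_model, d_mid), nn.Tanh(),
            nn.Linear(d_mid, n_layer * 2 * d_model),     # one key + one value per layer
        )

    def forward(self, batch_size):
        h = self.mlp(self.wte(torch.arange(self.preseqlen)))
        h = h.view(self.preseqlen, self.n_layer * 2, self.n_head,
                   self.d_model // self.n_head)
        h = h.unsqueeze(0).expand(batch_size, -1, -1, -1, -1)
        # -> one (2, batch, n_head, preseqlen, head_dim) tensor per layer,
        #    the shape GPT-2-style models accept as past_key_values
        return h.permute(2, 0, 3, 1, 4).split(2, dim=0)

lm = nn.Linear(16, 100)                  # toy stand-in for the frozen pretrained LM
for p in lm.parameters():
    p.requires_grad = False              # freeze the LM; only the prefix trains
prefix = PrefixEncoder(preseqlen=10)
optimizer = torch.optim.AdamW(prefix.parameters(), lr=5e-5)  # matches --learning_rate
```

Because the optimizer only sees the prefix parameters, checkpoints are tiny relative to full fine-tuning.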

To run the prefix initialization ablation, run:

```
pip install transformers/ && python gpt2/train_e2e.py --preseqlen 10 --learning_rate 0.00005 --init_random table2text --seed 22 --epoch 10 --notes earlystop > stdout_baseline_rgn.txt
```
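The `--init_random` ablation contrasts two ways of initializing the prefix: random vectors versus embeddings of real vocabulary tokens, an initialization the original paper reports as more stable in low-data settings. A hypothetical sketch of the two strategies (the embedding table and token ids here are illustrative, not the repository's):

```python
import torch
import torch.nn as nn

torch.manual_seed(22)                    # matches --seed 22

d_model, preseqlen = 16, 10
vocab_embeddings = nn.Embedding(100, d_model)   # stand-in for the LM's embedding table

# Strategy 1: random initialization (what the --init_random ablation exercises).
prefix_random = nn.Parameter(torch.randn(preseqlen, d_model))

# Strategy 2: initialize from embeddings of actual vocabulary tokens
# (hypothetical task-word ids), then fine-tune as free parameters.
token_ids = torch.arange(preseqlen)
prefix_from_tokens = nn.Parameter(vocab_embeddings(token_ids).detach().clone())
```

Both variants have identical shapes and train identically downstream; only the starting point differs.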
