
rf-bert

Repository for a BERT implementation of "Retrofitting Contextualized Word Embeddings with Paraphrases" [1], along with further work on improving BERT's robustness.

Forthcoming Contributions

  1. Retrofitted-BERT in PyTorch (a minimal sketch of the idea follows below)
  2. Experiments testing rf-BERT robustness under various adversarial attacks
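
Until the PyTorch code lands, the sketch below illustrates the general idea behind item 1: pull BERT's contextualized embeddings of a word shared by a paraphrase pair closer together while pushing embeddings of unrelated words apart. This is a minimal illustration, not this repository's implementation or the exact objective of Shi et al. [1]; the function names (`embed_word`, `retrofit_loss`), the margin value, and the toy sentences are placeholders, and it assumes Hugging Face `transformers` is installed.

```python
# Illustrative sketch only -- simplified retrofitting-style objective inspired by [1].
import torch
import torch.nn.functional as F
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")


def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Contextualized embedding of the first occurrence of `word` in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():  # forward pass only; a real retrofit run would fine-tune
        hidden = bert(**enc).last_hidden_state[0]            # (seq_len, hidden)
    word_id = tokenizer.convert_tokens_to_ids(word)
    pos = (enc["input_ids"][0] == word_id).nonzero()[0, 0]   # first matching position
    return hidden[pos]


def retrofit_loss(anchor, positive, negative, margin: float = 0.4) -> torch.Tensor:
    """Hinge-style objective: the same word across a paraphrase pair should be
    closer (in cosine distance) than a different word, by at least `margin`."""
    pos_dist = 1 - F.cosine_similarity(anchor, positive, dim=0)
    neg_dist = 1 - F.cosine_similarity(anchor, negative, dim=0)
    return F.relu(pos_dist - neg_dist + margin)


# Toy usage: a paraphrase pair sharing the word "movie", plus an unrelated word.
a = embed_word("The movie was fantastic.", "movie")
p = embed_word("That movie turned out to be fantastic.", "movie")
n = embed_word("The weather was fantastic.", "weather")
print(retrofit_loss(a, p, n))
```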

References

[1]

@inproceedings{shi-etal-2019-retrofitting,
    title = {Retrofitting Contextualized Word Embeddings with Paraphrases},
    author = {Shi, Weijia and Chen, Muhao and Zhou, Pei and Chang, Kai-Wei},
    booktitle = {Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)},
    year = {2019},
    url = {https://aclanthology.org/D19-1113},
    pages = {1198--1203}
}
