
learn-BERT

Learn BERT from scratch, pre-training practical notes

BERT is a pre-trained language model proposed by Google AI Research in October 2018. At the time, it achieved state-of-the-art (SOTA) results on 11 different NLP tasks. This repo records some of the pitfalls I ran into when pre-training BERT from scratch.
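Since this repo is about pre-training from scratch, here is a minimal sketch of the core masked-language-model (MLM) corruption step that BERT pre-training uses: roughly 15% of positions are selected, and of those, 80% are replaced with `[MASK]`, 10% with a random token, and 10% left unchanged. The `MASK_ID` and `VOCAB_SIZE` values below assume the BERT-base WordPiece vocabulary; adjust them for your tokenizer.

```python
import random

MASK_ID = 103       # assumed [MASK] id for BERT-base WordPiece; adjust for your tokenizer
VOCAB_SIZE = 30522  # assumed BERT-base vocab size

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """BERT-style MLM corruption.

    Selects ~mask_prob of positions; of those, 80% become [MASK],
    10% become a random token, 10% stay unchanged. Returns
    (inputs, labels), where labels are -100 at unselected positions
    so the loss is only computed on corrupted ones.
    """
    rng = rng or random.Random()
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok          # predict the original token here
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK_ID              # 80%: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # else: 10% keep the original token
    return inputs, labels
```

One common pitfall this illustrates: the model must sometimes see the *original* token at a masked position (the 10% "unchanged" case), otherwise it never learns to produce useful representations for non-`[MASK]` inputs at fine-tuning time.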
