Request for using TopClus on different pretrained language models #4
Open
Hi,
I've read your paper and I like this approach. Thank you for sharing the code. I have one question regarding the pretrained language models (PLMs) that you use to get the contextualized word representations. I saw in the source code that the model is fixed to the classic `bert-base-uncased`:
Line 22 in 01e22fb: `pretrained_lm = 'bert-base-uncased'`
Suppose I'm interested in using this method on a corpus of Italian texts. In that case, would it be possible to change this model and use `bert-base-multilingual-uncased` instead?
If that's possible, could `pretrained_lm` be made a parameter of the `TopClusTrainer`?
Thank you.
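To make the request concrete, here is a minimal sketch of how the hard-coded model name could be lifted into a constructor argument. The `TopClusTrainer` signature shown here is hypothetical; the real class in the TopClus repository may look different.

```python
# Sketch: replace the module-level constant with a constructor parameter.
# The TopClusTrainer interface below is hypothetical and only illustrates
# the requested change; it is not the actual TopClus implementation.

class TopClusTrainer:
    def __init__(self, pretrained_lm: str = "bert-base-uncased"):
        # Defaulting to 'bert-base-uncased' preserves current behavior;
        # any other Hugging Face model id (e.g. a multilingual BERT)
        # could be passed in for non-English corpora.
        self.pretrained_lm = pretrained_lm


# Current behavior, unchanged:
trainer_en = TopClusTrainer()

# Italian (or other non-English) corpus:
trainer_multi = TopClusTrainer(pretrained_lm="bert-base-multilingual-uncased")
```

The trainer would then load its tokenizer and encoder from `self.pretrained_lm` (e.g. via `AutoTokenizer.from_pretrained` / `AutoModel.from_pretrained`) instead of the fixed string.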