Thank you for sharing your excellent research and code with the community. I have a question regarding the initialization of the projection head.
It appears that the head of the pre-trained encoder (e.g., model.head = build_mlp(3, hidden_dim, 4096, proj_dim) in line 115 of rcg/pretrained_enc/models_pretrained_enc.py) is randomly initialized and never trained. However, the checkpoint pretrained_enc_ckpts/mocov3/vitb.pth.tar seems to include the encoder head as well. Was the head in pretrained_enc_ckpts/mocov3/vitb.pth.tar also randomly initialized? Was there an intentional reason for leaving the randomly initialized head untrained?
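For reference, this is a minimal sketch (not part of the repository) of how one could inspect the checkpoint to see whether it actually contains projection-head weights; the "state_dict" wrapper key and the "head" substring in the parameter names are assumptions about the MoCo v3 checkpoint layout and may differ in practice.

```python
import torch

# Load the checkpoint on CPU; .pth.tar files are plain torch-serialized dicts.
ckpt = torch.load("pretrained_enc_ckpts/mocov3/vitb.pth.tar", map_location="cpu")

# MoCo v3 checkpoints typically wrap the weights under a "state_dict" key
# (assumption); fall back to the top-level dict otherwise.
state_dict = ckpt.get("state_dict", ckpt)

# List any parameters whose names look like projection-head weights.
head_keys = [k for k in state_dict if "head" in k]
print("head-related keys in checkpoint:", head_keys)
```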