Replies: 1 comment 1 reply
-
The easiest method I've used: when loading a batch of B images, use the other B-1 images in the batch as negative examples for each anchor. This only requires a single beton. Is this not suitable for your application? Code example:

```python
import numpy as np
import torch

# For example, with BATCH_SIZE = 3 this produces image pairs of indices
# positive: [0, 0, 1, 1, 2, 2]
# negative: [1, 2, 0, 2, 0, 1]
for images in beton:
    positive_indices = np.arange(BATCH_SIZE)
    negative_indices = np.array([j for i in positive_indices
                                 for j in positive_indices if i != j])
    positive_tensor = torch.repeat_interleave(images, BATCH_SIZE - 1, dim=0)
    negative_tensor = images[negative_indices]
```
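To make the pairing concrete, here is one way the resulting tensors could feed a loss. This is a sketch only: `encoder` and the margin value are assumptions for illustration, not part of the original suggestion.

```python
import torch.nn.functional as F

# Sketch: `encoder` is a hypothetical embedding model, and the 0.5 margin
# is an arbitrary choice. Positive (augmented-view) pairs would be handled
# separately; this term just pushes each anchor away from its in-batch
# negatives.
anchor_emb = F.normalize(encoder(positive_tensor), dim=1)
negative_emb = F.normalize(encoder(negative_tensor), dim=1)
cos_sim = (anchor_emb * negative_emb).sum(dim=1)  # cosine similarity per pair
negative_loss = F.relu(cos_sim - 0.5).mean()      # penalize similarity above the margin
```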
Or if you only have one type of embedding, this is simpler and more efficient:

```python
import torch
from pytorch_metric_learning.losses import NTXentLoss

# Cosine similarity is the default distance for NTXentLoss in
# pytorch-metric-learning, so no extra flag is needed.
loss_func = NTXentLoss(temperature=0.5)
emb = torch.randn([BATCH_SIZE, 128])  # Embeddings from your model
labels = torch.arange(BATCH_SIZE)
# PML passes all combinations of pairs through the loss function
loss = loss_func(emb, labels)
# Or, if you have two tensors of different embeddings:
# loss = loss_func(emb_1, labels, ref_emb=emb_2, ref_labels=labels)
```
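For the two-embeddings case, a sketch of how this could sit in a training loop; `loader`, `model`, and `augment` are hypothetical stand-ins, not FFCV or PML API:

```python
# Each image contributes two augmented views; positions match across the
# two embedding tensors, so ref_emb/ref_labels pair them up as positives.
for images, _ in loader:
    emb_1 = model(augment(images))
    emb_2 = model(augment(images))
    labels = torch.arange(emb_1.size(0), device=emb_1.device)
    loss = loss_func(emb_1, labels, ref_emb=emb_2, ref_labels=labels)
```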
-
What is the easiest way to use contrastive learning with `ffcv`? Let's say I want to train on CIFAR-10. For my purposes, I generate the triplets myself (where each element in the triplet corresponds to the index of an image), and I want to use exactly these triplets.

The ways I have in mind are too hacky, e.g. they'll require me to write my own `DatasetWriter` (so that I can create datasets with fields like `image1`, `image2`, `image3`) or `OrderOption` (so that I can `zip` multiple `Loader`s, where each loader corresponds to an element of the triplet) or something else. Is there a simple way?

I see this discussion: #82, which is closed, but I don't see any information about how to actually do that.
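For reference, a sketch of the multi-field beton idea described above, using FFCV's stock writer and loader rather than a custom `DatasetWriter`; `triplet_dataset` (returning three images per index) and the file name are assumptions:

```python
from ffcv.writer import DatasetWriter
from ffcv.fields import RGBImageField
from ffcv.fields.decoders import SimpleRGBImageDecoder
from ffcv.loader import Loader, OrderOption
from ffcv.transforms import ToTensor, ToTorchImage

# Write one beton whose rows are (image1, image2, image3) triplets.
# `triplet_dataset[i]` is assumed to return the three images of triplet i.
writer = DatasetWriter('triplets.beton', {
    'image1': RGBImageField(),
    'image2': RGBImageField(),
    'image3': RGBImageField(),
})
writer.from_indexed_dataset(triplet_dataset)

# Each field gets its own pipeline; build a fresh list of ops per field.
def make_pipeline():
    return [SimpleRGBImageDecoder(), ToTensor(), ToTorchImage()]

loader = Loader('triplets.beton', batch_size=256, num_workers=8,
                order=OrderOption.RANDOM,
                pipelines={'image1': make_pipeline(),
                           'image2': make_pipeline(),
                           'image3': make_pipeline()})

for image1, image2, image3 in loader:
    ...  # compute a triplet loss on the three batches
```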