Description
Hi @ewrfcas,
Thanks for sharing your Keras implementation of the ArcFace loss. I recently used this loss to train my model but found something that really confuses me.
In the code below, `y_mask =+ K.epsilon()` makes `y_mask` always equal to `K.epsilon()` (which defaults to 1e-7). This makes the whole loss effectively equivalent to plain softmax, since the term `cos_tm_temp * y_mask` is almost entirely eliminated.
On the other hand, I tried deleting this line so that `y_mask` stays the one-hot true label, but then the loss becomes a constant and the weights & biases no longer update.
So I wonder if you have any advice on this. BTW, do you know how to print out intermediate values inside a custom loss layer? I debugged the loss function line by line and it seems right, but the results during training are not. :(
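On printing intermediate values: one option (a sketch, not taken from the repo's code) is to wrap tensors with `tf.print`, which executes at graph run time rather than at trace time, so the values appear on every training step:

```python
import tensorflow as tf

def debug_loss(y_true, y_pred):
    # tf.print runs at graph execution time, so this prints every step,
    # unlike Python's print, which only runs once while tracing.
    diff = y_true - y_pred
    tf.print("mean abs diff:", tf.reduce_mean(tf.abs(diff)))
    return tf.reduce_mean(tf.square(diff))

# Minimal check outside a model (hypothetical values):
y_true = tf.constant([[1.0, 0.0]])
y_pred = tf.constant([[0.8, 0.2]])
loss = debug_loss(y_true, y_pred)
```

The same trick works inside a custom loss layer's `call` method; `K.print_tensor` from the Keras backend behaves similarly.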
Machine-Learning-Toolbox/loss_function/ArcFace_loss.py
Lines 55 to 58 in 127d6e5
```python
y_mask =+ K.epsilon()
inv_mask = 1. - y_mask
s_cos_theta = self.s * cos_theta
output = K.softmax((s_cos_theta * inv_mask) + (cos_tm_temp * y_mask))
```
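For reference, a tiny sketch of why `=+` differs from `+=` in Python (hypothetical scalar values standing in for the one-hot tensor):

```python
eps = 1e-7

y_mask = 1.0   # stand-in for a one-hot entry of the true label
y_mask =+ eps  # parsed as y_mask = (+eps): the one-hot value is discarded

y_mask2 = 1.0
y_mask2 += eps  # in-place addition, presumably the intended behavior
```

So `=+` silently overwrites `y_mask` with the constant epsilon, which is exactly what collapses the ArcFace term described above.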