What's the softmax temperature? #23
Open
Description
I'm building this from scratch to avoid the additional CPP code, and it seems to be working. However, when I compute
and use that as logits for the softmax, the "a" term tends to be very small on large images, and "f" is normalized, so their scalar product yields logits very close to 0. Since softmax is not scale invariant, I get close to uniform predictions...
So I think you are using some form of temperature, but I see no reference to it in either the code or the paper... could I have some clarification?
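To illustrate the scale sensitivity mentioned above (a minimal sketch, not code from this repo): shrinking all logits toward zero drives the softmax output toward the uniform distribution, since softmax only looks at differences between logits.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
print(softmax(logits))         # clearly peaked on the first class
print(softmax(0.01 * logits))  # scaled-down logits: output is nearly uniform
```

Multiplying the logits by a temperature > 1 before the softmax restores the contrast, which is what the trainable-temperature layer below attempts automatically.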
At the moment the best I have been able to do (without hand-picking the temperature) is to make it trainable:
```python
import tensorflow as tf

class SoftmaxWithTemperature(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # learnable temperature, initialized to 1
        self.t = self.add_weight("temperature", (1,), initializer=tf.initializers.ones())

    def call(self, inputs):
        return tf.nn.softmax(self.t * inputs)

classification_network = tf.keras.models.Sequential([
    SoftmaxWithTemperature()
])

classification_network(tf.reshape(features * probabilities, ...))
```

And while training, I see the temperature slowly increasing to 1.2, which is still not enough (at least in my case).
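A possible refinement of the snippet above (an assumption on my part, not something this repo does): learn the *log* of the scale instead of the scale itself, as CLIP does with its `logit_scale` parameter. The applied scale `exp(log_t)` is then always positive and can grow multiplicatively, so it escapes the slow linear drift toward 1.2 described above. The class name and `init_scale` parameter are hypothetical.

```python
import math
import tensorflow as tf

class SoftmaxWithLogTemperature(tf.keras.layers.Layer):
    """Sketch of a log-parameterized trainable temperature.

    The trainable variable is log_t; the scale applied to the logits is
    exp(log_t), which stays positive and can increase multiplicatively.
    """
    def __init__(self, init_scale=1.0, **kwargs):
        super().__init__(**kwargs)
        self.log_t = self.add_weight(
            name="log_temperature",
            shape=(1,),
            initializer=tf.initializers.constant(math.log(init_scale)),
        )

    def call(self, inputs):
        # scale logits by exp(log_t) before the softmax
        return tf.nn.softmax(tf.exp(self.log_t) * inputs)
```

CLIP additionally clamps the learned scale to an upper bound to keep training stable; a similar `tf.clip_by_value` on `self.log_t` could be added here if the scale grows too large.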