"I'll build a network with `nn.Sequential` here. Only difference from the last part is I'm not actually using softmax on the output, but instead just using the raw output from the last layer. This is because the output from softmax is a probability distribution. Often, the output will have values really close to zero or really close to one. Due to [inaccuracies with representing numbers as floating points](https://docs.python.org/3/tutorial/floatingpoint.html), computations with a softmax output can lose accuracy and become unstable. To get around this, we'll use the raw output, called the **logits**, to calculate the loss."
"Often, the output will have values really close to zero or really close to one. Due to inaccuracies with representing numbers as floating points, computations with a softmax output can lose accuracy and become unstable. To get around this, we'll use the raw output, called the logits, to calculate the loss."
The description for `nn.CrossEntropyLoss()` says: "This criterion combines `nn.LogSoftmax` and `nn.NLLLoss` in one single class."
I'm a bit confused by this explanation: if computations with a softmax output are numerically unstable, why does the `nn.CrossEntropyLoss()` criterion itself contain an `nn.LogSoftmax` function?
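To make the documented combination concrete, here is a quick check (the tensor shapes are arbitrary): passing raw logits to `nn.CrossEntropyLoss` gives the same value as applying `nn.LogSoftmax` and `nn.NLLLoss` yourself in two steps.

```python
import torch
from torch import nn

logits = torch.randn(5, 10)            # raw network output: 5 samples, 10 classes
targets = torch.randint(0, 10, (5,))   # class indices

# CrossEntropyLoss applied directly to the raw logits...
ce = nn.CrossEntropyLoss()(logits, targets)

# ...matches LogSoftmax followed by NLLLoss applied separately.
log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss()(log_probs, targets)

print(torch.allclose(ce, nll))  # True
```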