Hello,

I have used this package and it's awesome, but I think there is an issue with `NN.prototype.backPropagate`.

In gradient descent you should use the derivative of the activation function of each layer. However, the code never calls `differentiate` from the `activation` file; it seems the derivative of sigmoid is hard-coded for both the hidden and the output layer, which is not always correct.

Take x^2 as an example: if you train on it with the settings you provided, it will fail, because the output needs to stay unbounded (a linear/identity output), while sigmoid squashes everything into (0, 1).
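To illustrate what I mean, here is a minimal sketch (hypothetical names, not this package's actual API) of looking up each layer's own derivative instead of hard-coding sigmoid's:

```javascript
// Each activation carries its own derivative; backprop should look up
// the derivative of the layer's configured activation, not assume sigmoid.
const activations = {
  sigmoid: {
    fn: x => 1 / (1 + Math.exp(-x)),
    // derivative written in terms of the activation output a = sigmoid(x)
    dfn: a => a * (1 - a),
  },
  identity: {
    // linear output layer, needed for unbounded targets like x^2
    fn: x => x,
    dfn: _a => 1,
  },
};

// Output-layer delta for squared error: (prediction - target) * g'(a)
function outputDelta(activationName, prediction, target) {
  const g = activations[activationName];
  return (prediction - target) * g.dfn(prediction);
}

// With an identity output the delta is the raw error, so targets
// outside (0, 1) such as x^2 values can still be fitted.
console.log(outputDelta('identity', 4.0, 9.0));
// With the sigmoid derivative applied instead, the gradient is scaled
// by a * (1 - a), which also vanishes as the output saturates.
console.log(outputDelta('sigmoid', 0.5, 1.0));
```

The point is just that `backPropagate` would pick `dfn` per layer, rather than always multiplying by `a * (1 - a)`.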