Description
I attempted to train a single-hidden-layer MHN using this library. However, the library does not allow setting tau separately for each layer; it is fixed at 1. According to the HAM paper, this is not appropriate for deeper models. In the single-hidden-layer case, training proceeds quite well, but in a multi-layer architecture the tau values need to be set differently for each layer. When I extend to a multi-layer model and set the tau values arbitrarily before training with backpropagation, training does not work properly.
If anyone has successfully trained a multi-layer model with this library, or has experimented with varying tau in the single-hidden-layer case and observed successful training, I would greatly appreciate any insights or tips on how to set tau and the number of relaxation time steps used in training (nsteps).
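For concreteness, here is a minimal sketch of what per-layer tau means in the HAM-style relaxation dynamics, independent of this library's API (which, as noted above, fixes tau at 1). This is a hand-rolled Euler discretization of a two-hidden-layer network where each layer integrates its own ODE, `tau_l * dx_l/dt = -x_l + (input from neighboring layers)`. All names here (`relax`, `taus`, `nsteps`, `W1`, `W2`) are illustrative, not identifiers from the library, and the ReLU activation and weight scales are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
g = lambda x: np.maximum(x, 0.0)  # layer activation (ReLU, for illustration)

d0, d1, d2 = 8, 16, 4                        # visible and two hidden layer sizes
W1 = rng.normal(scale=0.1, size=(d1, d0))    # connects layer 0 -> layer 1
W2 = rng.normal(scale=0.1, size=(d2, d1))    # connects layer 1 -> layer 2

def relax(v, taus=(1.0, 0.5), dt=0.1, nsteps=50):
    """Relax hidden states toward a fixed point, one time constant per layer.

    Each hidden layer l follows tau_l * dx_l/dt = -x_l + feedforward + feedback,
    so a smaller tau makes that layer equilibrate faster relative to the others.
    """
    x1 = np.zeros(d1)
    x2 = np.zeros(d2)
    for _ in range(nsteps):
        # each layer's update is scaled by its own tau
        dx1 = (-x1 + W1 @ g(v) + W2.T @ g(x2)) / taus[0]
        dx2 = (-x2 + W2 @ g(x1)) / taus[1]
        x1 = x1 + dt * dx1
        x2 = x2 + dt * dx2
    return x1, x2

v = rng.normal(size=d0)          # a clamped visible pattern
x1, x2 = relax(v)
print(x1.shape, x2.shape)
```

The interaction the issue describes shows up here directly: the effective step size for layer l is `dt / taus[l]`, so tau and nsteps trade off against each other, and a poorly chosen tau can leave a layer far from its fixed point (hurting the backprop-through-relaxation gradient) or make the Euler integration unstable.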