
Difficulty Training Multi-Layer MHN with Custom Tau Settings #5

@WOOYULJUNG

Description


I attempted to train an MHN with a single hidden layer using this library. However, the library does not allow setting tau separately for each layer; it is fixed at 1. According to the HAM paper, this is not appropriate. In the single-hidden-layer case, training proceeds quite well, but in a multi-layer architecture the tau values need to differ per layer. When I extend to a multi-layer model, pick tau values arbitrarily, and train with backpropagation, training fails.

If anyone has successfully trained a multi-layer model using this library, or if anyone has experimented with varying tau in a single hidden layer and observed successful training, I would greatly appreciate any insights or tips on how to set tau and the number of time steps used in training (nsteps).
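For reference, here is a minimal NumPy sketch (not this library's API; all names such as `relax`, `W1`, `W2` are hypothetical) of what a per-layer tau means in the Euler-discretized dynamics: each layer integrates its update with its own effective step `dt / tau_l`, so a layer with a larger tau evolves more slowly relative to its neighbors.

```python
import numpy as np

def relax(x_in, W1, W2, taus=(1.0, 2.0), dt=0.1, nsteps=50):
    """Euler integration of a two-hidden-layer energy descent with a
    separate time constant per layer (illustrative sketch only)."""
    h1 = np.zeros(W1.shape[0])  # first hidden layer state
    h2 = np.zeros(W2.shape[0])  # second hidden layer state
    tau1, tau2 = taus
    for _ in range(nsteps):
        # layer activations (tanh as a stand-in nonlinearity)
        g1 = np.tanh(h1)
        g2 = np.tanh(h2)
        # leaky updates driven by the neighboring layers
        dh1 = -h1 + W1 @ x_in + W2.T @ g2
        dh2 = -h2 + W2 @ g1
        # per-layer Euler step: a larger tau slows that layer down
        h1 = h1 + (dt / tau1) * dh1
        h2 = h2 + (dt / tau2) * dh2
    return h1, h2
```

A practical consequence is that `dt`, the taus, and `nsteps` interact: for stability each `dt / tau_l` should stay well below 1, and a layer with a larger tau needs proportionally more steps to settle before backpropagating through the trajectory.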
