from keras.layers import Dense, Flatten, Activation, RepeatVector, Permute, multiply

# score each timestep from the character embeddings
attention = Dense(1, activation='tanh')(char_embedding_dropout)
attention = Flatten()(attention)
# normalize the per-timestep scores into attention weights
attention = Activation('softmax')(attention)
# broadcast the weights across the 400 feature dimensions
attention = RepeatVector(400)(attention)
attention = Permute([2, 1])(attention)
# apply the attention weights to the BiLSTM output
representation = multiply([BiLSTM, attention])
I'd like to ask: is this attention applied before or after the LSTM? This is my first time working with attention, so I'd appreciate an explanation.
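A minimal NumPy sketch of what the snippet computes may help clarify the data flow (the names, sequence length, and hidden size here are illustrative, not from the issue): the attention scores come from the embeddings, are softmax-normalized over timesteps, and are then used to reweight the BiLSTM output, so the weighting is applied after the LSTM even though the scores are computed from an earlier layer.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

T, H = 5, 400  # sequence length and BiLSTM feature size (illustrative)
rng = np.random.default_rng(0)

bilstm_out = rng.normal(size=(T, H))  # stand-in for the BiLSTM output
scores = rng.normal(size=(T,))        # stand-in for the Dense(1, tanh) scores

weights = softmax(scores)             # Activation('softmax') over timesteps

# RepeatVector(400) + Permute([2, 1]) amounts to broadcasting the
# (T,) weight vector across the H feature dimensions:
weighted = bilstm_out * weights[:, None]  # multiply([BiLSTM, attention])
```

Here `weights` sums to 1 over the timesteps, and `weighted` has the same shape as the BiLSTM output, with each timestep scaled by its attention weight.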