
Attention question #1

Description

@wcy969354498

from keras.layers import Dense, Flatten, Activation, RepeatVector, Permute, multiply

# score each timestep of the character embeddings: (batch, timesteps, 1)
attention = Dense(1, activation='tanh')(char_embedding_dropout)
attention = Flatten()(attention)              # (batch, timesteps)
attention = Activation('softmax')(attention)  # normalize scores into weights
attention = RepeatVector(400)(attention)      # (batch, 400, timesteps)
attention = Permute([2, 1])(attention)        # (batch, timesteps, 400)
# apply the attention weights to the BiLSTM output
representation = multiply([BiLSTM, attention])
I'd like to ask whether this attention is applied before or after the LSTM. This is the first time I've encountered this, so I'd appreciate an explanation.
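
For context, here is a minimal sketch of how I assume the layers feeding this snippet are defined. The sequence length, vocabulary size, embedding dimension, and LSTM units are my guesses, not taken from the repo; the only constraint is that the bidirectional output width must equal 400 to match RepeatVector(400) above.

from keras.layers import Input, Embedding, Dropout, Bidirectional, LSTM

MAX_LEN = 100   # assumed character sequence length
VOCAB = 5000    # assumed character vocabulary size

inputs = Input(shape=(MAX_LEN,))
char_embedding = Embedding(VOCAB, 128)(inputs)         # (batch, 100, 128)
# dropout on the embeddings; this tensor is what the Dense scoring layer reads
char_embedding_dropout = Dropout(0.5)(char_embedding)
# 200 units per direction -> 400 features per timestep: (batch, 100, 400)
BiLSTM = Bidirectional(LSTM(200, return_sequences=True))(char_embedding_dropout)

Under this reading, the attention weights are computed from the embeddings (i.e. before the LSTM) but are multiplied into the BiLSTM output (i.e. after the LSTM), which is exactly what my question is about.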
