attention weights should be reordered as well as the outputs in case inputs were sorted and packed when doing batch inference #25

@levhaikin

Description

if reorder_output:

During batch inference, if reorder_output is True (i.e. input_seqs is not a PackedSequence), the outputs are correctly reordered to match the original input order.

However, if return_attention is requested (set to True), then the attention weights are returned in an order that does not match the inputs — and that's the bug.

A possible fix could go in the following section of the code:

if reorder_output:

Similarly to how the outputs are reordered, we can add the attention-reordering code under that same if statement:

reordered_for_weights = Variable(att_weights.data.new(att_weights.size()))
reordered_for_weights[perm_idx] = att_weights
att_weights = reordered_for_weights
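The snippet above inverts the length-sort permutation: writing the sorted results back through perm_idx puts each row at its original batch position. A minimal sketch of that inversion logic in plain Python (the sequence values and w(...) weight placeholders are made up for illustration):

```python
# Hypothetical batch of sequences; packing requires sorting by length (descending).
seqs = ["ab", "cdef", "c"]
perm_idx = sorted(range(len(seqs)), key=lambda i: len(seqs[i]), reverse=True)
sorted_seqs = [seqs[i] for i in perm_idx]

# Pretend these are per-sequence attention weights computed on the sorted batch.
sorted_weights = ["w(%s)" % s for s in sorted_seqs]

# Invert the permutation: place each sorted result back at its original slot,
# which is what `reordered[perm_idx] = att_weights` does on a tensor.
restored = [None] * len(seqs)
for dst, w in zip(perm_idx, sorted_weights):
    restored[dst] = w

assert restored == ["w(ab)", "w(cdef)", "w(c)"]
```

The same indexed assignment applies to both the outputs and the attention weights, which is why putting the weight reordering under the same `if reorder_output:` branch keeps the two in sync.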

Does that make sense?
Am I missing anything?
