How to visualize the attention map #21

@its-dron

Description

I am attempting to visualize results, which is mostly handled by main.visualize(). However, the code to get the attention map has been commented out and replaced with np.zeros.

My general question is: what is the intuition behind the commented-out code? Some specifics:

  • What is i_datum?
  • What is mod_layout_choice?
  • Why is att_blob_name created the way it is?

Understanding this will be helpful, as we are also attempting to connect an additional model to the final attention map, before the softmax activation.
Thanks.
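For reference, here is a minimal sketch of how one might render an attention map once it is extracted from the network, independent of the repo's own (commented-out) code. The function names, the 14×14 attention grid size, and the blending approach are all assumptions for illustration, not this project's API; the only given from the issue is that the current code substitutes np.zeros for the real map.

```python
import numpy as np

def upsample_attention(att, image_hw):
    """Nearest-neighbor upsample of a coarse attention grid to image size.

    Assumes the image dimensions are integer multiples of the grid dimensions
    (e.g. a 14x14 grid over a 224x224 image).
    """
    H, W = image_hw
    h, w = att.shape
    # np.kron repeats each attention cell over the image region it covers
    return np.kron(att, np.ones((H // h, W // w)))

def overlay_attention(image, att, alpha=0.6):
    """Blend a min-max-normalized attention heatmap over a grayscale image in [0, 1]."""
    att = att - att.min()
    if att.max() > 0:
        att = att / att.max()
    heat = upsample_attention(att, image.shape[:2])
    return (1 - alpha) * image + alpha * heat

# Example: a hypothetical 14x14 pre-softmax attention map over a 224x224 image
image = np.zeros((224, 224))
att = np.random.rand(14, 14)
overlay = overlay_attention(image, att)
```

The resulting overlay array can then be displayed with any plotting library (e.g. matplotlib's imshow with a heatmap colormap). If you are tapping the map before the softmax, normalizing as above (or applying a spatial softmax yourself) keeps the values in a displayable range.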
