
Question about mutual information estimation #2

@Jason-cs18

Description


Thanks for your awesome work! I'm very interested in the mutual information estimation used in your paper.
According to Appendix G and your finding (Figure 6), you train an auxiliary classifier to estimate I(h, y), and end-to-end supervised training retains all task-relevant information.
From my perspective, this suggests that we don't need to build a classifier on top of the final feature map: deploying the classifier (trained on feature maps from many layers) on the first feature map would already be enough.
I'm not sure I understand this correctly. Could you help me clarify it?
[Screenshot attached: 2021-04-30 113406]
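To make sure I'm reading the setup correctly: my understanding is that the auxiliary classifier estimates I(h; y) via the bound I(h; y) ≈ H(y) − H(y|h), where H(y|h) is approximated by the cross-entropy of a probe trained on the features h. Here is a minimal sketch of that estimator, assuming a plain softmax probe trained with gradient descent stands in for the paper's auxiliary classifier (the function name `estimate_mi` and the toy data are mine, not from the paper):

```python
import numpy as np

def estimate_mi(h, y, num_classes, steps=500, lr=0.5):
    """Estimate I(h; y) in nats as H(y) - H(y|h), where H(y|h) is
    approximated by the training cross-entropy of a softmax probe on h."""
    n, d = h.shape
    W = np.zeros((d, num_classes))
    b = np.zeros(num_classes)
    onehot = np.eye(num_classes)[y]

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(z)
        return p / p.sum(axis=1, keepdims=True)

    # train the probe with full-batch gradient descent
    for _ in range(steps):
        p = softmax(h @ W + b)
        grad = (p - onehot) / n          # d(cross-entropy)/d(logits)
        W -= lr * h.T @ grad
        b -= lr * grad.sum(axis=0)

    # H(y|h) estimate: mean cross-entropy of the trained probe
    p = softmax(h @ W + b)
    ce = -np.log(p[np.arange(n), y] + 1e-12).mean()

    # H(y) estimate: entropy of the empirical label distribution
    freq = np.bincount(y, minlength=num_classes) / n
    hy = -(freq[freq > 0] * np.log(freq[freq > 0])).sum()
    return hy - ce

# toy demo: features that are nearly class-separable, so the estimate
# should approach H(y) (about ln 2 for balanced binary labels)
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
h = np.eye(2)[y] + 0.1 * rng.standard_normal((200, 2))
mi = estimate_mi(h, y, num_classes=2)
```

On this reading, applying the same probe to an early feature map and getting the same I(h; y) would only mean the information is *present* there, not that it is linearly extractable by a small classifier — which is the part I'd like you to confirm or correct.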
