Fix a small issue when updating the memory#4

Open
LiUzHiAn wants to merge 3 commits into cvlab-yonsei:master from LiUzHiAn:fix_mem_update

Conversation

@LiUzHiAn LiUzHiAn commented Aug 16, 2020

@hyunjp Hi,

Thanks for your nice work! As I dived into the source code, I noticed a small issue in the memory update, relative to the operations described in your paper. To be concrete, it concerns formula (5) in the Update part of section 3.1.1, and the corresponding code is >>here<<.

I noticed that score[:,i] represents the similarities between the i-th memory item and all the queries. But what we really want is the subset of queries whose closest memory item is exactly the i-th one. So score[:,i] should be changed to score[idx,i]. You can refer to
$\max_{k^{\prime} \in U_{t}^{m}} v_{t}^{k^{\prime}, n}$ in formula (5).
I also cleaned up the memory-update code. In short, here is what I've done:

  • fix the small bug pointed out above;
  • clean up the code: specifically, discard the unused parameter update_indices and the unneeded parameter train in get_update_query(), and simplify the implementation;
  • modify Evaluate.py accordingly.
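A minimal sketch of the proposed fix, under assumed tensor shapes (this is a hypothetical illustration, not the repository's exact code):

```python
import torch

def get_update_query(mem, score, query):
    # Hypothetical sketch of the fix discussed above; shapes are assumptions:
    #   mem:   (M, C) memory items
    #   score: (N, M) query-to-memory similarities (softmax over queries, dim=0)
    #   query: (N, C) query features
    m = mem.size(0)
    idx = torch.argmax(score, dim=1)  # nearest memory item for each query
    update = torch.zeros_like(mem)
    for i in range(m):
        mask = (idx == i)  # only queries whose closest memory item is i
        if mask.any():
            # score[mask, i] plays the role of score[idx, i] in the discussion:
            # only the matched queries' similarities are used as weights
            update[i] = torch.sum(score[mask, i].unsqueeze(1) * query[mask], dim=0)
    return update
```

With score[:,i] instead of score[mask, i], every query would contribute to every memory slot, which is what the comment above argues against.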

If I'm wrong, please correct me. Thanks in advance. 😸

bo-miao commented Aug 27, 2020

I have reviewed the official code and thought about this issue.

I think it is hard to say that score[:,i] should simply be replaced with score[idx,i], because score is normalized (softmax, dim=0) over all patches, not just the chosen idx patches. In that case, normalizing with score[idx, i] / torch.max(score[idx, i]) sounds more reasonable.
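The renormalization suggested here can be sketched as a small helper (hypothetical function name and shapes; score, idx, and i follow the shapes assumed above):

```python
import torch

def matched_weights(score, idx, i):
    # Hypothetical helper: score (N, M) was softmax-normalized over all N
    # queries (dim=0), so the entries for the matched subset no longer sum
    # to one. Renormalizing by the subset maximum, as suggested above,
    # keeps the largest weight at 1 regardless of how many queries
    # matched memory item i.
    w = score[idx == i, i]
    return w / torch.max(w)
```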
