Why is there a small coefficient for the standard deviation in the reparameterization trick? #3

@Isuxiz

CVAR/model/warm.py

Lines 362 to 363 in e882d50

z = mean + 1e-4 * torch.exp(log_v * 0.5) * torch.randn(mean.size()).to(self.device)
z_p = mean_p + 1e-4 * torch.exp(log_v_p * 0.5) * torch.randn(mean_p.size()).to(self.device)

  1. What is the meaning of the 1e-4 coefficient here?
  2. (Critical question) Doesn't scaling the standard deviation down this much effectively collapse the latent distribution, so that the sample degenerates to a point estimate at the mean?

I removed the 1e-4 coefficient (i.e. changed it to 1) and reran the experiment, and found that performance dropped significantly.
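To make the question concrete, here is a minimal NumPy sketch of the two variants being compared (NumPy is used only for illustration; the original lines use torch with the same arithmetic, and the `mean`/`log_v` values below are made-up stand-ins, not taken from the repo):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the encoder outputs in warm.py
mean = rng.normal(size=10_000)          # assumed latent means
log_v = np.full_like(mean, -2.0)        # assumed log-variances
std = np.exp(0.5 * log_v)               # std = exp(log_v * 0.5), as in the code

eps = rng.standard_normal(mean.shape)   # eps ~ N(0, I)

# As written in warm.py: noise additionally scaled by 1e-4
z_scaled = mean + 1e-4 * std * eps

# Standard reparameterization trick: no extra coefficient
z_standard = mean + std * eps

# With the 1e-4 factor, the sample barely deviates from the mean,
# while the standard version deviates on the order of std.
print("max |z_scaled - mean|  :", np.max(np.abs(z_scaled - mean)))
print("max |z_standard - mean|:", np.max(np.abs(z_standard - mean)))
```

Under these assumed values, the scaled sample differs from the mean by only about `1e-4 * std`, which is what motivates the second question: the sampling step looks nearly deterministic.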
