
BatchInstanceNorm1d #1

@adrienchaton


Hello,

I have been working on a project that uses a WAE with BatchNorm in the encoder and InstanceNorm in the decoder, so that the encoder keeps its representation/classification power while the decoder retains the capacity of AdaIN for style conditioning.
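For context, here is a rough sketch of the kind of AdaIN-style conditioning I mean in the decoder (not my exact module; the style dimension and layer sizes are placeholders):

import torch.nn as nn

class AdaIN2d(nn.Module):
    # Instance-normalize the decoder features, then rescale/shift them with
    # an affine transform predicted from a style/conditioning code.
    def __init__(self, num_features, style_dim):
        super().__init__()
        self.norm = nn.InstanceNorm2d(num_features, affine=False)
        self.affine = nn.Linear(style_dim, 2 * num_features)

    def forward(self, x, style):
        # x: (batch, num_features, H, W), style: (batch, style_dim)
        gamma, beta = self.affine(style).chunk(2, dim=1)
        return gamma[..., None, None] * self.norm(x) + beta[..., None, None]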

I chose this setup empirically over the course of my experiments, and then found your paper, which generalizes the idea of mixing both normalizations.

I would like to try it and then reference your work. Separating the normalizations was not a major feature of my research, just an implementation detail that I could replace with yours.

I replaced every 2d normalization in my networks with BatchInstanceNorm2d; no problem, it processes the 4D tensors (batch, chan, H, W) without error.
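As an illustration, this is roughly how the replacement looks in one of my conv blocks (the channel sizes are placeholders, and I am assuming BatchInstanceNorm2d is imported from the repository's batchinstancenorm module):

import torch.nn as nn
from batchinstancenorm import BatchInstanceNorm2d  # assumed import path

block = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=3, padding=1),
    BatchInstanceNorm2d(128),   # was nn.BatchNorm2d(128) or nn.InstanceNorm2d(128)
    nn.ReLU(inplace=True),
)
# a forward pass on a 4D tensor (batch, chan, H, W) runs fine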

However, I also had to change every 1d normalization to BatchInstanceNorm1d, which operates on the 2D tensors output by linear layers, and there I get an error.

I double-checked the data: given a 2D tensor (90, 1024) output by a linear layer and followed by BatchInstanceNorm1d(1024), I get the following:

/fast-2/adrien/virtualenv_p3/lib/python3.5/site-packages/torch/nn/functional.py in batch_norm(input, running_mean, running_var, weight, bias, training, momentum, eps)
1617 size_prods *= size[i + 2]
1618 if size_prods == 1:
-> 1619 raise ValueError('Expected more than 1 value per channel when training, got input size {}'.format(size))
1620
1621 return torch.batch_norm(

ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 92160])
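For reference, a minimal snippet that reproduces it on my side (again assuming the import path; if I read the traceback right, the instance-norm branch seems to flatten the (90, 1024) input into a (1, 92160) tensor before calling batch_norm, which leaves only a single value per channel):

import torch
from batchinstancenorm import BatchInstanceNorm1d  # assumed import path

norm = BatchInstanceNorm1d(1024)
norm.train()
x = torch.randn(90, 1024)  # 2D output of a linear layer
y = norm(x)                # ValueError: Expected more than 1 value per channel when training,
                           #             got input size torch.Size([1, 92160])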

Could you help fix this, please?
This was run on Python 3.5 and PyTorch 1.0.0.
