RuntimeError: Given groups=1, weight of size [48, 37, 11], expected input[8, 691, 18] to have 37 channels, but got 691 channels instead #61

@Abdelsater

Description

I have the following setup, and I changed `d_input` from 38 to 37:

```python
# Training parameters
DATASET_PATH = 'output.npz'
BATCH_SIZE = 8
NUM_WORKERS = 4
LR = 1e-4
EPOCHS = 30

# Model parameters
d_model = 48   # Latent dim
N = 2          # Number of layers
dropout = 0.2  # Dropout rate

d_input = 37   # From dataset
d_output = 8   # From dataset

# Config
sns.set()
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
print(f"Using device {device}")
```

After converting the Oze dataset from CSV to npz, my dataset has the following shapes:
`[('R', (7500, 19), dtype('float32')), ('X', (7500, 8, 672), dtype('float32')), ('Z', (7500, 18, 672), dtype('float32'))]`
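For reference, this is roughly how I check the contents of the converted file. This is just a sketch: `output_small.npz` is a stand-in I create on the fly with the same keys and dtypes as reported above, but only 10 samples instead of 7500.

```python
import numpy as np

# Stand-in for the converted file: same keys and dtypes as reported,
# shrunk to 10 samples to keep the example light.
np.savez("output_small.npz",
         R=np.zeros((10, 19), dtype=np.float32),
         X=np.zeros((10, 8, 672), dtype=np.float32),
         Z=np.zeros((10, 18, 672), dtype=np.float32))

# NpzFile supports the context-manager protocol and lists its keys in .files
with np.load("output_small.npz") as data:
    for key in sorted(data.files):
        print(key, data[key].shape, data[key].dtype)
```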

but when I run the benchmark from the transformer repo, I get the following error during training:

```
[Epoch 1/30]:   0%|          | 0/5500 [00:00<?, ?it/s]
torch.Size([8, 18, 691])
torch.Size([8, 8, 672])

RuntimeError                              Traceback (most recent call last)
Cell In[6], line 16
     14 print(x.shape)
     15 print(y.shape)
---> 16 netout = net(x.to(device))
     18 # Comupte loss
     19 loss = loss_function(y.to(device), netout)

File ~/miniconda3/envs/Test/lib/python3.10/site-packages/torch/nn/modules/module.py:1194, in Module._call_impl(self, *input, **kwargs)
   1190 # If we don't have any hooks, we want to skip the rest of the logic in
   1191 # this function, and just call forward.
   1192 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1193         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1194     return forward_call(*input, **kwargs)
   1195 # Do not call functions when jit is used
   1196 full_backward_hooks, non_full_backward_hooks = [], []

File ~/Implementations/Transformers/OzeChallenge/Original/transformer/src/benchmark.py:121, in ConvGru.forward(self, x)
    119 def forward(self, x):
    120     x = x.transpose(1, 2)
--> 121     x = self.conv1(x)
    122     x = self.activation(x)
    123     x = self.conv2(x)

File ~/miniconda3/envs/Test/lib/python3.10/site-packages/torch/nn/modules/module.py:1194, in Module._call_impl(self, *input, **kwargs)
   1190 # If we don't have any hooks, we want to skip the rest of the logic in
   1191 # this function, and just call forward.
   1192 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1193         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1194     return forward_call(*input, **kwargs)
   1195 # Do not call functions when jit is used
   1196 full_backward_hooks, non_full_backward_hooks = [], []

File ~/miniconda3/envs/Test/lib/python3.10/site-packages/torch/nn/modules/conv.py:313, in Conv1d.forward(self, input)
    312 def forward(self, input: Tensor) -> Tensor:
--> 313     return self._conv_forward(input, self.weight, self.bias)

File ~/miniconda3/envs/Test/lib/python3.10/site-packages/torch/nn/modules/conv.py:309, in Conv1d._conv_forward(self, input, weight, bias)
    305 if self.padding_mode != 'zeros':
    306     return F.conv1d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
    307                     weight, bias, self.stride,
    308                     _single(0), self.dilation, self.groups)
--> 309 return F.conv1d(input, weight, bias, self.stride,
    310                 self.padding, self.dilation, self.groups)

RuntimeError: Given groups=1, weight of size [48, 37, 11], expected input[8, 691, 18] to have 37 channels, but got 691 channels instead
```

I know I have to use rollaxis (or a transpose) to get my inputs into the following shapes:
`x.shape = torch.Size([7500, 672, 37])`, `y.shape = torch.Size([7500, 672, 8])`

Could you please help me with this? I am a bit confused.
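In case it helps to clarify what I mean, here is a sketch of the axis swap with NumPy (dummy arrays with the layouts from my npz file, shrunk to 10 samples; note my `Z` has 18 channels while the model expects 37, which may be a separate problem):

```python
import numpy as np

# Dummy arrays with the reported (samples, channels, time) layout
Z = np.zeros((10, 18, 672), dtype=np.float32)  # inputs
X = np.zeros((10, 8, 672), dtype=np.float32)   # targets

# Move the channel axis last so each sample becomes (time, channels),
# i.e. (samples, 672, features), which is the layout the benchmark
# models expect before their internal transpose(1, 2).
Z_t = np.transpose(Z, (0, 2, 1))  # (10, 672, 18)
X_t = np.transpose(X, (0, 2, 1))  # (10, 672, 8)

print(Z_t.shape, X_t.shape)
```

On torch tensors the equivalent would be `x.transpose(1, 2)` or `x.permute(0, 2, 1)`.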

Thank you in advance
