What is your specific environment? #22
Description
I failed to run the distillation code, possibly because of a timm or numpy version mismatch, or some other environment issue.
Could you share the exact versions you used, e.g. for timm and numpy?
Below are the errors I get when distilling ResNet-50 and ConvNeXt-T.
It seems the arguments cannot be passed to the student model.
Please tell me the right way to run the distillation code.
Thank you!
Number of the class = 1000
Sampler_train = <torch.utils.data.distributed.DistributedSampler object at 0x7acd9c070610>
Mixup is activated!
model SLaK_tiny Decom True
Traceback (most recent call last):
  File "main_KD.py", line 764, in <module>
    main(args)
  File "main_KD.py", line 453, in main
    model_convnext = resnet50(args=args)
TypeError: resnet50() got an unexpected keyword argument 'args'
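For reference, the first TypeError suggests that `main_KD.py` forwards an `args` keyword to a `resnet50()` entrypoint that does not accept it. A minimal sketch of the kwargs-filtering pattern that would avoid this, using a stand-in function rather than the real model constructor (the helper name `call_with_supported_kwargs` is hypothetical, not from the repo):

```python
import inspect

def call_with_supported_kwargs(fn, **kwargs):
    # Keep only the keyword arguments that fn actually accepts,
    # so extras such as 'args' do not raise a TypeError.
    params = inspect.signature(fn).parameters
    accepts_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )
    if accepts_var_kw:
        return fn(**kwargs)  # fn takes **kwargs, pass everything through
    supported = {k: v for k, v in kwargs.items() if k in params}
    return fn(**supported)

# Stand-in for a resnet50 constructor that takes no 'args' keyword.
def resnet50(pretrained=False):
    return f"resnet50(pretrained={pretrained})"

# The unsupported 'args' keyword is silently dropped instead of crashing.
print(call_with_supported_kwargs(resnet50, pretrained=False, args="extra"))
```

This is only an illustration of one possible workaround; the proper fix may simply be using the model-building path the authors intended.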
Number of the class = 1000
Sampler_train = <torch.utils.data.distributed.DistributedSampler object at 0x76082d66b790>
Mixup is activated!
model SLaK_tiny Decom True
Traceback (most recent call last):
  File "main_KD.py", line 764, in <module>
    main(args)
  File "main_KD.py", line 498, in main
    model_convnext = create_model(
  File "/home/user/anaconda3/envs/slak/lib/python3.8/site-packages/timm/models/factory.py", line 74, in create_model
    model = create_fn(pretrained=pretrained, **kwargs)
  File "/home/user/road/SLaK/models/SLaK.py", line 267, in SLaK_tiny
    model = SLaK(depths=[3, 3, 9, 3], dims=[96, 192, 384, 768], **kwargs)
TypeError: __init__() got an unexpected keyword argument 'args'
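Here the second TypeError appears to come from timm's `create_model` forwarding extra kwargs (including `args`) through the `SLaK_tiny` entrypoint into `SLaK.__init__`, which does not accept them. A minimal sketch of one way this could be handled, with simplified stand-ins for the real classes (the `kwargs.pop` workaround is an assumption on my side, not the authors' code):

```python
# Simplified stand-in for models/SLaK.py's SLaK class; its __init__
# does not accept an 'args' keyword, matching the traceback.
class SLaK:
    def __init__(self, depths, dims):
        self.depths = depths
        self.dims = dims

def SLaK_tiny(pretrained=False, **kwargs):
    # Dropping unsupported keys here would prevent the TypeError when a
    # factory like timm's create_model forwards extra kwargs such as 'args'.
    kwargs.pop("args", None)
    return SLaK(depths=[3, 3, 9, 3], dims=[96, 192, 384, 768], **kwargs)

model = SLaK_tiny(args=None)  # no longer raises TypeError
print(model.dims)
```

Whether the intended fix is filtering kwargs like this, or passing `args` differently on the command line, is exactly what this issue is asking the authors to clarify.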