Description
Hi author, thanks for your great code. When I run the example code from your README, an error occurs. Could you check it again? A sketch of what I ran is below.
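This is a minimal reproduction sketch based on the call shown in the traceback; the `model_version` value and the rest of the README example are assumptions on my side and may differ from what the README actually uses.

```python
import torch

# Assumed placeholder; substitute the version string from the README example.
model_version = "radio_v2.5-g"

# Same torch.hub.load call as in the traceback below (main.py, line 63).
model = torch.hub.load(
    'NVlabs/RADIO', 'radio_model',
    version=model_version,
    progress=True,
    skip_validation=True,
    adaptor_names=['siglip2-g'],
)
```

The failure happens while the 'siglip2-g' adaptor is being constructed, before the model object is returned.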
Loading checkpoint shards: 100%|██████████████████| 2/2 [00:01<00:00, 1.63it/s]
Traceback (most recent call last):
File "/home/vsw/Desktop/RADIO/main.py", line 63, in
model = torch.hub.load('NVlabs/RADIO', 'radio_model', version=model_version, progress=True, skip_validation=True, adaptor_names=['siglip2-g'])
File "/home/vsw/miniconda3/envs/radio/lib/python3.10/site-packages/torch/hub.py", line 566, in load
model = _load_local(repo_or_dir, model, *args, **kwargs)
File "/home/vsw/miniconda3/envs/radio/lib/python3.10/site-packages/torch/hub.py", line 595, in _load_local
model = entry(*args, **kwargs)
File "/home/vsw/.cache/torch/hub/NVlabs_RADIO_main/hubconf.py", line 143, in radio_model
adaptor = adaptor_registry.create_adaptor(ttype, chk["args"], tconf, adaptor_state)
File "/home/vsw/.cache/torch/hub/NVlabs_RADIO_main/radio/adaptor_registry.py", line 34, in create_adaptor
return self._registry[name](main_config, adaptor_config, state)
File "/home/vsw/.cache/torch/hub/NVlabs_RADIO_main/radio/siglip2_adaptor.py", line 96, in create_siglip2_adaptor
return SigLIP2Adaptor(main_config, adaptor_config, state)
File "/home/vsw/.cache/torch/hub/NVlabs_RADIO_main/radio/siglip2_adaptor.py", line 37, in init
model = AutoModel.from_pretrained(version, trust_remote_code=True)
File "/home/vsw/miniconda3/envs/radio/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
return model_class.from_pretrained(
File "/home/vsw/miniconda3/envs/radio/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3677, in from_pretrained
) = cls._load_pretrained_model(
File "/home/vsw/miniconda3/envs/radio/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4155, in _load_pretrained_model
raise RuntimeError(f"Error(s) in loading state_dict for {model.class.name}:\n\t{error_msg}")
RuntimeError: Error(s) in loading state_dict for SiglipModel:
size mismatch for text_model.head.weight: copying a param with shape torch.Size([1536, 1152]) from checkpoint, the shape in current model is torch.Size([1152, 1152]).
size mismatch for text_model.head.bias: copying a param with shape torch.Size([1536]) from checkpoint, the shape in current model is torch.Size([1152]).
You may consider adding ignore_mismatched_sizes=True in the model from_pretrained method.