Traceback (most recent call last):
  File "scripts/generate.py", line 99, in <module>
    generate_TLDRs(**vars(args))
  File "scripts/generate.py", line 17, in generate_TLDRs
    task='translation'
  File "/research/home/maxlitster/scitldr/repo/fairseq/models/bart/model.py", line 136, in from_pretrained
    **kwargs,
  File "/research/home/maxlitster/scitldr/repo/fairseq/hub_utils.py", line 75, in from_pretrained
    arg_overrides=kwargs,
  File "/research/home/maxlitster/scitldr/repo/fairseq/checkpoint_utils.py", line 339, in load_model_ensemble_and_task
    state = load_checkpoint_to_cpu(filename, arg_overrides)
  File "/research/home/maxlitster/scitldr/repo/fairseq/checkpoint_utils.py", line 263, in load_checkpoint_to_cpu
    state = torch.load(f, map_location=torch.device("cpu"))
  File "/home/maxlitster/miniconda3/envs/tldr/lib/python3.7/site-packages/torch/serialization.py", line 529, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/home/maxlitster/miniconda3/envs/tldr/lib/python3.7/site-packages/torch/serialization.py", line 692, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, '\x0a'.
I renamed the bart.tldr-ao.pt model to checkpoint_best.pt and tried running the generation script as python scripts/generate.py model/ SciTLDR-Data/SciTLDR-A/ctrl ./ --beam 2 --lenpen 0.4 --test_fname test.hypo, as shown in the GitHub instructions, but hit the error above. I'm curious if anyone else has any advice on how to generate text with the pretrained weights. Thanks!
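For what it's worth, one way to sanity-check the downloaded file before handing it to fairseq: a checkpoint written by torch.save() starts with a pickle stream, whose first byte is the pickle protocol opcode 0x80 (for protocol >= 2), whereas an "invalid load key, '\x0a'" means the file starts with a newline, i.e. it looks like plain text (for example, an HTML error page saved under the .pt name by a broken download). This is a minimal sketch of that check; the file names here are hypothetical, not from the repo:

```python
import pickle
import os
import tempfile

def looks_like_torch_checkpoint(path):
    """Heuristic: torch.save() output begins with a pickle stream, so the
    first byte should be the pickle protocol opcode 0x80 (protocol >= 2).
    A leading '\n' (0x0a) or '<' suggests a text/HTML file instead."""
    with open(path, "rb") as f:
        return f.read(1) == b"\x80"

# Demo on two throwaway files: a real pickle vs. text masquerading as a .pt
good = tempfile.NamedTemporaryFile(delete=False, suffix=".pt")
pickle.dump({"model": "weights"}, good, protocol=2)
good.close()

bad = tempfile.NamedTemporaryFile(delete=False, suffix=".pt")
bad.write(b"\n<html>404 Not Found</html>\n")
bad.close()

print(looks_like_torch_checkpoint(good.name))  # True
print(looks_like_torch_checkpoint(bad.name))   # False

os.unlink(good.name)
os.unlink(bad.name)
```

If the check fails on the real checkpoint, re-downloading the weights (or checking that the file isn't a git-lfs pointer or an HTML page) would be my first step.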