```python
if args.restore_ckpt is not None:
    strStep = os.path.split(args.restore_ckpt)[-1].split('_')[0]
    total_steps = int(strStep) if strStep.isdigit() else 0
else:
    total_steps = 0
```
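For context, this snippet parses the resumed step count from the checkpoint filename. A minimal sketch of that behavior, assuming a checkpoint named with a leading step count such as `100000_raft-stereo.pth` (the exact naming convention here is an assumption, not confirmed by the repository):

```python
import os

# Hypothetical checkpoint path; the leading "100000_" is assumed to encode the step count.
restore_ckpt = "checkpoints/100000_raft-stereo.pth"

strStep = os.path.split(restore_ckpt)[-1].split('_')[0]
total_steps = int(strStep) if strStep.isdigit() else 0
print(total_steps)  # 100000; a non-numeric prefix would fall back to 0
```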
Hi,
I found this code in train.py. The OneCycleLR class needs the parameter `last_epoch` to resume the learning-rate schedule, which means the optimizer's state_dict has to be saved, because that is where the `'initial_lr'` key lives. However, I can't find any code that saves the optimizer's state_dict. Why is that?
Do you have to rename the model file when you load it? That would allow this code to be skipped.