
Hyperparameter Settings for KD on Imagenet #53

@Calmepro777

Description

To reproduce the baseline result on my machine (KD from ResNet-34 to ResNet-18), I would like to know the hyperparameter settings used for knowledge distillation on ImageNet, in particular the weights for the cross-entropy and KL-divergence losses, the temperature, and the batch size.
Thanks.
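
For reference, this is a minimal sketch of the Hinton-style KD objective I have in mind; `alpha`, `T`, and the function name are placeholders for illustration, not values or code from this repo:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, alpha=0.9, T=4.0):
    """Hinton-style KD objective: weighted sum of soft (KLDiv) and hard (CE) terms.

    alpha and T are the hyperparameters in question; the defaults here are
    placeholders, not the settings used in this repo.
    """
    # Soft targets: KL divergence between temperature-scaled distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```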
