register_backward_hook deprecated #10

@suraj-srinivas

Description

Hello, first of all: I love this implementation, and it has been working wonderfully for me! But I noticed that PyTorch recently started throwing a deprecation warning for the hook registered on line 28 of saliency.tensor_extractor: `handle_g = m.register_backward_hook(self._extract_layer_grads)`

The warning is:

```
UserWarning: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.
  warnings.warn("Using a non-full backward hook when the forward contains multiple autograd Nodes "
```

I attempted to fix the issue myself but wasn't able to; I don't understand the difference between `register_backward_hook` and `register_full_backward_hook`.
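For context, a minimal sketch of the difference (this is not the fullgrad code; the module and hook below are illustrative): the old `register_backward_hook` could report `grad_input` for only the last autograd node when a module's forward performs several operations, whereas `register_full_backward_hook` (available since PyTorch 1.8) always reports gradients with respect to the module's actual inputs and outputs. The hook signature `(module, grad_input, grad_output)` is the same for both, so in many cases the fix may be as simple as swapping the call:

```python
import torch
import torch.nn as nn

# Collected gradients w.r.t. each hooked layer's output.
grads = []

def extract_layer_grads(module, grad_input, grad_output):
    # grad_output[0] is the gradient of the loss w.r.t. the module's output;
    # with a full backward hook this is guaranteed to match the documented
    # behavior even when the forward contains multiple autograd nodes.
    grads.append(grad_output[0])

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU())

# The deprecated call would have been:
#   handle = model[0].register_backward_hook(extract_layer_grads)
handle = model[0].register_full_backward_hook(extract_layer_grads)

x = torch.randn(2, 4, requires_grad=True)
model(x).sum().backward()

print(grads[0].shape)  # gradient w.r.t. the Linear layer's output, shape (2, 4)
handle.remove()
```

Whether this swap is sufficient for fullgrad likely depends on whether its hooks rely on `grad_input` (whose semantics changed) or only on `grad_output`.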

Would this be an issue you see as worth solving? It currently works for me, so there is no immediate demand, and I can always pin my code to a specific PyTorch version, but I love fullgrad so much that I would hate for it to be unable to keep up with PyTorch.

Originally posted by @nephina in #7 (comment)
