Hi guys,
I have observed that some custom operators are registered only in the "derivatives.yaml" file, such as "npu_binary_cross_entropy_with_logits_backward"; some operators are registered only in the "op_plugin_functions.yaml" file, such as "npu_prompt_flash_attention"; and some operators are registered in both, such as "npu_multi_head_attention". How should I determine which file(s) a custom operator needs to be registered in?
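For context, the kind of entry I am referring to looks roughly like this. This is only a sketch: the `custom:` section name follows what I believe is the file's convention, and the signature is heavily abbreviated for illustration, not copied from the repo:

```yaml
# op_plugin_functions.yaml (sketch): a schema-only registration with no
# matching derivative entry. Signature abbreviated; section name assumed.
custom:
  - func: npu_prompt_flash_attention(Tensor query, Tensor key, Tensor value) -> Tensor
```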
From what I understand, an operator is registered in the "derivatives.yaml" file when its backward pass differs from the official PyTorch implementation, and in that case it should be registered in both files. Did I misunderstand something?
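To make the question concrete, this is what I would expect dual registration to look like, using a hypothetical operator `npu_my_custom_op`. The binding syntax follows PyTorch's `tools/autograd/derivatives.yaml` convention; the operator, its backward function, and the signatures are invented for illustration:

```yaml
# op_plugin_functions.yaml (sketch): declare both the forward and the
# backward schemas. All names here are hypothetical.
custom:
  - func: npu_my_custom_op(Tensor self, Tensor other) -> Tensor
  - func: npu_my_custom_op_backward(Tensor grad, Tensor self, Tensor other) -> (Tensor, Tensor)

# derivatives.yaml (sketch): bind the forward op to its custom backward,
# returning gradients for self and other as a tuple.
- name: npu_my_custom_op(Tensor self, Tensor other) -> Tensor
  self, other: npu_my_custom_op_backward(grad, self, other)
```

If this picture is right, then an entry that appears only in one of the two files (like the examples above) is what confuses me.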