
PyTorch custom activation function

Implementation of, and guide to, building a custom PyTorch activation function with autodifferentiation support and C++ and CUDA bindings.

For the full process, with references on how to write your own functions and classes in PyTorch and how to import the one in this repository, see the Exploratory notebook.
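As a rough illustration of the general technique the notebook covers, a custom activation with a hand-written backward pass can be defined by subclassing `torch.autograd.Function`. The sketch below uses a plain ReLU as a stand-in (it is not the activation from this repository) and stays in pure Python, without the C++/CUDA bindings:

```python
import torch

class MyActivation(torch.autograd.Function):
    """Hypothetical example: ReLU as a custom autograd Function.
    Stands in for the repository's activation to show the pattern."""

    @staticmethod
    def forward(ctx, x):
        # Save the input; the backward pass needs it to build the mask.
        ctx.save_for_backward(x)
        return x.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient is 1 where the input was positive, 0 elsewhere.
        (x,) = ctx.saved_tensors
        return grad_output * (x > 0).to(grad_output.dtype)

x = torch.tensor([-1.0, 2.0], requires_grad=True)
y = MyActivation.apply(x).sum()
y.backward()
print(x.grad)  # tensor([0., 1.])
```

A custom `backward` like this can be checked numerically against finite differences with `torch.autograd.gradcheck` before moving the forward/backward kernels to C++ or CUDA.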

Result: comparison plot (see the Comparison image in the repository).

Credits

The original idea for the activation function is from Javiabellan. For further discussion, see this thread on Twitter.