Automatic differentiation engine in Haskell.
After cloning the project, to run the tests simply execute:

```
cabal build
cabal run
```
To implement your own custom operators, you need to create an instance of the Operator typeclass. However, if you're just working with scalar-valued functions, ScalarOp should be enough.
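For illustration, here is a minimal sketch of what such an instance might look like. The forward/backward split, the method names, and the Sine example below are assumptions made for this sketch, not the library's actual interface; check the Operator class in the source for the real signatures.

```haskell
-- Hypothetical shape of the Operator class (illustrative only):
-- `forward` computes the output from the inputs, `backward` maps an
-- upstream gradient to one gradient per input.
class Operator op where
  forward  :: op -> [Double] -> Double
  backward :: op -> [Double] -> Double -> [Double]

-- Example: a sine operator with a single input.
data Sine = Sine

instance Operator Sine where
  forward  _ [x]          = sin x
  forward  _ _            = error "Sine expects exactly one input"
  -- Chain rule: d/dx sin x = cos x, scaled by the upstream gradient.
  backward _ [x] upstream = [cos x * upstream]
  backward _ _   _        = error "Sine expects exactly one input"
```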
References:
- https://pytorch.org/tutorials/beginner/introyt/autogradyt_tutorial.html
- https://github.com/karpathy/micrograd
Since in Haskell there is no such thing as a "reference to an object", my solution to accumulate the gradients when a node is the child of more than one parent in the computational graph is to traverse the graph and fill a map of gradients keyed by node.
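As a sketch of the idea (under assumed types, not the library's actual ones), the traversal below identifies nodes by an Int id and uses Data.Map's insertWith (+) to sum contributions whenever a node is reached through more than one parent:

```haskell
import qualified Data.Map.Strict as Map
import Data.List (foldl')

-- Hypothetical representation: each node stores its input nodes together
-- with the local derivative of its output with respect to each input.
type NodeId = Int

newtype Node = Node { inputs :: [(NodeId, Double)] }

-- Walk the graph from the output node, multiplying local derivatives
-- along each path (chain rule). Map.insertWith (+) is what accumulates
-- the gradient when a node is the child of more than one parent.
backprop :: Map.Map NodeId Node -> NodeId -> Map.Map NodeId Double
backprop graph output = go (Map.singleton output 1.0) [(output, 1.0)]
  where
    go grads [] = grads
    go grads ((nid, g) : rest) =
      case Map.lookup nid graph of
        Nothing   -> go grads rest  -- leaf (input) node: nothing to push
        Just node ->
          let contribs = [ (i, g * d) | (i, d) <- inputs node ]
              grads'   = foldl' (\m (i, c) -> Map.insertWith (+) i c m)
                                grads contribs
          in go grads' (contribs ++ rest)
```

This path-by-path traversal is correct for a DAG (the total gradient is the sum over all paths of the products of local derivatives), but it can revisit shared subgraphs; a production engine would instead visit each node once, in reverse topological order.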
I started this project mostly to learn the basics of programming in Haskell. This library is a toy project and has no pretensions of performance or correctness.