Autograd.hs

Automatic differentiation engine in Haskell.

After cloning the project, to run the tests simply execute

```
cabal build
cabal run
```

To implement your own custom operators, you need to create an instance of the Operator typeclass. However, if you're just working with scalar-valued functions ($f : \mathbb{R}^d \to \mathbb{R}$), extending ScalarOp should be enough.
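As an illustration of the general idea, here is a minimal sketch of what a scalar operator could look like. The names (`ScalarOpLike`, `Cube`, `forward`, `backward`) are made up for this example and are not the library's actual API; the point is only that an operator pairs a forward function with its derivative so the engine can apply the chain rule.

```haskell
module Main where

-- Hypothetical sketch: the real Operator / ScalarOp typeclasses in
-- Autograd.hs may differ. An operator bundles a forward pass with
-- its derivative, which backpropagation needs for the chain rule.
class ScalarOpLike op where
  forward  :: op -> Double -> Double
  backward :: op -> Double -> Double  -- d(forward)/dx at x

-- Example custom operator: f(x) = x^3, f'(x) = 3x^2
data Cube = Cube

instance ScalarOpLike Cube where
  forward  _ x = x ** 3
  backward _ x = 3 * x ** 2

main :: IO ()
main = do
  print (forward Cube 2.0)   -- 8.0
  print (backward Cube 2.0)  -- 12.0
```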

Since Haskell has no notion of a "reference to an object", my solution for accumulating gradients when a node feeds into more than one parent in the computational graph is to traverse the graph and fill a map of $\text{Id} \rightarrow \text{Gradient}$. One drawback of this method is that each node identifier must be unique. That is, we do not perform a topological sort as in Micrograd.
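The accumulation step can be sketched with a plain `Data.Map` (this is an illustration of the idea, not the library's actual code; the node name `"x"` and the `accumulate` helper are invented for the example):

```haskell
module Main where

import qualified Data.Map.Strict as M

type Id    = String
type Grads = M.Map Id Double

-- Record a gradient contribution for a node. If the node already
-- appears in the map (because it feeds several parents), the new
-- contribution is summed with the existing one.
accumulate :: Id -> Double -> Grads -> Grads
accumulate = M.insertWith (+)

main :: IO ()
main = do
  -- Node "x" receives gradients from two parents; they must add up.
  let g = accumulate "x" 1.5 (accumulate "x" 2.0 M.empty)
  print (M.lookup "x" g)  -- Just 3.5
```

Because the map is keyed by identifier, two distinct nodes sharing an identifier would wrongly merge their gradients, which is exactly why uniqueness of identifiers is required.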

I started this project mostly to learn the basics of programming in Haskell. This library is a toy project and has no pretensions of performance or correctness.
