kaixih/layer_norm

Layer Normalization

This repo implements Layer Normalization in several programming languages and libraries: TensorFlow (Keras), NumPy, C++, Eigen, and CUDA. Each sample includes both the forward and backward passes, and the results are verified against Keras's LayerNormalization layer.

Layer Normalization Forward Pass

Suppose $N$ and $D$ are the batch and feature dimensions, respectively. For each row $x \in \mathbb{R}^{D}$, layer normalization computes

$$\mu = \frac{1}{D}\sum_{i=1}^{D} x_i, \qquad \sigma^2 = \frac{1}{D}\sum_{i=1}^{D} (x_i - \mu)^2,$$

$$\hat{x} = \frac{x - \mu}{\sqrt{\sigma^2 + \epsilon}}, \qquad y = \gamma \odot \hat{x} + \beta,$$

where $\gamma$ and $\beta$ are learnable scale and offset parameters and $\epsilon$ is a small constant for numerical stability.

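The forward pass can be sketched in a few lines of NumPy. This is a minimal illustration under assumed names (`gamma`, `beta`, `eps`), not the repo's actual implementation:

```python
import numpy as np

def layer_norm_fwd(x, gamma, beta, eps=1e-3):
    # Reduce over the feature (last) dimension; keepdims lets broadcasting
    # subtract the per-row statistics from each row of x.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)
    y = gamma * x_hat + beta        # elementwise scale and offset
    return y, (x_hat, var)          # cache values reused by the backward pass
```

With `gamma = 1`, `beta = 0`, and a negligible `eps`, each row of `y` has (near) zero mean and unit variance.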
Layer Normalization Backward Pass


Here we can view $y$ as a composite function $y = \gamma \odot \hat{x}\big(x, \mu(x), \sigma^2(x)\big) + \beta$, where $\mu$, $\sigma^2$, and $\hat{x}$ are defined as above. Applying the chain rule to an upstream gradient $\frac{\partial L}{\partial y}$, we get:

  • Compute the parameter gradients by summing over the batch dimension: $\frac{\partial L}{\partial \gamma} = \sum_{n=1}^{N} \frac{\partial L}{\partial y} \odot \hat{x}$ and $\frac{\partial L}{\partial \beta} = \sum_{n=1}^{N} \frac{\partial L}{\partial y}$.

  • Compute the gradient w.r.t. the normalized input: $\frac{\partial L}{\partial \hat{x}} = \frac{\partial L}{\partial y} \odot \gamma$.

  • Compute the input gradient by propagating through $\mu$ and $\sigma^2$: $\frac{\partial L}{\partial x} = \frac{1}{D\sqrt{\sigma^2 + \epsilon}}\left( D\,\frac{\partial L}{\partial \hat{x}} - \sum_{i=1}^{D} \frac{\partial L}{\partial \hat{x}_i} - \hat{x} \sum_{i=1}^{D} \frac{\partial L}{\partial \hat{x}_i}\,\hat{x}_i \right)$.
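The three steps above can be sketched in NumPy as follows. This is a self-contained illustration (the forward pass is repeated so the backward sketch runs on its own); the function and variable names are assumptions, not the repo's API:

```python
import numpy as np

def layer_norm_fwd(x, gamma, beta, eps=1e-3):
    # Forward pass, repeated here only so this sketch is self-contained.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta, (x_hat, var)

def layer_norm_bwd(dy, cache, gamma, eps=1e-3):
    x_hat, var = cache
    D = x_hat.shape[-1]
    # Step 1: parameter gradients, summed over the batch dimension.
    dgamma = (dy * x_hat).sum(axis=0)
    dbeta = dy.sum(axis=0)
    # Step 2: gradient w.r.t. the normalized input.
    dx_hat = dy * gamma
    # Step 3: chain rule through mu and sigma^2, collapsed into one expression.
    inv_std = 1.0 / np.sqrt(var + eps)
    dx = (inv_std / D) * (
        D * dx_hat
        - dx_hat.sum(axis=-1, keepdims=True)
        - x_hat * (dx_hat * x_hat).sum(axis=-1, keepdims=True)
    )
    return dx, dgamma, dbeta
```

A quick way to sanity-check such a sketch is a finite-difference comparison: perturb one input element, re-run the forward pass, and compare the numerical gradient of the loss against `dx`.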
