[TASK] Add Layernorm Ball Unit #9

@shirohasuki

Description

Deliverables

  • Add a LayerNorm RTL implementation to the prototype library (under the arch path).
  • Open a Pull Request (PR) containing a C test for the LayerNorm operation and a README introducing your design.
  • Report the performance results in this issue.

Task Description

  • LayerNorm (Layer Normalization) is a normalization technique that normalizes the inputs of each training example across the feature dimension, computing the mean and variance from all of the summed inputs to the neurons in a layer (see the formula after this list). It has become a fundamental component in modern deep learning architectures, particularly transformer models such as BERT, where it stabilizes training and improves convergence.
  • Hardware implementations of LayerNorm fall primarily into two categories: lookup-table and approximation methods. Lookup-table approaches store pre-computed reciprocal square root values in memory for the normalization factor, offering predictable latency and good accuracy but requiring significant memory resources. Approximation methods instead approximate the reciprocal square root and other complex computations with arithmetic operations, requiring minimal memory storage (a C sketch follows this list).
  • For this implementation, we will adopt the approximation method to compute the LayerNorm function.
  • Please refer to the previous Pull Request (Completed the development of ReluBall and further improved the operation manual #6) for a detailed implementation reference.
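
For reference, the standard LayerNorm computation over a feature vector $x$ of length $N$ is

$$\mu = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad \sigma^2 = \frac{1}{N}\sum_{i=1}^{N} (x_i - \mu)^2, \qquad y_i = \gamma_i \cdot \frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta_i$$

where $\gamma$ and $\beta$ are learnable scale and shift parameters and $\epsilon$ is a small constant for numerical stability. The $1/\sqrt{\sigma^2 + \epsilon}$ factor is the reciprocal square root that both the lookup-table and approximation methods target.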
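As a rough sketch of the approximation approach (not the required RTL and not the final C test), the snippet below implements a floating-point LayerNorm reference that approximates the reciprocal square root with the well-known bit-level initial guess plus one Newton-Raphson refinement step. The names `rsqrt_approx` and `layernorm_ref`, the parameter layout, and the single-iteration choice are illustrative assumptions, not part of the prototype library.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Approximate 1/sqrt(x): classic bit-trick initial guess plus one
 * Newton-Raphson step. Illustrative only; the RTL may use a different
 * approximation scheme. */
static float rsqrt_approx(float x) {
    uint32_t bits;
    float y;
    memcpy(&bits, &x, sizeof bits);      /* reinterpret float bits */
    bits = 0x5f3759dfu - (bits >> 1);    /* magic initial estimate */
    memcpy(&y, &bits, sizeof y);
    y = y * (1.5f - 0.5f * x * y * y);   /* one Newton-Raphson step */
    return y;
}

/* Reference LayerNorm over a length-n vector:
 * y[i] = gamma[i] * (x[i] - mean) * rsqrt(var + eps) + beta[i] */
static void layernorm_ref(const float *x, const float *gamma,
                          const float *beta, float *y, int n, float eps) {
    float mean = 0.0f, var = 0.0f;
    for (int i = 0; i < n; i++) mean += x[i];
    mean /= (float)n;
    for (int i = 0; i < n; i++) {
        float d = x[i] - mean;
        var += d * d;
    }
    var /= (float)n;
    float inv_std = rsqrt_approx(var + eps);
    for (int i = 0; i < n; i++)
        y[i] = gamma[i] * (x[i] - mean) * inv_std + beta[i];
}

int main(void) {
    float x[4]     = {1.0f, 2.0f, 3.0f, 4.0f};
    float gamma[4] = {1.0f, 1.0f, 1.0f, 1.0f};
    float beta[4]  = {0.0f, 0.0f, 0.0f, 0.0f};
    float y[4];
    layernorm_ref(x, gamma, beta, y, 4, 1e-5f);
    for (int i = 0; i < 4; i++) printf("y[%d] = %f\n", i, y[i]);
    return 0;
}
```

One Newton-Raphson step typically brings the relative error of the bit-trick estimate below roughly 0.2%, which is usually adequate for sanity-checking an approximate hardware unit against a software reference.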
