DanielGabai/attention
attention

Building out an attention block in RTL.

Starting assumptions:

- batch size: 1
- sequence length: 4
- embedding dimension: 8
- number system: int8

Going for simplicity first.
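As a starting point for verifying the RTL, a software golden model with the shapes above (batch 1, sequence length 4, dimension 8, int8 inputs) is handy. This is a minimal sketch assuming standard scaled dot-product attention; the function and variable names (`attention_golden`, `q`, `k`, `v`) are illustrative, not from the repo, and the softmax here is done in floating point rather than the integer approximation real int8 hardware would use.

```python
import math

SEQ_LEN, DIM = 4, 8  # sequence length and embedding dimension from the README

def matmul(a, b):
    # plain nested-loop matrix multiply: (n x k) @ (k x m)
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def softmax(row):
    mx = max(row)                      # subtract max for numerical stability
    exps = [math.exp(x - mx) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention_golden(q, k, v):
    # scores = Q @ K^T; int8 * int8 products accumulated over DIM=8
    # terms fit comfortably in an int32 accumulator in hardware
    scores = matmul(q, transpose(k))
    scale = 1.0 / math.sqrt(DIM)
    probs = [softmax([s * scale for s in row]) for row in scores]
    return matmul(probs, v)            # weighted sum of value rows

# toy int8 inputs in [-128, 127] (deterministic, so RTL tests can reuse them)
q = [[(i * 7 + j * 3) % 256 - 128 for j in range(DIM)] for i in range(SEQ_LEN)]
k = [[(i * 5 + j * 11) % 256 - 128 for j in range(DIM)] for i in range(SEQ_LEN)]
v = [[(i + j * 13) % 256 - 128 for j in range(DIM)] for i in range(SEQ_LEN)]

out = attention_golden(q, k, v)
```

Because each output row is a convex combination of the int8 value rows, every element of `out` stays within [-128, 127], which gives a quick sanity check when comparing against the hardware's quantized result.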
