hvini/FansGoBrrr

🧠 ANN from Scratch in C (CPU → GPU)

🚧 Phase 1: CPU Implementation

  • 🔧 Build System – Set up Makefile or CMake
  • 🧱 Data Structures – Define Matrix, Layer, Network
  • ➗ Matrix Operations – Add, multiply, transpose
  • 🔁 Activation Functions – Sigmoid, ReLU, Softmax
  • 🔄 Forward Propagation – Weighted sums + activations
  • 📉 Loss Function – MSE or Cross-Entropy
  • 🧮 Backpropagation – Calculate gradients
  • 🏋️ Weight Updates – Apply SGD
  • 🌀 Training Loop – Run for N epochs
  • ✅ Test Cases (CPU)
    • Matrix operations match expected results
    • Activation outputs correct for sample inputs
    • Forward propagation output sanity check
    • Backpropagation gradients validated numerically
    • Training on XOR dataset: loss decreases, correct outputs

🚀 Phase 2: GPU Acceleration (CUDA)

  • ⚙️ CUDA Setup – Configure build and test kernel
  • 🧊 Matrix Ops (GPU) – Port add, multiply, etc.
  • 🌐 Activations (GPU) – Parallel element-wise ops
  • 🧬 Forward Prop (GPU) – Matrix ops + activations
  • 🔙 Backprop (GPU) – Gradient calculation on GPU
  • 🏁 Training Loop (GPU) – Fully GPU-accelerated
  • 🔍 Test Cases (GPU)
    • GPU matrix ops match CPU results within a small tolerance
    • Activation functions match CPU outputs
    • Forward propagation matches CPU outputs
    • Backpropagation gradients match CPU calculations
    • Training on XOR dataset converges with GPU

About

Artificial Neural Network implementation from scratch with GPU acceleration
