DyNA is a framework for dynamic, data-driven nonlinear signal propagation, inspired by biological neural networks.

Important note

This repository contains the work‑in‑progress research and implementation of the biologically inspired signal propagation framework DyNA (Dynamic Neural Architecture).


Philosophy

DyNA explores alternatives to conventional sequence modelling, guided by a single principle:

Any linear or non‑linear transformation should be derived from the data semantics itself.

The library focuses on dynamic weight generation, stable signal compression and gradient control techniques.


Project structure

docs/             # Documentation for the DyNA project
dyna/
├── functional/   # Stand‑alone differentiable functions
├── lib/          # Low‑level building blocks
└── module/       # High‑level neural network layers

Implemented components

dyna.functional

  • backward_gradient_normalization – normalizes gradients during the backward pass to prevent exploding or vanishing updates. Implements the method described in the paper “Backward Gradient Normalization in Deep Neural Networks” by Alejandro Cabana and Luis F. Lago‑Fernández (arXiv:2106.09475). A generic sketch of the idea follows this list.
  • log_proportional_error – computes a logarithmic error term with custom gradients, useful near zero.
  • noisein / noiseover – inject element-wise or global noise for regularization.
  • siglog and siglog_parametric – signed logarithmic mappings with custom gradient shaping.
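The snippet below is a minimal PyTorch sketch of two of these ideas, written from the descriptions above rather than from the library's code: a custom autograd function that rescales gradients on the backward pass, and a plain signed-logarithmic mapping (the actual siglog additionally shapes its gradient). The names BackwardGradNorm and siglog_sketch are illustrative only.

```python
import torch


class BackwardGradNorm(torch.autograd.Function):
    """Identity on the forward pass; rescales the gradient on the backward pass."""

    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Rescale the incoming gradient to unit L2 norm so that earlier layers
        # receive updates of a controlled magnitude.
        return grad_output / (grad_output.norm(p=2) + 1e-8)


def siglog_sketch(x: torch.Tensor) -> torch.Tensor:
    # Signed logarithmic compression: keep the sign, compress the magnitude.
    return torch.sign(x) * torch.log1p(torch.abs(x))


x = torch.randn(4, 8, requires_grad=True)
siglog_sketch(BackwardGradNorm.apply(x)).sum().backward()
print(x.grad.norm())  # ~1.0: the gradient was normalized on the way back
```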

dyna.lib

  • TensorComposerDelta – generates banks of 2‑D weights via rank‑weighted modulation and diversity penalties (a generic sketch of the rank‑weighted idea follows this list).
  • TensorComposerMobius – constructs spatial filters using Möbius‑like complex transformations and learned projections.
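As a rough illustration of the rank-weighted idea only (not the actual TensorComposerDelta, whose construction lives in the source), a bank of 2-D weights can be assembled as a context-dependent mixture of learned rank-one factors. All names and shapes below are hypothetical, and the diversity penalty is omitted.

```python
import torch
import torch.nn as nn


class RankWeightedBank(nn.Module):
    """Illustrative only: n_weights matrices of shape (rows, cols), each built
    as a rank-weighted sum of learned rank-one factors, mixed per context."""

    def __init__(self, n_weights: int, rows: int, cols: int, rank: int, ctx_dim: int):
        super().__init__()
        self.u = nn.Parameter(torch.randn(n_weights, rank, rows) * 0.02)
        self.v = nn.Parameter(torch.randn(n_weights, rank, cols) * 0.02)
        self.rank_gate = nn.Linear(ctx_dim, n_weights * rank)

    def forward(self, ctx: torch.Tensor) -> torch.Tensor:
        b, n, r = ctx.shape[0], self.u.shape[0], self.u.shape[1]
        # Per-rank mixing coefficients derived from the context vector.
        alpha = self.rank_gate(ctx).view(b, n, r)
        # Each weight matrix is a weighted sum of its rank-one components u_r v_r^T.
        return torch.einsum("bnr,nri,nrj->bnij", alpha, self.u, self.v)
```

A diversity penalty (for example, penalizing pairwise similarity between the generated matrices) would be added on top of such a generator.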

dyna.module

  • DynamicConv2DDelta – convolutional layer that draws its kernels from TensorComposerDelta conditioned on context vectors (a generic sketch of the mechanism follows this list).
  • DynamicConv2DMobius – convolutional layer based on TensorComposerMobius with optional dynamic bias, padding and offsets.
  • SignalStabilizationCompressor – non‑linear block combining gating, logarithmic compression and inverse RMS scaling for stable activations.
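The following is a generic sketch of a context-conditioned convolution, assuming a plain linear kernel generator in place of TensorComposerDelta. It illustrates the common mechanism (per-sample kernels applied via a grouped convolution), not the actual DynamicConv2DDelta implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContextConv2d(nn.Module):
    """Illustrative only: each sample gets its own conv kernels, generated
    from a per-sample context vector (here by a plain linear layer)."""

    def __init__(self, in_ch: int, out_ch: int, k: int, ctx_dim: int):
        super().__init__()
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        self.kernel_gen = nn.Linear(ctx_dim, out_ch * in_ch * k * k)

    def forward(self, x: torch.Tensor, ctx: torch.Tensor) -> torch.Tensor:
        b = x.shape[0]
        # One kernel bank per sample: (b * out_ch, in_ch, k, k).
        w = self.kernel_gen(ctx).view(b * self.out_ch, self.in_ch, self.k, self.k)
        # Fold the batch into the channel dimension and use groups=b so that
        # every sample is convolved with its own generated kernels.
        x = x.reshape(1, b * self.in_ch, *x.shape[2:])
        y = F.conv2d(x, w, padding=self.k // 2, groups=b)
        return y.view(b, self.out_ch, *y.shape[2:])


# Example: 8 RGB images, one 16-dim context vector per image.
layer = ContextConv2d(in_ch=3, out_ch=8, k=3, ctx_dim=16)
out = layer(torch.randn(8, 3, 32, 32), torch.randn(8, 16))  # -> (8, 8, 32, 32)
```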

Documentation

Holographic Projection Memory (HPM)

All documentation and theoretical content related to Holographic Projection Memory (HPM) has been moved to the dedicated repository: CogniRay

Please refer to that project for up-to-date material, licensing, and development.


TensorComposerMobius (TCM)

TensorComposerMobius (TCM) is a dynamic composition module that generates high-dimensional tensor structures through complex-phase transformations over learnable subspaces. By leveraging Möbius-like modulation, spectral routing, and adaptive rank mixing, TCM supports self-modifying representations and nonlinear composition, enabling compositional generalization, runtime reconfiguration, and modular control in DyNA-style cognitive architectures. A purely illustrative sketch of the Möbius-style modulation follows the status table below.

           Theory   Implementation   Verified   Test code
Status     TODO     COMPLETED        YES        YES
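The exact construction is defined in the source. Purely to illustrate the "Möbius-like modulation" wording, the element-wise map below applies a complex fractional-linear transform (a·z + b) / (c·z + d) to a bank of complex-valued basis filters, with the coefficients standing in for learned, context-derived parameters. All names and shapes are hypothetical.

```python
import torch


def mobius_modulate(z: torch.Tensor, a, b, c, d) -> torch.Tensor:
    """Element-wise Mobius-like map (a*z + b) / (c*z + d) over complex features."""
    return (a * z + b) / (c * z + d + 1e-8)


# A hypothetical bank of 8 complex-valued 3x5x5 basis filters ...
basis = torch.randn(8, 3, 5, 5, dtype=torch.complex64)
# ... modulated by coefficients that would, in practice, be learned or context-derived.
a, b, c, d = torch.randn(4, dtype=torch.complex64)
filters = mobius_modulate(basis, a, b, c, d).real  # real part used as spatial filters
```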

Compact Spectral Multiplier (CSM)

Compact Spectral Multiplier (CSM) is a randomized kernel estimator that approximates high-dimensional multilinear interactions via CountSketch and FFT. It enables fast, low-memory inner-product estimation between structured tensors while preserving unbiasedness, and is particularly effective in highly correlated regimes. A minimal sketch of the underlying CountSketch/FFT primitives follows the status table below.

           Theory      Implementation   Verified   Test code
Status     COMPLETED   COMPLETED        YES        YES
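The sketch below shows the two standard primitives CSM builds on, assuming nothing about the actual CSM API: CountSketch for unbiased inner-product estimation, and the FFT-based circular convolution used by TensorSketch-style constructions to sketch outer products.

```python
import torch


def count_sketch(x: torch.Tensor, h: torch.Tensor, s: torch.Tensor, m: int) -> torch.Tensor:
    """CountSketch of a vector: entries are hashed into m buckets (h) with random signs (s)."""
    out = torch.zeros(m, dtype=x.dtype)
    out.index_add_(0, h, s * x)
    return out


d, m = 64, 256
h = torch.randint(0, m, (d,))                   # bucket hash
s = torch.randint(0, 2, (d,)).float() * 2 - 1   # random +/-1 signs
x, y = torch.randn(d), torch.randn(d)

# Unbiased inner-product estimation: E[<sketch(x), sketch(y)>] = <x, y>.
est = count_sketch(x, h, s, m) @ count_sketch(y, h, s, m)
print(est.item(), (x @ y).item())

# Higher-order interactions: the sketch of the outer product x ⊗ y equals the
# circular convolution of the two individual sketches, computed via FFT in O(m log m).
h2, s2 = torch.randint(0, m, (d,)), torch.randint(0, 2, (d,)).float() * 2 - 1
sk_xy = torch.fft.ifft(
    torch.fft.fft(count_sketch(x, h, s, m)) * torch.fft.fft(count_sketch(y, h2, s2, m))
).real
```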
