Analog AI — physical neural networks that compute with physics
Inspired by Liu Cixin's Dark Forest hypothesis
We build AI that runs on physics, not software.
Traditional AI requires massive digital infrastructure — GPUs, data centers, gigawatts of power. We're taking a different approach: analog neural networks that perform inference directly in the physical substrate, with no clock, no analog-to-digital conversion, and no von Neumann bottleneck.
The result? AI systems that operate at microwatt scales while remaining invisible in the noise floor.
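To make "inference directly in the physical substrate" concrete, here is a minimal sketch, assuming a generic continuous-time neural-network model rather than our specific circuits: node states evolve as coupled differential equations, with the weights acting as physical couplings. The sizes, time constant, weights, and inputs below are hypothetical, and the numerical integration only stands in for dynamics that analog hardware would run natively, with no clock.

```python
import numpy as np

# Illustrative model only: a tiny continuous-time ("clockless") neural network.
# In analog hardware the state x(t) would be node voltages evolving under
# physics; here we numerically integrate the same dynamics to show the idea.
#
#   tau * dx/dt = -x + W @ sigma(x) + u
#
# W, tau, and the input u are hypothetical values chosen for this sketch.

rng = np.random.default_rng(0)

n = 8                                    # number of analog "neurons" (nodes)
tau = 1e-3                               # time constant in seconds (assumed)
W = 0.4 * rng.standard_normal((n, n))    # coupling matrix (e.g. conductance ratios)
u = rng.standard_normal(n)               # constant input drive (e.g. bias currents)

def sigma(x):
    """Saturating nonlinearity, standing in for a device's I-V curve."""
    return np.tanh(x)

# Integrate the dynamics with a small Euler step; the analog circuit does this
# "for free" in continuous time, with no discrete steps and no clock.
dt = 1e-5
x = np.zeros(n)
for _ in range(2000):                    # ~20 ms of simulated settling time
    dxdt = (-x + W @ sigma(x) + u) / tau
    x = x + dt * dxdt

print("settled node states (the inference result):", np.round(x, 3))
```

The point of the sketch is that the "program" is the settling of a physical system toward equilibrium; reading out the settled state is the inference.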
| Technology | What It Does |
|---|---|
| Analog Neural Networks | Inference in physics — continuous-time, massively parallel |
| Magnetic Bubble Memory | Non-volatile compute substrates with femtojoule operations |
| FPAA Integration | Field-programmable analog arrays for reconfigurable analog compute |
| Ultra-Low Power | Microwatt-scale persistent inference |
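A back-of-envelope check of how the figures in the table above relate, assuming round numbers (one femtojoule per operation against a one-microwatt budget); these are illustrative orders of magnitude, not measured results.

```python
# Back-of-envelope arithmetic on the figures above (illustrative, not measured):
# a femtojoule per operation under a microwatt power budget.

power_budget_w = 1e-6        # 1 microwatt: the "microwatt-scale" figure
energy_per_op_j = 1e-15      # 1 femtojoule per operation

ops_per_second = power_budget_w / energy_per_op_j
print(f"{ops_per_second:.0e} ops/s sustained")        # ~1e+09 ops/s

# For scale (assumed typical order of magnitude, not a benchmark): a digital
# accelerator at ~1 pJ/op on the same 1 uW budget sustains far fewer operations.
digital_energy_per_op_j = 1e-12
print(f"{power_budget_w / digital_energy_per_op_j:.0e} ops/s at digital energies")
```

On those assumptions, a microwatt budget sustains on the order of a billion operations per second of continuous inference.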
Digital AI hits fundamental limits:
- Power wall — GPUs burn kilowatts for inference
- Memory wall — data movement costs more than computation
- Latency wall — clock cycles add up
Analog AI sidesteps all three. Our networks compute in continuous time, where "memory" and "processing" are the same physical phenomenon.
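One way to picture memory and processing collapsing into a single phenomenon is the standard resistive-crossbar model of analog in-memory compute. The sketch below assumes that model and made-up values; it is not a description of the specific substrates listed above. Stored conductances are simultaneously the weights (memory) and the multipliers (compute): Ohm's law performs each multiply, and Kirchhoff's current law performs the accumulation.

```python
import numpy as np

# Sketch of in-memory analog compute in a resistive crossbar (illustrative values).
# The stored conductances G are the weights; driving the array with voltages
# performs the whole multiply-accumulate in one physical step:
#   Ohm's law:       I_ij = G_ij * V_j            (multiply)
#   Kirchhoff's law: I_i  = sum_j G_ij * V_j      (accumulate on each output wire)

rng = np.random.default_rng(1)

G = rng.uniform(1e-6, 1e-4, size=(4, 16))   # conductances in siemens (assumed range)
V = rng.uniform(0.0, 0.2, size=16)          # input voltages in volts (assumed range)

I_out = G @ V    # output-wire currents: the matrix-vector product, done by physics
print("output currents (A):", I_out)

# The weights never leave the devices that store them, so there is no separate
# weight-fetch step: that is the sense in which the memory wall is sidestepped.
```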
- Seeing through walls with WiFi. Converting ambient RF into spatial awareness — passive sensing that reveals what cameras can't (a toy sketch of the idea follows below).
- Exploring how organisms encode morphological memory in bioelectric fields. Understanding the computational substrate of regeneration.
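To illustrate the WiFi-sensing item above: motion in a room perturbs the multipath channel, so the variance of channel state information (CSI) amplitude over time is a crude motion indicator. The sketch below runs on synthetic CSI and assumes no particular chipset, driver, or API; the window length and threshold are arbitrary.

```python
import numpy as np

# Toy illustration of passive RF sensing: movement perturbs the multipath WiFi
# channel, so variance in CSI amplitude over time indicates motion. The CSI
# trace here is synthetic; a real system would read it from a WiFi chipset.

rng = np.random.default_rng(2)
t = np.arange(0, 10, 0.01)                                  # 10 s of samples at 100 Hz

csi_amplitude = 1.0 + 0.01 * rng.standard_normal(t.size)    # static room: small noise
moving = (t > 4) & (t < 6)                                   # someone walks by, t = 4..6 s
csi_amplitude[moving] += 0.2 * np.sin(2 * np.pi * 1.5 * t[moving])  # multipath fading

# Sliding-window variance as a minimal motion detector.
window = 100                                                 # 1 s window
variance = np.array([csi_amplitude[i:i + window].var()
                     for i in range(t.size - window)])
motion_detected = variance > 5 * variance[:200].mean()       # threshold vs. quiet baseline

print("motion flagged in", motion_detected.sum(), "of", motion_detected.size, "windows")
```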
"The universe is a dark forest. Every civilization is an armed hunter stalking through the trees... trying to tread without sound."
— Liu Cixin, The Dark Forest
In Liu Cixin's universe, detection means destruction. The same principle applies to sensing: the best sensor is one that cannot be detected. Our analog systems operate below the noise floor — perceiving without revealing.