Pyrception is a framework for bio-plausible simulation of perceptual modalities. Currently it supports the visual pathways of the mammalian retina, but the long-term goal is to support other modalities, such as auditory and olfactory perception. It can also serve as an input-conversion library for encoding raw multimodal sensory input into uniform spike trains suitable for processing with spiking neural networks.
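To illustrate the kind of conversion involved (a minimal sketch using plain NumPy, not the Pyrception API), here is a simple rate code: each pixel of a grayscale image fires independently at each time step with probability proportional to its intensity.

```python
import numpy as np

def rate_code(image, n_steps=100, max_rate=0.5, seed=0):
    """Convert pixel intensities in [0, 1] into Bernoulli spike trains.

    Each pixel fires at each time step with probability proportional
    to its intensity, capped at `max_rate`. Returns a binary array of
    shape (n_steps, *image.shape).
    """
    rng = np.random.default_rng(seed)
    probs = np.clip(image, 0.0, 1.0) * max_rate
    return (rng.random((n_steps,) + image.shape) < probs).astype(np.uint8)

# A tiny 2x2 "image"; the empirical firing rate of each pixel
# approaches max_rate * intensity as n_steps grows.
image = np.array([[0.0, 0.5], [1.0, 0.25]])
spikes = rate_code(image, n_steps=1000)
print(spikes.shape)  # (1000, 2, 2)
```

This is only the simplest possible encoding; the retinal circuitry described below produces far richer spatio-temporal spike patterns.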
You can install Pyrception from PyPI:

```shell
pip install pyrception
```

or directly from GitHub (optionally in development mode):

```shell
git clone git@github.com:cantordust/pyrception.git
cd pyrception
pip install -e .
```

Please refer to the documentation, which contains a step-by-step notebook demonstrating how to use Pyrception with a static image. More notebooks are currently being developed, including frame-based RGB input and sparse event input from an event camera. Watch this space.
To generate the documentation, run the MkDocs build pipeline. Note that to build and view the documentation locally, you have to install Pyrception from GitHub with the optional `dev` extras:

```shell
pip install -e .[dev]
cd docs
mkdocs build
```

Then, to view the documentation locally, start the MkDocs server:

```shell
mkdocs serve
```

- All major types of retinal cells:
    - Receptors (raw input, Weber's law).
    - Horizontal cells (mean local brightness, normalising feedback).
    - Bipolar cells (positive and negative contrast, temporal filter, excitatory input to ganglion cells).
    - Amacrine cells (inhibitory input to ganglion cells, modulatory signal to bipolar cells).
    - Ganglion cells (spiking).
- Logpolar kernel arrangement.
- Uniform or Gaussian kernels.
- Arbitrary kernel size, shape and orientation.
- Saccadic movements [WIP].
- Colour vision (with colour opponency) [WIP].
- Temporal dynamics [WIP].
- Events as input [WIP].
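As a rough illustration of a logpolar kernel arrangement (a sketch with made-up parameters, not Pyrception's implementation): kernel centres sit on rings whose radius grows geometrically with eccentricity, so sampling is densest near the fovea and progressively coarser in the periphery.

```python
import numpy as np

def logpolar_centres(n_rings=5, n_wedges=8, r_min=1.0, growth=1.5):
    """Kernel centres on a logpolar grid.

    Ring radii grow geometrically from `r_min` by a factor of `growth`;
    wedge angles are uniformly spaced. Returns an (n_rings * n_wedges, 2)
    array of (x, y) coordinates.
    """
    radii = r_min * growth ** np.arange(n_rings)
    angles = np.linspace(0.0, 2.0 * np.pi, n_wedges, endpoint=False)
    r, a = np.meshgrid(radii, angles, indexing="ij")
    return np.stack([r * np.cos(a), r * np.sin(a)], axis=-1).reshape(-1, 2)

centres = logpolar_centres()
print(centres.shape)  # (40, 2)
```

In such a scheme, each centre would typically carry a (uniform or Gaussian) kernel whose size also scales with eccentricity, mirroring the growth of receptive fields away from the fovea.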
WIP.
WIP.