A Slurm-friendly M/EEG derivative extraction package leveraging BIDS-like data organization and DAG processing.
- Agnostic to data organization
- Slurm-friendly
- Reuse of existing derivatives through DAG processing
- YAML-based configuration
- Extensible without needing to fork the repository
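The derivative-reuse idea above can be sketched as a topological walk that skips nodes whose outputs already exist. This is a conceptual illustration, not the package's actual API; the node names and the `existing` store are made up:

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: each derivative depends on earlier ones.
dag = {"raw": set(), "filtered": {"raw"}, "epochs": {"filtered"}}

# Pretend "filtered" (and its input) were produced by an earlier run.
existing = {"raw", "filtered"}
computed = []

for node in TopologicalSorter(dag).static_order():
    if node in existing:
        continue  # reuse the stored derivative instead of recomputing it
    computed.append(node)

print(computed)  # only the missing derivative is computed
```

Walking the DAG in dependency order guarantees that when a node is missing, everything it needs is already available, either reused or freshly computed.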
Let's say you are exploring
```bash
# Clone the repository, then install in editable mode with extras
pip install -e ".[dev,test,docs]"

# Run quality checks
ruff check src/    # fix with: ruff check src/ --fix
black --check .    # fix with: black .
pytest -q

# To debug pytest, use:
pytest -q --pdb
pytest -s -q --no-cov --pdb

# Build docs
sphinx-build -b html docs docs/_build/html -W --keep-going

# Clean docs
sphinx-build -M clean docs docs/_build/html
# or
rm -rf docs/_build
```
If you get an error like:

```
  File "src/netCDF4/_netCDF4.pyx", line 5645, in netCDF4._netCDF4.Variable.__setitem__
  File "src/netCDF4/_netCDF4.pyx", line 5961, in netCDF4._netCDF4.Variable._put
  File "src/netCDF4/_netCDF4.pyx", line 2160, in netCDF4._netCDF4._ensure_nc_success
RuntimeError: NetCDF: HDF error
```

you may need to install HDF5 on your system and build h5py from source:

```bash
pip install --no-binary=h5py h5py
```

Pipelines can import additional node definitions before registering derivatives by pointing `new_definitions` to one or more Python files:
```yaml
datasets: example_pipelines/datasets_epilepsy.yml
mount_point: local
new_definitions:
  - custom_nodes/artifacts.py
  - /abs/path/to/local_nodes.py
DerivativeDefinitions:
  MyCustomDerivative:
    nodes:
      - id: 0
        node: my_custom_node  # registered inside the imported modules
```

Relative paths are resolved from the pipeline YAML's location. Each module is executed once and may call `@register_node` as part of its import.
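A custom node module might look like the sketch below. The `@register_node` decorator comes from the package, but its exact signature is an assumption here, so a stand-in registry is defined locally to keep the example self-contained:

```python
# custom_nodes/artifacts.py (illustrative sketch)

# Stand-in registry; in practice you would import register_node
# from the package rather than defining it yourself.
NODE_REGISTRY = {}

def register_node(name):
    """Register a node function under a pipeline-visible name."""
    def decorator(func):
        NODE_REGISTRY[name] = func
        return func
    return decorator

@register_node("my_custom_node")
def my_custom_node(data):
    # A node receives upstream data and returns a derivative.
    return {"n_samples": len(data)}

# Importing the module is enough: the decorator ran at import time.
print("my_custom_node" in NODE_REGISTRY)
```

Because registration happens as a side effect of the import, simply listing the file under `new_definitions` makes `my_custom_node` resolvable from the YAML.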
`iterate_derivative_pipeline` can fan out across files using joblib. You can enable it either by passing `n_jobs` (and optionally `joblib_backend` / `joblib_prefer`) when calling the orchestrator, or by adding the keys to your pipeline YAML:

```yaml
n_jobs: 4                 # -1 to use all cores; 1 or null keeps it serial
joblib_backend: loky
joblib_prefer: processes
```

The CLI mirrors these options via `--n-jobs`, `--joblib-backend`, and `--joblib-prefer`.
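The fan-out can be illustrated with plain joblib. This is a generic sketch of the dispatch pattern, not the package's internal code; `process_file` and the file names are hypothetical:

```python
from joblib import Parallel, delayed

def process_file(path):
    # Hypothetical per-file work; real nodes would load and transform data.
    return path.upper()

files = ["sub-01_task-rest_eeg.fif", "sub-02_task-rest_eeg.fif"]

n_jobs = 2  # mirrors the n_jobs key; 1 or None keeps it serial
if n_jobs in (None, 1):
    results = [process_file(f) for f in files]
else:
    results = Parallel(n_jobs=n_jobs, backend="loky")(
        delayed(process_file)(f) for f in files
    )

print(results)
```

Keeping a serial branch for `n_jobs in (None, 1)` avoids worker-pool overhead and makes debugging with `--pdb` straightforward, since no subprocesses are involved.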
You can visualize `.fif` or `.nc` files using the built-in visualization tool:

```bash
python -m neurodags.visualization path/to/your_file.fif
```

See CONTRIBUTING.md.
MIT. See LICENSE.