CVMILab-CUK/synapse

SYNAPSE: Synergizing an Adapter and Finetuning for High-Fidelity EEG Synthesis from a CLIP-Aligned Encoder

SYNAPSE is an efficient two-stage framework for high-fidelity, multi-subject EEG-to-image synthesis. It uses a pre-trained, CLIP-aligned autoencoder to condition Stable Diffusion through a lightweight adaptation module.

Highlights

  • State-of-the-Art (SOTA) FID: Achieves a SOTA FID score of 46.91 in the challenging multi-subject CVPR40 setting, a nearly 2x improvement over the previous SOTA (GWIT, 80.47).

  • High Efficiency: Uses the fewest total trainable parameters (152.69M) compared to all recent baselines (DreamDiffusion, 210M; BrainVis, 195M; GWIT, 368M), enabling the entire pipeline to be trained on a single consumer GPU (RTX 3090).

  • Direct Alignment Framework: Proposes a novel hybrid autoencoder that is pre-trained to directly align EEG signals with the CLIP embedding space, eliminating the need for the complex, indirect classification or separate mapping networks used in prior work.
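The direct-alignment idea above can be sketched with a simple objective: the Stage-1 encoder's EEG embedding is pulled toward the CLIP embedding of the corresponding image. A minimal, purely illustrative loss (not the repository's exact implementation) is one minus cosine similarity:

```python
# Illustrative sketch only; the repo's actual alignment loss may differ.
import math

def cosine_alignment_loss(eeg_embed, clip_embed):
    """1 - cos(eeg, clip); reaches 0 when the embeddings point the same way."""
    dot = sum(a * b for a, b in zip(eeg_embed, clip_embed))
    norm_eeg = math.sqrt(sum(a * a for a in eeg_embed))
    norm_clip = math.sqrt(sum(b * b for b in clip_embed))
    return 1.0 - dot / (norm_eeg * norm_clip)

# Same direction -> loss ~0; orthogonal -> loss 1
print(round(cosine_alignment_loss([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]), 6))  # 0.0
print(round(cosine_alignment_loss([1.0, 0.0], [0.0, 1.0]), 6))            # 1.0
```

Because the loss depends only on direction, not magnitude, training encourages the EEG latent to land in the same region of CLIP space as the viewed image's embedding.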

Main Results

Compare with Others

(figure: FID comparison with baseline methods)
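FID, the metric compared above, is the Fréchet distance between Gaussian fits of real and generated image features (lower is better). As a purely illustrative sketch, its closed form in one dimension reduces to:

```python
# Illustrative 1-D version; the real metric uses multivariate
# Inception-feature statistics (means and covariance matrices).
import math

def fid_1d(mu_real, var_real, mu_gen, var_gen):
    """(mu_r - mu_g)^2 + var_r + var_g - 2*sqrt(var_r * var_g)."""
    return (mu_real - mu_gen) ** 2 + var_real + var_gen - 2.0 * math.sqrt(var_real * var_gen)

print(fid_1d(0.0, 1.0, 0.0, 1.0))  # identical distributions -> 0.0
print(fid_1d(0.0, 1.0, 2.0, 1.0))  # mean shift of 2 -> 4.0
```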

Semantic Results

(figure: qualitative semantic generation results)

Baseline code: LINK



Run SYNAPSE Framework

Building the Environment

conda create --name=synapse python=3.10
conda activate synapse
pip install -r requirements.txt

Get Started


  1. Download the dataset: LINK
  2. Run the preprocessing.ipynb notebooks: LINK

Pretrained Models


Pretrained encoder: LINK

Pretrained LDM: Multi-Subject, Subject-4

Generate Images

python gen_images.py

Test Output

Run MAKE IS Dataset.ipynb: LINK

Run test_images.py

python test_images.py

Run test_IS.py

python test_IS.py
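test_IS.py evaluates the Inception Score, IS = exp(E_x[KL(p(y|x) || p(y))]), where p(y|x) are class probabilities from an Inception classifier and p(y) is their average. The following is a minimal, self-contained sketch of the formula itself (the script's actual interface and feature extractor may differ):

```python
# Illustrative Inception Score from precomputed class probabilities.
import math

def inception_score(cond_probs):
    """IS = exp( mean_x KL(p(y|x) || p(y)) ), p(y) = mean over samples x."""
    n = len(cond_probs)
    n_classes = len(cond_probs[0])
    marginal = [sum(p[j] for p in cond_probs) / n for j in range(n_classes)]
    kl_sum = 0.0
    for p in cond_probs:
        kl_sum += sum(pj * math.log(pj / mj)
                      for pj, mj in zip(p, marginal) if pj > 0)
    return math.exp(kl_sum / n)

# Identical, uncertain predictions -> IS = 1 (worst case)
print(round(inception_score([[0.5, 0.5], [0.5, 0.5]]), 6))  # 1.0
# Confident, diverse predictions -> IS = number of classes (best case)
print(round(inception_score([[1.0, 0.0], [0.0, 1.0]]), 6))  # 2.0
```

Higher scores indicate generated images that are both individually recognizable (sharp p(y|x)) and diverse across samples (broad p(y)).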

From Scratch

We currently support only DDP (distributed data parallel) mode, for code stability.

Stage1

  1. Set the config file Train_AE.json: LINK

  2. Run training from scratch:

python train_ae.py

Stage2

  1. Set the config file Train_LDM.json: LINK

  2. Run training from scratch:

python train_ldm.py

Citation

Not ready yet

Acknowledgement
