The Four Color Theorem for Cell Instance Segmentation


This is the official code repository for our paper accepted at ICML 2025: The Four Color Theorem for Cell Instance Segmentation

  • Paper Link: Read the Paper
  • Publication: International Conference on Machine Learning (ICML) 2025
  • Authors: Ye Zhang, Yu Zhou, Yifeng Wang, Jun Xiao, Ziyue Wang, Yongbing Zhang, Jianxu Chen

🌟 Highlights

  • Implementation of a novel approach applying graph coloring principles (inspired by the Four Color Theorem) to cell instance segmentation: adjacent cells receive different color labels, so touching instances remain distinguishable.
  • Integration with popular computer vision frameworks (mmsegmentation, Detectron2) for robust instance segmentation and training pipelines.
  • Support for processing various biomedical image datasets (BBBC006, DSB2018, PanNuke, Yeaz).
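To build intuition for why a four-color encoding suffices: within any single color, no two instances are adjacent, so per-color connected components recover individual instances. The sketch below illustrates this with `scipy.ndimage.label`; `decode_four_color` is a hypothetical helper written for illustration, not a function from this repository.

```python
import numpy as np
from scipy.ndimage import label

def decode_four_color(color_map):
    """Recover an instance ID map from a four-color encoding
    (0 = background, 1-4 = color labels). Since no two touching
    instances share a color, connected components within each
    color separate them. Illustrative sketch only."""
    inst = np.zeros(color_map.shape, dtype=np.int32)
    next_id = 0
    for c in range(1, 5):
        # Label connected components of this color, then offset
        # their IDs so they don't collide with earlier colors.
        comps, n = label(color_map == c)
        inst[comps > 0] = comps[comps > 0] + next_id
        next_id += n
    return inst

# Toy four-color map: two separate regions share color 1,
# and one region has color 2.
toy = np.array([[1, 1, 0, 2],
                [0, 0, 0, 2],
                [1, 1, 0, 0]])
inst = decode_four_color(toy)
```

The two regions of color 1 come back as distinct instance IDs even though they share a label in the encoding.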

🚀 Installation

This project requires Python 3.7 and was developed with specific versions of PyTorch and MMCV.

1. Create a Conda environment:

conda create -n fcis python=3.7 -y
conda activate fcis

2. Install PyTorch:

Ensure you install the version compatible with your CUDA version. The original development used CUDA 11.1.

# For CUDA 11.1
pip install torch==1.9.1+cu111 torchvision==0.10.1+cu111 torchaudio==0.9.1 -f https://download.pytorch.org/whl/torch_stable.html

3. Install MMCV-full:

Installing mmcv-full with CUDA support is crucial for leveraging GPU acceleration with the OpenMMLab frameworks.

pip install mmcv-full==1.3.13 -f https://download.openmmlab.com/mmcv/dist/cu111/torch1.9/index.html

4. Clone this repository:

git clone https://github.com/zhangye-zoe/FCIS.git
cd FCIS

5. Install the required packages and this repository:

Install the dependencies listed in the requirements.txt file, then install the repository in editable mode:

pip install -r requirements.txt
pip install -e .

📊 Dataset Preparation

This project uses the following publicly available datasets: BBBC006, DSB2018, PanNuke, and Yeaz.

Preprocessing is required to convert the raw datasets into the format expected by our training and evaluation pipelines.

For detailed instructions on how to download and preprocess the data, please refer to:

📄 Data Preparation Guide

This guide includes information on the required data structure and the scripts/notebooks located in the ./preprocessing/ directory.

💥 A frequently asked question

Difference between inst/ and fcis_inst/

  • inst/*.png: binary encoding (0: background, 1: foreground)
  • fcis_inst/*.png: four-color encoding (0: background, 1–4: color labels)
  • inst/*.npy and fcis_inst/*.npy: instance ID maps (0–N)

Note: The *.npy files in both folders are identical; the difference lies only in the *.png encodings.
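For intuition, the four-color *.png can be thought of as the result of a greedy graph coloring over the instance ID map: instances that touch must receive different labels. Below is a minimal sketch of that idea; `four_color_encode` is a hypothetical helper, not the repository's actual preprocessing code (which lives under ./preprocessing/), and a plain greedy pass can in rare layouts need more than four colors.

```python
import numpy as np

def four_color_encode(inst_map, max_colors=4):
    """Greedy coloring sketch: assign each instance a color in
    1..max_colors so that touching instances differ (0 stays
    background). Hypothetical helper for illustration only."""
    ids = [i for i in np.unique(inst_map) if i != 0]
    # Build adjacency from 4-connected pixel neighbors that
    # belong to two different nonzero instances.
    adj = {i: set() for i in ids}
    for dy, dx in ((1, 0), (0, 1)):
        h, w = inst_map.shape
        a = inst_map[:h - dy, :w - dx]
        b = inst_map[dy:, dx:]
        mask = (a != b) & (a != 0) & (b != 0)
        for u, v in zip(a[mask], b[mask]):
            adj[u].add(v)
            adj[v].add(u)
    # Greedily pick the smallest color unused by already-colored
    # neighbors (may raise if >max_colors were ever required).
    colors = {}
    for i in ids:
        used = {colors[n] for n in adj[i] if n in colors}
        colors[i] = next(c for c in range(1, max_colors + 1)
                         if c not in used)
    out = np.zeros_like(inst_map)
    for i, c in colors.items():
        out[inst_map == i] = c
    return out

# Toy example: instances 1 and 2 touch; instance 3 is isolated.
toy = np.array([[1, 1, 2, 2],
                [1, 1, 2, 2],
                [0, 0, 0, 0],
                [3, 3, 3, 3]])
enc = four_color_encode(toy)
```

Here instances 1 and 2 receive different colors because they share a border, while the isolated instance 3 can reuse any color.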

🚀 Quick Start: Inference & Visualization

Download the pre-trained model weights from: 👉 Hugging Face – FCIS Weights

To quickly evaluate the trained model and visualize the prediction results, run the following command:

python tools/test.py \
    configs/FCIS/fcis_bbbc.py \
    ./work_dirs/fcis_bbbc/fcis_bbbc.pth \
    --show \
    --show-fold ./z_visual/BBBC

🧪 Training and Inference

# Training
python tools/train.py \
    configs/FCIS/fcis_bbbc.py

# Inference
python tools/test.py \
    configs/FCIS/fcis_bbbc.py \
    ./work_dirs/fcis_bbbc/latest.pth

📖 Citation

If you find our work useful, please cite our paper:

@inproceedings{zhang2025fourcolor,
  title={The Four Color Theorem For Cell Instance Segmentation},
  author={Zhang, Ye and Zhou, Yu and Wang, Yifeng and Xiao, Jun and Wang, Ziyue and Zhang, Yongbing and Chen, Jianxu},
  booktitle={Proceedings of the 42nd International Conference on Machine Learning},
  series={Proceedings of Machine Learning Research},
  volume={267},
  pages={77194--77215},
  year={2025},
  publisher={PMLR}
}
