This repository contains the software framework used for all the experiments in the following paper: Jisoo Kim, Sungmin Kang, and Sunwoo Lee, "Layer-wise Update Recycling for Communication-Efficient Federated Learning."

The framework requires the following software:
- tensorflow2 (< 2.16.0)
- tensorflow_datasets
- python3
- mpi4py
- tqdm
To run the experiments:
- Set the hyper-parameters in `config.py` (a sketch of typical contents is shown below).
- Put the dataset files in the top directory of this program. The directory name should be the same as `dataset` in `config.py`.
- Run the training:
```
mpirun -n 8 python main.py
```
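Below is a minimal sketch of what the hyper-parameter file might look like. Only `dataset` and `checkpoint` are referenced in this README; the other parameter names are hypothetical placeholders, so consult the actual `config.py` shipped with the repository for the real options.

```python
# config.py -- illustrative sketch only, not the repository's actual file.
# `dataset` and `checkpoint` are mentioned in this README; the remaining
# names (lr, batch_size, epochs, num_workers) are hypothetical examples.

dataset = "cifar10"   # must match the name of the dataset directory
checkpoint = 1        # 1: write ./checkpoint files after every epoch, 0: skip

# Hypothetical training hyper-parameters (names are assumptions).
lr = 0.1              # client learning rate
batch_size = 32       # local mini-batch size
epochs = 100          # number of global epochs
num_workers = 8       # number of clients, matching `mpirun -n 8`
```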
This program evaluates the trained model after every epoch and then outputs the results as follows:
- `loss.txt`: An output file that contains the training loss for every epoch.
- `acc.txt`: An output file that contains the validation accuracy for every epoch.
- `./checkpoint`: The checkpoint files generated after every epoch. This directory is created only when `checkpoint` is set to 1 in `config.py`.
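The curves in `loss.txt` and `acc.txt` can be visualized with a short script. The sketch below assumes each file contains one floating-point value per line (one entry per epoch); the file names come from this README, while the script itself and the output image name are illustrative.

```python
# plot_results.py -- a minimal sketch, assuming one value per line in each file.
import matplotlib.pyplot as plt

def read_curve(path):
    """Read one floating-point value per line and return a list of floats."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

loss = read_curve("loss.txt")  # training loss per epoch
acc = read_curve("acc.txt")    # validation accuracy per epoch

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(range(1, len(loss) + 1), loss)
ax1.set(xlabel="epoch", ylabel="training loss")
ax2.plot(range(1, len(acc) + 1), acc)
ax2.set(xlabel="epoch", ylabel="validation accuracy")
fig.tight_layout()
fig.savefig("training_curves.png")  # hypothetical output file name
```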
We will provide a few key experimental results here once the paper is published.

Supported algorithms:
- FedAvg
- FedLUAR

Supported datasets:
- CIFAR-10
- CIFAR-100
- FEMNIST
- AG News
Contact:
- Jisoo Kim (starprin3@gmail.com)
- Sunwoo Lee (sunwool@inha.ac.kr)