This repository provides the official implementation of "CrossMPT: Cross-attention message-passing transformer for error correcting codes" (ICLR 2025).
Paper link: https://openreview.net/forum?id=gFvRRCnQvX
Figure: (a) BER performance of various decoders (BP, Hyp BP, AR BP, ECCT) and CrossMPT for the (31, 16) BCH code.
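As described in the paper, CrossMPT processes two separate representations of the received word (the bit magnitudes and the syndrome) and exchanges information between them through cross-attention. The block below is a minimal, generic PyTorch sketch of such a cross-attention layer. The class name, shapes, masking, and feed-forward sizes are illustrative assumptions, not the exact CrossMPT layer; the real architecture is defined in Main_CrossMPT.py.

```python
import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    """Illustrative cross-attention block: one embedding sequence (queries)
    attends to another (keys/values). Generic sketch, not the exact CrossMPT layer."""

    def __init__(self, d_model: int = 128, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x_q, x_kv, attn_mask=None):
        # attn_mask could, for example, be derived from the parity-check matrix H
        # so that each bit attends only to related syndrome positions (assumption).
        h, _ = self.attn(x_q, x_kv, x_kv, attn_mask=attn_mask)
        x = self.norm1(x_q + h)
        return self.norm2(x + self.ffn(x))

# Toy usage with shapes loosely matching the (121, 60) LDPC example below:
# n = 121 bit-magnitude embeddings attend to n - k = 61 syndrome embeddings.
magnitude = torch.randn(2, 121, 128)   # (batch, n, d_model)
syndrome = torch.randn(2, 61, 128)     # (batch, n - k, d_model)
out = CrossAttentionBlock()(magnitude, syndrome)
print(out.shape)                       # torch.Size([2, 121, 128])
```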
- PyTorch
Commands for training CrossMPT on GPU 0 with 6 decoder layers and embedding dimension 128, for the (121, 60) LDPC code and the (31, 16) BCH code:
python Main_CrossMPT.py --gpu=0 --N_dec=6 --d_model=128 --code_type=LDPC --code_n=121 --code_k=60
python Main_CrossMPT.py --gpu=0 --N_dec=6 --d_model=128 --code_type=BCH --code_n=31 --code_k=16 --standardize

Training arguments (an illustrative parser sketch is given after this list):
--epochs number of training epochs
--batch_size batch size
--code_type code type
--code_k code dimension k (number of information bits)
--code_n codeword length n
--N_dec number of decoder layers N
--d_model embedding vector dimension d
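For orientation, here is a minimal argparse sketch showing how the flags listed above might be wired up. It is an illustration only: the flag defaults shown are placeholders, and the actual parser and the exact meaning of --standardize are defined in Main_CrossMPT.py.

```python
import argparse

# Illustrative sketch only; the real CLI is defined in Main_CrossMPT.py.
parser = argparse.ArgumentParser(description="Train CrossMPT (illustrative parser)")
parser.add_argument("--gpu", type=int, default=0, help="GPU index")
parser.add_argument("--epochs", type=int, default=1000, help="number of training epochs (placeholder default)")
parser.add_argument("--batch_size", type=int, default=128, help="batch size (placeholder default)")
parser.add_argument("--code_type", type=str, default="BCH", help="code type, e.g. BCH or LDPC")
parser.add_argument("--code_k", type=int, default=16, help="code dimension k")
parser.add_argument("--code_n", type=int, default=31, help="code length n")
parser.add_argument("--N_dec", type=int, default=6, help="number of decoder layers N")
parser.add_argument("--d_model", type=int, default=128, help="embedding vector dimension d")
parser.add_argument("--standardize", action="store_true",
                    help="use a standardized form of the parity-check matrix (assumed meaning)")

if __name__ == "__main__":
    args = parser.parse_args()
    print(args)
```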
@inproceedings{
Park2025crossmpt,
title={Cross{MPT}: Cross-attention Message-passing Transformer for Error Correcting Codes},
author={Seong-Joon Park and Hee-Youl Kwak and Sang-Hyo Kim and Yongjune Kim and Jong-Seon No},
booktitle={The Thirteenth International Conference on Learning Representations},
year={2025},
url={https://openreview.net/forum?id=gFvRRCnQvX}
}
The code is available only for non-commercial research purposes.
This repository builds on the implementation of ECCT (Error Correction Code Transformer).