Paper accepted at TMLR (01/25) - [ArXiv Link](https://arxiv.org/abs/2402.12030)
This repository contains the implementation of ULD-Loss. For detailed documentation, please refer to each submodule.
To initialize the repository, run the following commands:

```shell
git clone --recursive https://github.com/Diabolocom-Research/ULD-Loss
cd ULD-Loss
git submodule update --init --recursive
```

If you use ULD-Loss in your research, please cite our paper:
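The core idea of the Universal Logit Distillation loss is to compare teacher and student output distributions without requiring a shared tokenizer, by sorting each probability vector and measuring their distance. The sketch below is a minimal illustration of that idea, not the repository's implementation; the function name `uld_loss` and the zero-padding scheme are assumptions for illustration (refer to the submodules for the actual code):

```python
import torch
import torch.nn.functional as F

def uld_loss(student_logits: torch.Tensor, teacher_logits: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a cross-tokenizer logit distillation loss.

    Sorting each probability distribution makes the comparison independent
    of vocabulary ordering, so teacher and student may use different
    tokenizers (and different vocabulary sizes).
    """
    s = torch.softmax(student_logits, dim=-1)
    t = torch.softmax(teacher_logits, dim=-1)

    # Zero-pad the smaller vocabulary so both tensors share a last dimension.
    pad = s.size(-1) - t.size(-1)
    if pad > 0:
        t = F.pad(t, (0, pad))
    elif pad < 0:
        s = F.pad(s, (0, -pad))

    # Sort probabilities in descending order, then take the L1 distance.
    s_sorted, _ = torch.sort(s, dim=-1, descending=True)
    t_sorted, _ = torch.sort(t, dim=-1, descending=True)
    return (s_sorted - t_sorted).abs().sum(dim=-1).mean()
```

In practice this term is typically combined with the standard cross-entropy loss on ground-truth labels; see the paper for the exact formulation and weighting.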
```bibtex
@misc{boizard2025crosstokenizerdistillationuniversallogit,
      title={Towards Cross-Tokenizer Distillation: The Universal Logit Distillation Loss for LLMs},
      author={Nicolas Boizard and Kevin El Haddad and Céline Hudelot and Pierre Colombo},
      year={2025},
      eprint={2402.12030},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      journal={TMLR},
      url={https://arxiv.org/abs/2402.12030},
}
```