# ULD-Loss

**Towards Cross-Tokenizer Distillation: The Universal Logit Distillation Loss for LLMs**

Paper accepted at TMLR (01/25): [arXiv](https://arxiv.org/abs/2402.12030)

This repository contains the implementation of ULD-Loss. For detailed documentation, please refer to the individual submodules.
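As a rough illustration of the core idea only (this is a hedged sketch, not the code in the submodules): ULD compares teacher and student next-token distributions across different tokenizers by sorting each probability vector and taking the L1 distance between the sorted vectors, which is a closed-form optimal-transport term under uniform cost. The function name, the single-position scope, and the zero-padding convention for mismatched vocabulary sizes are assumptions of this sketch.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a 1-D logit vector."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def uld_loss(student_logits, teacher_logits):
    """Sketch of a Universal Logit Distillation term for one next-token
    position. Vocabulary sizes may differ, so the shorter probability
    vector is zero-padded (an assumption of this sketch) before the
    descending-sorted vectors are compared."""
    p_s = softmax(np.asarray(student_logits, dtype=float))
    p_t = softmax(np.asarray(teacher_logits, dtype=float))
    size = max(p_s.size, p_t.size)
    p_s = np.pad(p_s, (0, size - p_s.size))
    p_t = np.pad(p_t, (0, size - p_t.size))
    # Sort both distributions in descending order and take the L1
    # distance: identical sorted profiles give zero loss regardless of
    # which token ids carry the mass, so no vocabulary alignment is needed.
    return float(np.abs(np.sort(p_s)[::-1] - np.sort(p_t)[::-1]).sum())
```

For example, two distributions that assign the same sorted mass to different token ids yield a loss of zero, while distributions of different shape yield a positive loss.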

## Installation

To initialize the repository, run the following commands:

```sh
git clone --recursive https://github.com/Diabolocom-Research/ULD-Loss
cd ULD-Loss
git submodule update --init --recursive
```

## Citation

If you use ULD-Loss in your research, please cite our paper:

```bibtex
@misc{boizard2025crosstokenizerdistillationuniversallogit,
  title={Towards Cross-Tokenizer Distillation: The Universal Logit Distillation Loss for LLMs},
  author={Nicolas Boizard and Kevin El Haddad and Céline Hudelot and Pierre Colombo},
  year={2025},
  eprint={2402.12030},
  archivePrefix={arXiv},
  journal={TMLR},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2402.12030},
}
```
