EFormer: An Effective Edge-based Transformer for Vehicle Routing Problems

The PyTorch implementation of the IJCAI 2025 paper -- EFormer: An Effective Edge-based Transformer for Vehicle Routing Problems.

EFormer is an Edge-based Transformer model that uses edges as the sole input for VRPs. Our approach employs a precoder module with a mixed-score attention mechanism to convert edge information into temporary node embeddings. We also present a parallel encoding strategy with a graph encoder and a node encoder, which process graph and node embeddings in distinct feature spaces, respectively. This design yields a more comprehensive representation of the global relationships among edges. In the decoding phase, parallel context embedding and multi-query integration compute separate attention over the two encoded embeddings, facilitating efficient path construction. We train EFormer with reinforcement learning in an autoregressive manner.
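To make the mixed-score idea concrete, here is a minimal single-head sketch in PyTorch: attention scores are a learned blend of the usual scaled dot-product and a raw edge score (e.g. a distance matrix). The layer sizes, the 2-layer mixing MLP, and all names below are illustrative assumptions, not the paper's exact architecture or hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedScoreAttention(nn.Module):
    """Illustrative sketch: attention whose scores mix the scaled
    dot-product with a raw edge score, in the spirit of the precoder
    described above (sizes and mixing MLP are assumptions)."""

    def __init__(self, dim: int, mix_hidden: int = 16):
        super().__init__()
        self.q = nn.Linear(dim, dim, bias=False)
        self.k = nn.Linear(dim, dim, bias=False)
        self.v = nn.Linear(dim, dim, bias=False)
        # Small MLP mixing (dot-product score, edge score) -> one score.
        self.mix = nn.Sequential(
            nn.Linear(2, mix_hidden), nn.ReLU(), nn.Linear(mix_hidden, 1)
        )
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
        # x:     (batch, n, dim)  temporary node embeddings
        # edges: (batch, n, n)    raw edge information, e.g. distances
        q, k, v = self.q(x), self.k(x), self.v(x)
        dot = torch.einsum("bid,bjd->bij", q, k) * self.scale
        both = torch.stack((dot, edges), dim=-1)    # (batch, n, n, 2)
        score = self.mix(both).squeeze(-1)          # (batch, n, n)
        return torch.matmul(F.softmax(score, dim=-1), v)

if __name__ == "__main__":
    batch, n, dim = 2, 10, 32
    coords = torch.rand(batch, n, 2)
    dist = torch.cdist(coords, coords)              # (batch, n, n) edge input
    x = torch.rand(batch, n, dim)
    out = MixedScoreAttention(dim)(x, dist)
    print(out.shape)                                # torch.Size([2, 10, 32])
```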

Overview

(Figure: overview of the EFormer architecture.)

Download datasets and models

Download datasets and models from Hugging Face.

Unzip TSP-results.zip and CVRP-results.zip, and organize the files in the project directory as follows:

EFormer
├─ TSP
│  ├─ data
│  └─ results
└─ CVRP
   ├─ data
   └─ results
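As a quick sanity check after unzipping, this small Python snippet verifies that the folders sit where the tree above expects them. The `EFormer` root path and the helper itself are illustrative, not part of the repository:

```python
from pathlib import Path

# Check that each problem directory contains the unzipped data/results folders.
ROOT = Path("EFormer")
for problem in ("TSP", "CVRP"):
    for sub in ("data", "results"):
        p = ROOT / problem / sub
        print(f"{p}: {'ok' if p.is_dir() else 'MISSING'}")
```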

Dependencies

Python >= 3.8
PyTorch >= 2.0.1
numpy==1.24.4
matplotlib==3.5.2 
tqdm==4.67.1

Citation

If this repository is helpful for your research, please cite our paper:

@inproceedings{ijcai2025p954,
  title     = {EFormer: An Effective Edge-based Transformer for Vehicle Routing Problems},
  author    = {Meng, Dian and Cao, Zhiguang and Wu, Yaoxin and Hou, Yaqing and Ge, Hongwei and Zhang, Qiang},
  booktitle = {Proceedings of the Thirty-Fourth International Joint Conference on
               Artificial Intelligence, {IJCAI-25}},
  publisher = {International Joint Conferences on Artificial Intelligence Organization},
  pages     = {8582--8590},
  year      = {2025},
  month     = {8},
  doi       = {10.24963/ijcai.2025/954},
  url       = {https://doi.org/10.24963/ijcai.2025/954},
}

Acknowledgments
