
Coeff-Tuning: A Graph Filter Subspace View for Tuning Attention-Based Large Models

This is the official implementation of the paper "Coeff-Tuning: A Graph Filter Subspace View for Tuning Attention-Based Large Models".

(Figure: overview of Coeff-Tuning.)

Coeff-Tuning introduces a novel parameter-efficient fine-tuning approach that views attention mechanisms through the lens of graph signal processing. The key insights are:

  • It models attention as a graph convolutional filter operating on node features in a graph structure.
  • It identifies that multi-head attention maps can be viewed as a subspace of graph convolutional filters.
  • It tunes only the subspace coefficients of the graph convolutional filters, rather than all model parameters.
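As a rough illustration of the last point, the sketch below recombines frozen per-head attention maps with a small learnable coefficient matrix. This is a hedged sketch of the general idea, not the repository's actual implementation; the class name `CoeffAttention` and the identity initialization are assumptions.

```python
import torch
import torch.nn as nn

class CoeffAttention(nn.Module):
    """Hypothetical sketch: treat the frozen multi-head attention maps as a
    filter subspace and learn only the mixing coefficients over that subspace."""

    def __init__(self, num_heads):
        super().__init__()
        # Learnable coefficients over the head subspace; identity init means
        # tuning starts exactly from the frozen pre-trained model (assumption).
        self.coeff = nn.Parameter(torch.eye(num_heads))

    def forward(self, attn_maps):
        # attn_maps: (batch, heads, tokens, tokens), frozen attention maps.
        # Each output filter is a learned linear combination of the heads.
        return torch.einsum('gh,bhij->bgij', self.coeff, attn_maps)
```

Only `coeff` (heads x heads values per layer) would be trained, which is what makes the method parameter-efficient.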

Usage

Toy Example

Reproduce the toy-example results in the paper, which show that the effectiveness of Coeff-Tuning stems from breaking the convex combination of the attention maps, thereby enhancing the expressiveness of the attention.


python toy_example.py
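The "convex combination" intuition can be checked in a few lines, assuming standard softmax attention: each softmax row is non-negative and sums to 1, while a learned (possibly negative) coefficient lifts that constraint.

```python
import torch

# A softmax attention row is a convex combination: all entries are
# non-negative and each row sums to 1, which limits what the filter can do.
attn = torch.softmax(torch.randn(4, 4), dim=-1)
row_sums = attn.sum(dim=-1)  # all close to 1

# A learned coefficient (here a hypothetical negative value) breaks the
# convex-combination constraint, e.g. enabling high-pass-like filtering.
coeff = torch.tensor(-0.5)
filtered = coeff * attn  # rows no longer sum to 1; entries can be negative
```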

Tuning Vision Transformer for few-shot classification

Please see the visual_classification directory for details.

Citation

If you find this work useful in your research, please consider citing:

@inproceedings{miao2025coeff,
  title={Coeff-Tuning: A Graph Filter Subspace View for Tuning Attention-Based Large Models},
  author={Miao, Zichen and Chen, Wei and Qiu, Qiang},
  booktitle={Proceedings of the Computer Vision and Pattern Recognition Conference},
  pages={20146--20157},
  year={2025}
}

Acknowledgement

We thank SSF, FacT, and DoRA for their amazing work.

About

[CVPR 2025 Highlight] Code for the paper "Coeff-Tuning: A Graph Filter Subspace View for Tuning Attention-Based Large Models"
