RSRWKV

The official implementation of "RSRWKV: A Linear-Complexity 2D Attention Mechanism for Efficient Remote Sensing Vision Task" (https://arxiv.org/abs/2503.20382).

News

  • 2025/09/23: We release the code and models of RSRWKV.

Overview

[Figure: overall architecture of RSRWKV]

RSRWKV is a novel adaptation of the RWKV architecture for high-resolution remote sensing analysis.
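For background, RSRWKV builds on RWKV's linear-time token mixing. The 1D WKV recurrence underlying RWKV can be sketched as below; this is a naive per-channel reference implementation for illustration only (the names `w`, `u`, `k`, `v` follow common RWKV notation, not this repository's code), not the paper's 2D variant:

```python
import numpy as np

def wkv_1d(w, u, k, v):
    """Naive linear-time 1D WKV recurrence (RWKV-style), single channel.

    w: decay rate (> 0), u: bonus applied to the current token,
    k, v: key/value sequences of shape (T,).
    Runs in O(T) by carrying a running numerator/denominator state.
    """
    T = len(k)
    out = np.zeros(T)
    a, b = 0.0, 0.0  # running exp-weighted sums: numerator and denominator
    for t in range(T):
        # output mixes accumulated past state with the current token (bonus u)
        cur = np.exp(u + k[t])
        out[t] = (a + cur * v[t]) / (b + cur)
        # fold the current token into the state with decay e^{-w}
        a = np.exp(-w) * a + np.exp(k[t]) * v[t]
        b = np.exp(-w) * b + np.exp(k[t])
    return out
```

The 2D-WKV scanning strategy in the paper extends this kind of recurrence to traverse feature maps along spatial axes, keeping the overall cost linear in the number of tokens rather than quadratic as in softmax attention.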

It introduces a Linear-Complexity 2D Attention Mechanism through the 2D-WKV scanning strategy, enabling efficient isotropic context aggregation. The framework integrates:

  • 2D-WKV: bridges sequential processing with spatial reasoning.
  • MVC-Shift Module: enhances multiscale receptive field coverage.
  • Efficient Channel Attention (ECA): improves cross-channel interaction and semantic saliency.
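The ECA component follows the published ECA-Net design: global average pooling followed by a 1D convolution across the channel descriptor and a sigmoid gate. A minimal NumPy sketch of that mechanism (the convolution weights here are fixed and illustrative; in the real module they are learned, and this is not the repository's implementation):

```python
import numpy as np

def eca(x, kernel_size=3):
    """Efficient Channel Attention sketch on a (C, H, W) feature map."""
    C = x.shape[0]
    y = x.mean(axis=(1, 2))                 # global average pool -> (C,)
    pad = kernel_size // 2
    yp = np.pad(y, pad, mode="edge")        # pad so the conv preserves length C
    w = np.ones(kernel_size) / kernel_size  # illustrative weights (learned in practice)
    attn = np.array([np.dot(yp[i:i + kernel_size], w) for i in range(C)])
    attn = 1.0 / (1.0 + np.exp(-attn))      # sigmoid gate per channel
    return x * attn[:, None, None]          # rescale each channel
```

The 1D convolution lets each channel's weight depend only on its `kernel_size` neighbors, which is what keeps ECA cheap compared with fully connected channel-attention blocks.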

Experiments on NWPU RESISC45, VHR-10 v2, SSDD, and GLHWater show that RSRWKV outperforms CNN and Transformer baselines on classification, detection, and segmentation tasks.

Models

All model weights and logs are available at Baidu Drive.

Citation

If this work is helpful for your research, please consider citing it with the following BibTeX entry.

@article{li2025rsrwkv,
  title={RSRWKV: A Linear-Complexity 2D Attention Mechanism for Efficient Remote Sensing Vision Task},
  author={Li, Chunshan and Wang, Rong and Yang, Xiaofei and Chu, Dianhui},
  journal={arXiv preprint arXiv:2503.20382},
  year={2025}
}
