Enhancing Spiking Transformers with Binary Attention Mechanisms
Published in The Second Tiny Papers Track at ICLR 2024
Summary:
This paper revisits attention in Spiking Transformers and proposes a binary attention mechanism to make attention computation more spike-friendly. The method reduces computational overhead and promotes sparsity while maintaining competitive performance on both image and neuromorphic benchmarks.
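The paper's exact binarization scheme is not reproduced here, but a minimal PyTorch sketch can illustrate the general idea: spike-valued Q, K, and V, with the softmax over real-valued scores replaced by a thresholded, binary attention map. The class name, the fixed threshold, and the straight-through surrogate gradient below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class BinaryAttentionSketch(nn.Module):
    """Hypothetical binary attention: Q, K, V are binary spike tensors and
    the attention map is binarized by a threshold instead of softmax.
    The threshold value and the straight-through gradient are assumptions,
    not necessarily the paper's exact formulation."""

    def __init__(self, dim: int, num_heads: int = 8, threshold: float = 0.5):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.threshold = threshold
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)

    @staticmethod
    def _spike(x: torch.Tensor) -> torch.Tensor:
        # Heaviside step with a straight-through gradient, a common
        # surrogate-gradient trick in SNN training.
        return (x > 0).float() + x - x.detach()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        b, n, _ = x.shape
        shape = (b, n, self.num_heads, self.head_dim)
        q = self._spike(self.q_proj(x)).view(shape).transpose(1, 2)  # binary
        k = self._spike(self.k_proj(x)).view(shape).transpose(1, 2)  # binary
        v = self._spike(self.v_proj(x)).view(shape).transpose(1, 2)  # binary
        # Product of binary matrices: counts of co-active features,
        # so it can be computed with accumulations instead of multiplications.
        scores = q @ k.transpose(-2, -1) / self.head_dim
        # Binarize the attention map instead of applying softmax, keeping
        # the attention itself spike-friendly and sparse.
        attn = self._spike(scores - self.threshold)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out_proj(out)


if __name__ == "__main__":
    layer = BinaryAttentionSketch(dim=64, num_heads=4)
    y = layer(torch.randn(2, 16, 64))
    print(y.shape)  # torch.Size([2, 16, 64])
```

Because every operand of the attention product is binary, the score computation reduces to integer accumulation, which is where the claimed reduction in computational overhead and the increased sparsity come from.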
Bibtex:
@inproceedings{shen2024binaryattention,
  title={Enhancing Spiking Transformers with Binary Attention Mechanisms},
  author={Guobin Shen and Dongcheng Zhao and Sicheng Shen and Yi Zeng},
  booktitle={The Second Tiny Papers Track at ICLR 2024},
  year={2024},
  url={https://openreview.net/forum?id=6X3TNqLb5t}
}
Recommended citation: Shen, G., Zhao, D., Shen, S., & Zeng, Y. (2024). Enhancing Spiking Transformers with Binary Attention Mechanisms. In The Second Tiny Papers Track at ICLR 2024.
