
Conference item

Linear complexity self-attention with 3rd order polynomials

Abstract:
Self-attention mechanisms and non-local blocks have become crucial building blocks for state-of-the-art neural architectures thanks to their unparalleled ability to capture long-range dependencies in the input. However, their cost is quadratic in the number of spatial positions, making their use impractical in many real-world applications. In this work, we analyze these methods through a polynomial lens and show that self-attention can be seen as a special case of a 3rd order polynomial. Within this polynomial framework, we are able to design polynomial operators capable of accessing the same data pattern as non-local and self-attention blocks while reducing the complexity from quadratic to linear. As a result, we propose two modules (Poly-NL and Poly-SA) that can be used as "drop-in" replacements for more complex non-local and self-attention layers in state-of-the-art CNNs and ViT architectures. Our modules can achieve comparable, if not better, […]
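
For illustration only (this is a minimal sketch of the complexity argument, not the paper's exact Poly-NL or Poly-SA formulation; the shapes and weight names below are assumptions), the following contrasts a pairwise attention-style map, whose cost is quadratic in the number of positions N, with a 3rd order polynomial term whose cost is linear in N:

```python
import numpy as np

# Minimal sketch, NOT the paper's exact Poly-NL / Poly-SA layers: both outputs
# below are 3rd order polynomials in the input X, but the second one avoids
# building the (N x N) affinity matrix.
N, d = 1024, 64                                  # positions, channels (assumed)
rng = np.random.default_rng(0)
X = rng.standard_normal((N, d))
W1, W2, W3 = (rng.standard_normal((d, d)) for _ in range(3))

# Non-local / self-attention style pairwise interaction: O(N^2 * d).
affinity = (X @ W1) @ (X @ W2).T                 # (N, N) pairwise map
Y_quadratic = affinity @ (X @ W3)                # (N, d)

# Linear-complexity alternative: pool a global channel-wise context first,
# then modulate every position with it: O(N * d^2), no (N x N) matrix.
context = ((X @ W1) * (X @ W2)).mean(axis=0)     # (d,) aggregated over positions
Y_linear = (X * context) @ W3                    # (N, d); X appears 3 times -> 3rd order
```

Both terms mix information from all N positions into every output position; the second does so through a pooled statistic rather than an explicit N-by-N affinity matrix, which is what makes its complexity linear in N.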
Publication status:
Published
Peer review status:
Peer reviewed

Access Document


Files:
Publisher copy:
10.1109/TPAMI.2022.3231971

Authors



Publisher:
IEEE
Host title:
Proceedings of the 2023 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2023)
Publication date:
2023-03-20
Acceptance date:
2023-02-27
Event title:
IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2023)
Event location:
Vancouver, Canada
Event website:
https://ibug.doc.ic.ac.uk/media/uploads/documents/linear_complexity_self-attention_with_3textrd_order_polynomials.pdf
Event start date:
2023-06-18
Event end date:
2023-06-22
DOI:
10.1109/TPAMI.2022.3231971
EISSN:
1939-3539
ISSN:
0162-8828


Language:
English
Keywords:
Pubs id:
1495101
Local pid:
pubs:1495101
Deposit date:
2023-07-24
