Optoelectronics Letters, Vol. 21, Issue 2, 113 (2025)
Huibiao YE, Weiming FAN, Yuping GUO, Xuna WANG, and Dalin ZHOU
DOI: 10.1007/s11801-025-4185-7
YE Huibiao, FAN Weiming, GUO Yuping, WANG Xuna, ZHOU Dalin. Detection using mask adaptive transformers in unmanned aerial vehicle imagery[J]. Optoelectronics Letters, 2025, 21(2): 113

Abstract

Drone photography is an essential building block of intelligent transportation, enabling wide-ranging monitoring, precise positioning, and rapid transmission. However, the high computational cost of transformer-based object detection methods hinders real-time result transmission in drone target detection applications. Therefore, we propose a mask adaptive transformer (MAT) tailored to such scenarios. Specifically, we introduce a structure that supports collaborative token sparsification within support windows, enhancing fault tolerance and reducing computational overhead. This structure comprises two modules: a binary mask strategy and adaptive window self-attention (A-WSA). The binary mask strategy focuses attention on significant objects across a variety of complex scenes. The A-WSA mechanism is employed to self-attend over the selected objects, balancing performance and computational cost while isolating all contextual leakage. Extensive experiments on the challenging CARPK and VisDrone datasets demonstrate the effectiveness and superiority of the proposed method. Specifically, it achieves a 1.25% mean average precision (mAP@0.5) improvement over the car detector based on you only look once version 5 (CD-YOLOv5) on the CARPK dataset and a 3.75% average precision (AP@0.5) improvement over the cascaded zoom-in detector (CZ Det) on the VisDrone dataset.
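
For illustration, the sketch below shows one way a binary token mask could be combined with window self-attention in PyTorch. It is a minimal sketch under stated assumptions, not the authors' MAT implementation: the module name MaskedWindowAttention, the learned saliency score, the keep_ratio threshold, and all tensor dimensions are hypothetical choices introduced here only to make the idea concrete.

# Minimal sketch: a binary token mask applied inside window self-attention.
# NOT the authors' MAT code; names, scoring, and shapes are assumptions.
import torch
import torch.nn as nn


class MaskedWindowAttention(nn.Module):
    def __init__(self, dim: int, window_size: int = 7, num_heads: int = 4,
                 keep_ratio: float = 0.5):
        super().__init__()
        self.window_size = window_size
        self.num_heads = num_heads
        self.keep_ratio = keep_ratio          # fraction of tokens kept per window (assumed)
        self.score = nn.Linear(dim, 1)        # per-token saliency score (assumed)
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, H, W, C) feature map; H and W assumed divisible by window_size
        B, H, W, C = x.shape
        ws = self.window_size
        # Partition into non-overlapping windows: (B * num_windows, ws*ws, C)
        x = x.view(B, H // ws, ws, W // ws, ws, C)
        x = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)

        # Binary mask: keep only the top-k most salient tokens in each window.
        scores = self.score(x).squeeze(-1)                     # (B*nW, ws*ws)
        k = max(1, int(self.keep_ratio * ws * ws))
        topk = scores.topk(k, dim=-1).indices
        mask = torch.zeros_like(scores, dtype=torch.bool)
        mask.scatter_(1, topk, True)                           # True = kept token

        # Window self-attention restricted to kept tokens: masked-out tokens are
        # neither attended to nor propagated, limiting contextual leakage.
        head_dim = C // self.num_heads
        qkv = self.qkv(x).reshape(x.shape[0], ws * ws, 3, self.num_heads, head_dim)
        q, key, v = qkv.permute(2, 0, 3, 1, 4)                 # each (B*nW, heads, N, d)
        attn = (q @ key.transpose(-2, -1)) / head_dim ** 0.5
        attn = attn.masked_fill(~mask[:, None, None, :], float('-inf'))
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(x.shape[0], ws * ws, C)
        out = self.proj(out) * mask.unsqueeze(-1).to(x.dtype)  # zero masked queries

        # Merge windows back to (B, H, W, C)
        out = out.view(B, H // ws, W // ws, ws, ws, C)
        return out.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)


if __name__ == "__main__":
    feats = torch.randn(2, 28, 28, 64)        # dummy UAV feature map
    block = MaskedWindowAttention(dim=64)
    print(block(feats).shape)                 # torch.Size([2, 28, 28, 64])

In this sketch the mask is derived from a learned per-token score and a fixed keep ratio; the paper's binary mask strategy and the adaptive window selection of A-WSA may determine which tokens and window sizes are kept differently.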