• Remote Sensing Technology and Application
  • Vol. 39, Issue 3, 590 (2024)
Zhaohua HU and Yuhui LI
Author Affiliations
  • School of Electronic and Information Engineering, Nanjing University of Information Science and Technology, Nanjing 210044, China
    DOI: 10.11873/j.issn.1004-0323.2024.3.0590
    Zhaohua HU, Yuhui LI. Object Detection in Remote Sensing Images based on YOLOX-Tiny Biased Feature Fusion Network[J]. Remote Sensing Technology and Application, 2024, 39(3): 590

    Abstract

    Remote sensing object detection is of great significance in fields such as environmental monitoring and circuit inspection. However, remote sensing images pose challenges such as large variations in target scale, a large number of small targets, high inter-class similarity, and high intra-class diversity, all of which lead to low detection accuracy. To address these problems, a remote sensing object detection model based on YOLOX-Tiny is proposed. First, the multi-scale feature fusion network is improved to fully exploit shallow detail information and deep semantic information, enhancing the detection of small targets. Second, deformable convolution is introduced at the prediction end to improve the model's robustness to targets of different scales and shapes. Finally, the SIoU loss function is used to move the prediction box in the correct direction, further improving the model's localization accuracy. Experiments on the remote sensing datasets DIOR and RSOD show that, without increasing the number of parameters, the improved model achieves detection accuracies of 73.68% and 97.12%, respectively, higher than those of several other state-of-the-art models, with a high recognition rate for overlapping targets and good real-time performance.
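    The SIoU loss mentioned above augments the plain IoU loss with angle, distance, and shape penalties so that the gradient first drags the prediction box toward the correct axis, then closes the remaining offset. The abstract does not give the formulation used in the paper; the sketch below follows one common published formulation of SIoU (boxes as center-format `(cx, cy, w, h)`, shape-cost exponent `theta=4`) and is an illustration, not the authors' exact implementation.

    ```python
    import math

    def iou_xywh(a, b, eps=1e-9):
        """Plain IoU for two center-format (cx, cy, w, h) boxes."""
        ax1, ay1, ax2, ay2 = a[0]-a[2]/2, a[1]-a[3]/2, a[0]+a[2]/2, a[1]+a[3]/2
        bx1, by1, bx2, by2 = b[0]-b[2]/2, b[1]-b[3]/2, b[0]+b[2]/2, b[1]+b[3]/2
        iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
        ih = max(0.0, min(ay2, by2) - max(ay1, by1))
        inter = iw * ih
        union = a[2]*a[3] + b[2]*b[3] - inter
        return inter / (union + eps)

    def siou_loss(pred, gt, theta=4.0, eps=1e-9):
        """SIoU loss sketch: 1 - IoU + (distance cost + shape cost) / 2."""
        cx, cy, w, h = pred
        gx, gy, gw, gh = gt
        # Angle cost: rewards moving the box along the nearer coordinate axis.
        sigma = math.hypot(gx - cx, gy - cy) + eps   # center distance
        sin_alpha = min(abs(gy - cy) / sigma, 1.0)
        angle = math.sin(2.0 * math.asin(sin_alpha))
        # Distance cost, normalized by the smallest enclosing box (cw, ch)
        # and modulated by the angle cost through gamma.
        cw = max(cx + w/2, gx + gw/2) - min(cx - w/2, gx - gw/2)
        ch = max(cy + h/2, gy + gh/2) - min(cy - h/2, gy - gh/2)
        gamma = 2.0 - angle
        dist = ((1 - math.exp(-gamma * ((gx - cx) / (cw + eps))**2))
              + (1 - math.exp(-gamma * ((gy - cy) / (ch + eps))**2)))
        # Shape cost: penalizes width/height mismatch between the boxes.
        shape = ((1 - math.exp(-abs(w - gw) / (max(w, gw) + eps)))**theta
               + (1 - math.exp(-abs(h - gh) / (max(h, gh) + eps)))**theta)
        return 1.0 - iou_xywh(pred, gt) + (dist + shape) / 2.0
    ```

    For identical boxes every penalty vanishes and the loss is zero; for misaligned boxes the penalties add to the `1 - IoU` term, which is what gives the predicted box a directional pull even when the plain IoU gradient is weak.
    
    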