• Optics and Precision Engineering
  • Vol. 29, Issue 9, 2222 (2021)
Xiao-dong MU1, Kun BAI1,*, Xuan-ang YOU1, Yong-qing ZHU1, and Xue-bing CHEN2
Author Affiliations
  • 1College of Operational Support, Rocket Force University of Engineering, Xi’an 710025, China
  • 2Unit 61068, Xi’an 710100, China
    DOI: 10.37188/OPE.20212909.2222
    Xiao-dong MU, Kun BAI, Xuan-ang YOU, Yong-qing ZHU, Xue-bing CHEN. Remote sensing image feature extraction and classification based on contrastive learning method[J]. Optics and Precision Engineering, 2021, 29(9): 2222

    Abstract

    To address the lack of labeled data in deep-learning-based feature extraction and classification of remote sensing images, a simple contrastive learning method with an asymmetric predictor is proposed. First, the input image is augmented by horizontal flipping, color jittering, and grayscale conversion to obtain two related views of the same image. These views are then fed into the two branches of a Siamese network for feature extraction. Next, the asymmetric predictor transforms the features, and the network is optimized by maximizing the similarity between the predicted and target features. Finally, the parameters of the feature-extraction network are frozen and a linear classifier is trained on its output to complete the classification. When 20% of the labeled samples are used for fine-tuning on four public remote sensing image datasets, NWPU-Resisc45, EuroSAT, UC Merced, and Siri-WHU, the classification accuracies are 77.57%, 87.70%, 60.52%, and 65.83%, respectively. The proposed method effectively extracts high-level semantic features of remote sensing images without using data labels, and it outperforms both the ImageNet pre-trained model and the recent contrastive learning method SimSiam when labeled samples are scarce.
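    The pipeline summarized above follows a SimSiam-style design: two augmented views, a shared-weight Siamese encoder, an asymmetric predictor with a stop-gradient on the target branch, and a frozen-encoder linear classifier. The following is a minimal, illustrative PyTorch sketch, not the authors' implementation; the ResNet-50 backbone, augmentation parameters, predictor dimensions, and helper names (SiameseNet, similarity_loss, build_linear_classifier) are assumptions made for illustration.

    ```python
    import torch.nn as nn
    import torch.nn.functional as F
    from torchvision import transforms, models

    # Step 1: two random augmented views of the same image
    # (horizontal flip, color jitter, grayscale, as listed in the abstract).
    augment = transforms.Compose([
        transforms.RandomResizedCrop(224),
        transforms.RandomHorizontalFlip(),
        transforms.RandomApply([transforms.ColorJitter(0.4, 0.4, 0.4, 0.1)], p=0.8),
        transforms.RandomGrayscale(p=0.2),
        transforms.ToTensor(),
    ])

    class SiameseNet(nn.Module):
        """Shared-weight encoder for both branches plus an asymmetric predictor head."""
        def __init__(self, feat_dim=2048, pred_dim=512):
            super().__init__()
            backbone = models.resnet50()
            backbone.fc = nn.Identity()              # keep the 2048-d pooled feature
            self.encoder = backbone
            self.predictor = nn.Sequential(          # asymmetric predictor (bottleneck MLP)
                nn.Linear(feat_dim, pred_dim),
                nn.BatchNorm1d(pred_dim),
                nn.ReLU(inplace=True),
                nn.Linear(pred_dim, feat_dim),
            )

        def forward(self, x1, x2):
            # Step 2: both views pass through the same encoder (Siamese branches).
            z1, z2 = self.encoder(x1), self.encoder(x2)
            # Step 3: the predictor transforms each branch; the opposite branch is
            # treated as a fixed target (stop-gradient), which breaks the symmetry.
            p1, p2 = self.predictor(z1), self.predictor(z2)
            return p1, p2, z1.detach(), z2.detach()

    def similarity_loss(p1, p2, z1, z2):
        """Negative cosine similarity, so minimizing it maximizes the similarity
        between predictions and the detached target features."""
        return -(F.cosine_similarity(p1, z2, dim=-1).mean()
                 + F.cosine_similarity(p2, z1, dim=-1).mean()) / 2

    # Step 4: linear evaluation, i.e. freeze the pretrained encoder and train
    # only a linear classifier on top of its features.
    def build_linear_classifier(encoder, num_classes, feat_dim=2048):
        for p in encoder.parameters():
            p.requires_grad = False
        return nn.Linear(feat_dim, num_classes)
    ```

    In a training step, each image in the batch would be augmented twice, passed through SiameseNet, and optimized with similarity_loss; the stop-gradient on the target branch together with the asymmetric predictor is what prevents the two branches from collapsing to a constant representation, so no negative pairs or large batches are required.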