Multi-task contrastive learning for change detection in remote sensing images.

  • Author(s): Liu, Yingying (AUTHOR); Zhou, Gang (AUTHOR)
  • Source:
    Remote Sensing Letters. Jun2024, Vol. 15 Issue 6, p580-590. 11p.
  • Additional Information
    • Abstract:
      This letter proposes a novel multi-task contrastive learning (MTCL) approach for change detection in high-resolution remote sensing images. Current self-supervised learning approaches make limited use of the multiview information in remote sensing images and cannot directly train the change detection backbone network for feature learning. The proposed method exploits the multiview information contained in remote sensing images to train the change detection backbone network directly. Specifically, multiple views are constructed from high-resolution remote sensing images by a handcrafted feature extraction approach, so that the views contain complementary information from different domains. Each constructed view is then used for contrastive learning, and a multi-task learning strategy with parameter sharing is applied to learn richer and more robust representations. Furthermore, instead of learning feature representations from a single remote sensing image, the proposed approach directly uses image pairs to train the change detection backbone, further improving the performance of the change detection network. Two benchmark datasets are employed for comparative experiments, and the experimental results confirm the effectiveness and superiority of the proposed approach. [ABSTRACT FROM AUTHOR] (An illustrative code sketch of this training scheme is given after this record.)
    • Copyright:
      Copyright of Remote Sensing Letters is the property of Taylor & Francis Ltd and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
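As a rough illustration of the training scheme described in the abstract above, the following is a minimal PyTorch sketch written under several assumptions that are not taken from the letter: the backbone architecture, the InfoNCE loss, the choice of two views, and the names `SharedBackbone`, `info_nce`, and `multi_task_contrastive_loss` are all hypothetical placeholders. It only shows how a shared backbone could be trained with one contrastive task per constructed view of a bi-temporal image pair, not the authors' actual MTCL implementation.

```python
# Hypothetical PyTorch sketch (not the authors' code): a shared Siamese backbone
# is pre-trained with one InfoNCE contrastive loss per handcrafted view of a
# bi-temporal image pair; the per-view losses share a single set of backbone
# weights, i.e. parameter-sharing multi-task learning.
import torch
import torch.nn as nn
import torch.nn.functional as F


def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss; row i of z1 and row i of z2 form the positive pair."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                 # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)


class SharedBackbone(nn.Module):
    """Toy stand-in for the change-detection backbone shared across all tasks."""
    def __init__(self, in_channels=3, feat_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.proj = nn.Linear(64, feat_dim)             # projection head

    def forward(self, x):
        return self.proj(self.encoder(x))


def multi_task_contrastive_loss(backbone, view_pairs):
    """Each (t1, t2) view of the image pair defines one contrastive task;
    averaging the task losses realizes the parameter-sharing multi-task setup."""
    losses = [info_nce(backbone(t1), backbone(t2)) for t1, t2 in view_pairs]
    return torch.stack(losses).mean()


if __name__ == "__main__":
    backbone = SharedBackbone()
    opt = torch.optim.Adam(backbone.parameters(), lr=1e-3)
    # Two made-up views of the same bi-temporal patch batch, e.g. a spectral
    # view and a handcrafted texture/edge view (random tensors for illustration).
    b = 8
    views = [(torch.randn(b, 3, 64, 64), torch.randn(b, 3, 64, 64)),
             (torch.randn(b, 3, 64, 64), torch.randn(b, 3, 64, 64))]
    loss = multi_task_contrastive_loss(backbone, views)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"multi-task contrastive loss: {loss.item():.4f}")
```

Averaging the per-view losses is only one simple way to combine the contrastive tasks; the letter may weight or aggregate them differently, and the handcrafted views themselves would be derived from the imagery rather than generated randomly.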