University of Twente Student Theses

Deep learning based multi-source data fusion to map deforested areas in Amazon rain forest

Chhibber, A. (2022) Deep learning based multi-source data fusion to map deforested areas in Amazon rain forest.

Abstract: Accurate deforestation mapping can provide useful information for efficient forest management. Frequent cloud cover often hampers deforestation mapping in tropical forests when only optical imagery is used. Since optical remote sensing is ineffective under cloudy conditions, a possible alternative is all-day, all-weather Synthetic Aperture Radar (SAR). This study aims to overcome this limitation of optical imagery by fusing optical (Sentinel-2) and SAR (Sentinel-1) images. With that, we aim to improve deforestation detection through Deep Learning (DL) based late fusion, using an area in Pará State, Brazil, as a test site. We compared the accuracies of deforestation maps generated for the year 2020 from standalone optical and SAR images with maps predicted using late fusion, which takes both Sentinel-1 and Sentinel-2 sensor data as input. The results show that deforestation mapping using the combination of optical and SAR sensor data improved the overall classification accuracy, which was also verified using McNemar's statistical significance test. For the cloud-free image, Sentinel-1/Sentinel-2 based late fusion provided overall accuracies of 0.97, 0.94, and 0.91 on the full image, test set-1, and test set-2 respectively, while for the cloudy image it provided overall classification accuracies of 0.95, 0.91, and 0.88 respectively. In the cloud-free case, the overall accuracy using Sentinel-1/Sentinel-2 based late fusion was 3%, 3%, and 3% higher for the full image, test set-1, and test set-2 respectively than with the Sentinel-1 image, and 2% and 1% higher for the full image and test set-2 than with the Sentinel-2 image. Under cloudy conditions, the overall accuracy of late fusion using both Sentinel-1 and Sentinel-2 images was 1%, 2%, and 1% higher for the full image, test set-1, and test set-2 respectively than with the Sentinel-1 image, and 10%, 2%, and 10% higher respectively than with the cloudy Sentinel-2 image. The presented late-fusion approach showed the advantage of fusing Sentinel-1 and Sentinel-2 sensor data for deforestation mapping compared to either data source alone. The results also show significant benefits of fusing Sentinel-1 and Sentinel-2 images even under cloudy conditions, where 22-48% of the study area was covered with clouds in the Sentinel-2 data.
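To illustrate the two building blocks named in the abstract (decision-level "late" fusion of per-sensor class probabilities and McNemar's significance test on the resulting maps), a minimal Python sketch follows. The array names, the equal 50/50 weighting, the two-class setup, and the toy random data are assumptions made for illustration only; they do not reproduce the network architecture or datasets used in the thesis.

# Sketch: late fusion of Sentinel-1 and Sentinel-2 class probabilities,
# plus McNemar's test between two classification maps (illustrative assumptions only).
import numpy as np
from scipy.stats import chi2

def late_fusion(prob_s1, prob_s2, w_s1=0.5, w_s2=0.5):
    # Weighted average of per-pixel softmax outputs from the SAR and optical
    # branches, then arg-max over the class axis to get the fused class map.
    fused = w_s1 * prob_s1 + w_s2 * prob_s2   # shape (H, W, n_classes)
    return fused.argmax(axis=-1)              # shape (H, W)

def mcnemar(pred_a, pred_b, reference):
    # McNemar's test with continuity correction on two maps evaluated
    # against the same reference labels.
    correct_a = pred_a == reference
    correct_b = pred_b == reference
    b = np.sum(correct_a & ~correct_b)        # A correct, B wrong
    c = np.sum(~correct_a & correct_b)        # A wrong, B correct
    stat = (abs(b - c) - 1) ** 2 / (b + c)
    p_value = chi2.sf(stat, df=1)
    return stat, p_value

# Toy example: random probability maps for a 2-class (forest / deforested) problem.
rng = np.random.default_rng(0)
prob_s1 = rng.dirichlet([1, 1], size=(64, 64))   # hypothetical SAR-branch output
prob_s2 = rng.dirichlet([1, 1], size=(64, 64))   # hypothetical optical-branch output
reference = rng.integers(0, 2, size=(64, 64))

fused_map = late_fusion(prob_s1, prob_s2)
s1_map = prob_s1.argmax(axis=-1)
stat, p = mcnemar(fused_map, s1_map, reference)
print(f"McNemar chi-square = {stat:.2f}, p = {p:.3f}")

In this reading, a low p-value would indicate that the fused map and the single-sensor map differ significantly in their per-pixel error patterns, which is the kind of evidence the abstract refers to when citing McNemar's test.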
Item Type: Essay (Master)
Faculty: ITC: Faculty of Geo-information Science and Earth Observation
Subject: 74 (human) geography, cartography, town and country planning, demography
Programme: Geoinformation Science and Earth Observation MSc (75014)
Link to this item: https://purl.utwente.nl/essays/91435