Deep-learning-based burned area mapping using the synergy of Sentinel-1&2 data

2021 ◽ Vol 264 ◽ pp. 112575 ◽ Author(s): Qi Zhang, Linlin Ge, Ruiheng Zhang, Graciela Isabel Metternicht, Zheyuan Du, ...
2021 ◽ Vol 13 (8) ◽ pp. 1509 ◽ Author(s): Xikun Hu, Yifang Ban, Andrea Nascetti

Accurate burned area information is needed to assess the impacts of wildfires on people, communities, and natural ecosystems. Various burned area detection methods have been developed using satellite remote sensing measurements, which offer wide coverage and frequent revisits. Our study aims to demonstrate the capability of deep learning (DL) models to automatically map burned areas from uni-temporal multispectral imagery. Specifically, several semantic segmentation network architectures, i.e., U-Net, HRNet, Fast-SCNN, and DeepLabv3+, and machine learning (ML) algorithms were applied to Sentinel-2 and Landsat-8 imagery over three wildfire sites in two different local climate zones. The validation results show that the DL algorithms outperform the ML methods in two of the three cases, those with compact burn scars, while the ML methods seem more suitable for mapping dispersed burns in boreal forests. Using Sentinel-2 images, U-Net and HRNet exhibit nearly identical performance, with higher kappa (around 0.9), at a heterogeneous Mediterranean fire site in Greece; Fast-SCNN performs better than the others, with kappa over 0.79, at a compact boreal forest fire of varying burn severity in Sweden. Furthermore, when the trained models are transferred directly to corresponding Landsat-8 data, HRNet dominates among the DL models across the three test sites and preserves high accuracy. The results demonstrate that DL models can make full use of contextual information and capture spatial details at multiple scales from fire-sensitive spectral bands to map burned areas. Using only a post-fire image, the DL methods not only provide an automatic, accurate, and bias-free large-scale mapping option with cross-sensor applicability, but also have the potential to be used for onboard processing on future Earth observation satellites.
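
The kappa values quoted above are Cohen's kappa, computed from the binary burned/unburned confusion matrix of a predicted map against a reference map. A minimal sketch (not code from the study) of that computation:

```python
import numpy as np

def cohens_kappa(pred: np.ndarray, truth: np.ndarray) -> float:
    """Cohen's kappa for binary burned (1) / unburned (0) masks."""
    pred, truth = pred.ravel().astype(int), truth.ravel().astype(int)
    n = pred.size
    # 2x2 confusion matrix: rows = reference, columns = prediction
    cm = np.zeros((2, 2), dtype=np.int64)
    np.add.at(cm, (truth, pred), 1)
    p_o = np.trace(cm) / n                      # observed agreement
    p_e = (cm.sum(0) * cm.sum(1)).sum() / n**2  # chance agreement from marginals
    return (p_o - p_e) / (1 - p_e)

# Toy 3x3 masks; a real evaluation would use full validation scenes.
truth = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])
pred  = np.array([[1, 1, 0], [0, 0, 0], [0, 0, 1]])
print(cohens_kappa(pred, truth))  # 0.5
```

Unlike overall accuracy, kappa discounts the agreement expected by chance from the class marginals, which matters here because unburned pixels usually dominate a scene.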


2021 ◽ Vol 260 ◽ pp. 112468 ◽ Author(s): Miguel A. Belenguer-Plomer, Mihai A. Tanase, Emilio Chuvieco, Francesca Bovolo

2020 ◽ Vol 236 ◽ pp. 111493 ◽ Author(s): Joshua Lizundia-Loiola, Gonzalo Otón, Rubén Ramo, Emilio Chuvieco

2009 ◽ Vol 113 (2) ◽ pp. 408-420 ◽ Author(s): Louis Giglio, Tatiana Loboda, David P. Roy, Brad Quayle, Christopher O. Justice

Author(s): Rubén Ramo, Mariano García, Daniel Rodríguez, Emilio Chuvieco

PLoS ONE ◽ 2020 ◽ Vol 15 (5) ◽ pp. e0232962 ◽ Author(s): Fiona Ngadze, Kudzai Shaun Mpakairi, Blessing Kavhu, Henry Ndaimani, Monalisa Shingirayi Maremba

2020 ◽ Vol 12 (15) ◽ pp. 2422 ◽ Author(s): Lisa Knopp, Marc Wieland, Michaela Rättich, Sandro Martinis

Wildfires have major ecological, social and economic consequences. Information about the extent of burned areas is essential to assess these consequences and can be derived from remote sensing data. Over recent years, several methods have been developed to segment burned areas with satellite imagery. However, these methods mostly require extensive preprocessing, while deep learning techniques, which have successfully been applied to other segmentation tasks, have yet to be fully explored. In this work, we combine sensor-specific and methodological developments from the past few years and propose an automatic, deep-learning-based processing chain for burned area segmentation using mono-temporal Sentinel-2 imagery. In particular, we created a new training and validation dataset, which is used to train a convolutional neural network based on a U-Net architecture. We performed several tests on the input data and reached optimal network performance using the spectral bands of the visible, near-infrared and shortwave-infrared domains. The final segmentation model achieved an overall accuracy of 0.98 and a kappa coefficient of 0.94.
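
To make the described setup concrete, below is a minimal U-Net-style sketch that maps a mono-temporal Sentinel-2 tile to per-pixel burned-area logits. It is an illustrative toy, not the authors' network: the six-band input (e.g., B2, B3, B4, B8, B11, B12) merely follows the abstract's visible/near-infrared/shortwave-infrared finding, and the depth and channel widths are assumptions.

```python
import torch
import torch.nn as nn

IN_BANDS = 6  # assumed stack: B2, B3, B4 (visible), B8 (NIR), B11, B12 (SWIR)

def conv_block(cin, cout):
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=1), nn.BatchNorm2d(cout), nn.ReLU(inplace=True),
        nn.Conv2d(cout, cout, 3, padding=1), nn.BatchNorm2d(cout), nn.ReLU(inplace=True),
    )

class MiniUNet(nn.Module):
    """Two-level U-Net: encoder, bottleneck, decoder with one skip connection."""
    def __init__(self, bands=IN_BANDS):
        super().__init__()
        self.enc1 = conv_block(bands, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, 1)  # one channel: burned-probability logit

    def forward(self, x):
        e1 = self.enc1(x)                           # full-resolution features
        e2 = self.enc2(self.pool(e1))               # half-resolution features
        d1 = self.up(e2)                            # upsample back to full resolution
        d1 = self.dec1(torch.cat([d1, e1], dim=1))  # skip connection restores detail
        return self.head(d1)

# One 256x256 tile with 6 bands -> per-pixel logits -> binary burned mask
tile = torch.randn(1, IN_BANDS, 256, 256)
logits = MiniUNet()(tile)
mask = torch.sigmoid(logits) > 0.5
print(logits.shape, mask.shape)
```

The skip connection is the design choice that matters for this task: it lets the decoder recover the fine burn-scar boundaries that pooling in the encoder would otherwise blur away.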

