Building Damage Detection Using U-Net with Attention Mechanism from Pre- and Post-Disaster Remote Sensing Datasets

2021 ◽  
Vol 13 (5) ◽  
pp. 905
Author(s):  
Chuyi Wu ◽  
Feng Zhang ◽  
Junshi Xia ◽  
Yichen Xu ◽  
Guoqing Li ◽  
...  

The building damage status is vital for planning rescue and reconstruction after a disaster, yet it is hard to detect and to judge its level. Most existing studies focus on binary classification, and the model's attention is easily distracted. In this study, we propose a Siamese neural network that localizes and classifies damaged buildings in a single pass. The main parts of this network are a variety of attention U-Nets using different backbones. The attention mechanism enables the network to focus on effective features and channels, reducing the impact of useless ones. We train the networks on the xBD dataset, a large-scale dataset for the advancement of building damage assessment, and compare their balanced F-score (F1) results. SEResNeXt with an attention mechanism performs best, reaching an F1 score of 0.787. To improve accuracy further, we fused the individual results and obtained a best overall F1 score of 0.792. To verify the transferability and robustness of the model, we selected imagery of two recent disasters from the Maxar Open Data Program and investigated the performance. Visual comparison shows that our model is robust and transferable.
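The channel-attention idea behind these networks can be illustrated with a minimal squeeze-and-excitation-style block (pure Python with hypothetical weights; the paper's attention U-Nets are full convolutional models, so this only sketches the squeeze/excite/rescale pattern):

```python
import math

def channel_attention(feature_maps, w1, w2):
    """Squeeze-and-excitation-style channel attention (illustrative).
    feature_maps: list of C channels, each a 2-D list (H x W).
    w1: reduction weights (C_red x C); w2: expansion weights (C x C_red).
    Returns the feature maps rescaled by learned channel weights."""
    # Squeeze: global average pooling per channel.
    squeezed = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
                for ch in feature_maps]
    # Excitation: two small fully connected layers (ReLU, then sigmoid).
    hidden = [max(0.0, sum(w * s for w, s in zip(row, squeezed))) for row in w1]
    scores = [1.0 / (1.0 + math.exp(-sum(w * h for w, h in zip(row, hidden))))
              for row in w2]
    # Rescale: weight each channel by its attention score.
    return [[[v * s for v in row] for row in ch]
            for ch, s in zip(feature_maps, scores)]
```

Channels that the excitation layers score highly are passed through nearly unchanged, while low-scoring (less useful) channels are suppressed.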

2021 ◽  
pp. 102831532110527
Author(s):  
Davina Potts ◽  
Jeongeun Kim

While participation in learning abroad has increased rapidly over the last decade, short-term programs have played an important role in boosting participation and widening access. The current study takes advantage of this new pattern of participation to examine self-reported career outcomes and employability development benefits by program duration and number of programs undertaken. Using a large-scale dataset of graduates of Australian universities, the study challenges the conventional wisdom that a longer experience is better and explores the impact of participation in multiple short-term programs as a new intervention in graduate career outcomes. Although the study is based on the Australian higher education context, the results may inform educators and policy-makers from countries with comparable learning abroad programs in considering how short-term programs can be used more purposefully to foster positive career and employability outcomes.


Author(s):  
S. Boeke ◽  
M. J. C. van den Homberg ◽  
A. Teklesadik ◽  
J. L. D. Fabila ◽  
D. Riquet ◽  
...  

Abstract. Reliable predictions of the impact of natural hazards turning into disasters are important both for targeting humanitarian response and for triggering early action. Open data and machine learning can be used to predict loss and damage to the houses and livelihoods of affected people. This research focuses on agricultural loss, more specifically rice loss in the Philippines due to typhoons. Regression and binary classification algorithms are trained using feature selection methods to find the most important explanatory features. Both geographical data for each province and typhoon-specific features of 11 historical typhoons are used as input. The percentage of lost rice area, with an average value of 7.1%, is the output. On the regression task, the support vector regressor performed best, with a mean absolute error of 6.83 percentage points. For the classification model, thresholds of 20%, 30% and 40% lost rice area were tested to find the best performing model; these thresholds represent different trigger levels for anticipatory action towards farmers. The binary classifiers were trained to maximize their ability to correctly predict the positive samples. In all three cases the support vector classifier performed best, with recall scores of 88%, 75% and 81.82%, respectively; however, the corresponding precision scores were low: 17.05%, 14.46% and 10.84%. For both the support vector regressor and classifier, only wind speed was selected as an explanatory feature out of the 14 available input features, while the other algorithms trained in this study selected other feature sets, depending also on the hyperparameter settings. This variation in selected features, as well as the imprecise predictions, is a consequence of the small dataset used in this study. It is therefore important to gather data for more typhoons, as well as data on other explanatory variables, in order to make more robust and accurate predictions. In addition, if loss data become available at municipality level rather than province level, the models will become more accurate and more valuable for operationalization.
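The recall-oriented evaluation the abstract reports can be reproduced with a small scoring helper (a generic sketch, not the authors' code; `y_pred` would come from the trained support vector classifier at a given loss threshold):

```python
def trigger_scores(y_true, y_pred):
    """Recall and precision for a binary anticipatory-action trigger.
    y_true / y_pred: lists of 0/1, where 1 means rice loss above threshold."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    recall = tp / (tp + fn) if tp + fn else 0.0       # share of real events caught
    precision = tp / (tp + fp) if tp + fp else 0.0    # share of triggers that were real
    return recall, precision
```

Optimizing for recall, as the study does, accepts many false triggers (low precision) in exchange for rarely missing a severe-loss event.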


Author(s):  
A. Calantropio ◽  
F. Chiabrando ◽  
M. Codastefano ◽  
E. Bourke

Abstract. During the last few years, technical and scientific advances in the Geomatics research field have led to the validation of new mapping and surveying strategies, without neglecting already consolidated practices. The use of remote sensing data for damage assessment in post-disaster scenarios has underlined, in several contexts and situations, the importance of applied Geomatics techniques for disaster management operations, and their reliability and suitability in environmental emergencies is now globally recognized. In this paper, the authors present their experiences in the framework of the 2016 earthquake in Central Italy and of the 2019 Cyclone Idai in Mozambique. Using image-based survey techniques (UAV photogrammetry) as the main acquisition method, a damage assessment analysis was carried out to map the damage that occurred in the village of Pescara del Tronto, using DEEP (Digital Engine for Emergency Photo-analysis), a deep learning tool for automatic building footprint segmentation and building damage classification that supports the rapid production of cartography for emergency response operations. The performed analyses are presented, and the strengths and weaknesses of the employed methods and techniques are outlined. In conclusion, based on the authors' experience, some operational suggestions and best practices are provided, and future research perspectives within the same research topic are introduced.


2018 ◽  
Vol 13 (7) ◽  
pp. 1257-1271
Author(s):  
Erick Mas ◽  
Daniel Felsenstein ◽  
Luis Moya ◽  
A. Yair Grinberger ◽  
Rubel Das ◽  
...  

The DIM2SEA research project aims to increase urban resilience to large-scale disasters. We are developing a prototype Dynamic Integrated Model for Disaster Management and Socioeconomic Analysis (DIM2SEA) that will give disaster officials, stakeholders, urban engineers and planners an analytic tool for mitigating some of the worst excesses of catastrophic events. This is achieved by harnessing state-of-the-art developments in damage assessment, spatial simulation modeling, and Geographic Information Systems (GIS). At the heart of DIM2SEA is an agent-based model combined with post-disaster damage assessment and socioeconomic impact models. The large amounts of simulated spatial and temporal data generated by the agent-based models are fused with the socioeconomic profiles of the target population to generate a multidimensional database of inherently “synthetic” big data. Progress is reported here in the following areas: (1) synthetic population generation from census tract data for agent profiling and spatial allocation, (2) scenarios of building damage due to earthquakes and tsunamis, (3) estimation of building debris scattering and road network disruption, (4) logistics of post-disaster relief distribution, (5) the labor market in post-disaster urban dynamics, and (6) household insurance behavior as a reflection of urban resilience.
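Synthetic population generation, the first step above, can be sketched as sampling agent attributes from per-tract distributions (an illustrative toy with hypothetical attribute names; real pipelines typically use techniques such as iterative proportional fitting against census marginals):

```python
import random

def synthesize_population(tract_counts, profiles, seed=0):
    """Generate a synthetic agent population from census tract summaries.
    tract_counts: {tract_id: number_of_agents}
    profiles: {tract_id: {attribute: [(value, probability), ...]}}
    Each agent is spatially allocated to its tract and given attributes
    drawn from that tract's distributions."""
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    agents = []
    for tract, n in tract_counts.items():
        for _ in range(n):
            agent = {"tract": tract}
            for attr, dist in profiles[tract].items():
                values, weights = zip(*dist)
                agent[attr] = rng.choices(values, weights=weights)[0]
            agents.append(agent)
    return agents
```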


2021 ◽  
Vol 13 (5) ◽  
pp. 114
Author(s):  
Stefan Helmstetter ◽  
Heiko Paulheim

The problem of automatic detection of fake news in social media, e.g., on Twitter, has recently drawn attention. Although, from a technical perspective, it can be regarded as a straightforward binary classification problem, the major challenge is collecting large enough training corpora, since manually annotating tweets as fake or non-fake news is an expensive and tedious endeavor, and recent approaches utilizing distributional semantics require large training corpora. In this paper, we introduce an alternative approach for creating a large-scale dataset for tweet classification with minimal user intervention. The approach relies on weak supervision and automatically collects a large-scale, but very noisy, training dataset comprising hundreds of thousands of tweets. As a weak supervision signal, we label tweets by their source, i.e., trustworthy or untrustworthy, and train a classifier on this dataset. We then use that classifier for a different classification target, i.e., the classification of fake and non-fake tweets. Although the labels are inaccurate with respect to the new target (not all tweets by an untrustworthy source are fake news, and vice versa), we show that despite this unclean, inaccurate dataset, the results are comparable to those achieved using a manually labeled set of tweets. Moreover, we show that combining the large-scale noisy dataset with a human-labeled one yields better results than either of the two alone.
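The weak-supervision signal can be sketched as source-based labeling (the source lists below are hypothetical placeholders; the paper's actual lists of trustworthy and untrustworthy sources are not reproduced here):

```python
TRUSTED = {"reuters.example", "wire-service.example"}      # hypothetical
UNTRUSTED = {"hoaxnews.example", "clickbait.example"}      # hypothetical

def weak_label(tweets):
    """Label tweets by the trustworthiness of their source.
    1 = noisy 'fake' label (untrustworthy source),
    0 = noisy 'genuine' label (trustworthy source).
    Tweets from unknown sources are skipped rather than guessed,
    which is what makes the resulting dataset large but noisy."""
    labelled = []
    for text, source in tweets:
        if source in UNTRUSTED:
            labelled.append((text, 1))
        elif source in TRUSTED:
            labelled.append((text, 0))
    return labelled
```

A classifier trained on these proxy labels is then applied to the true target task, distinguishing fake from non-fake tweets.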


SOCIOTECHNICA ◽  
2014 ◽  
Vol 11 (0) ◽  
pp. 12-21
Author(s):  
Makoto FUJIU ◽  
Miho OHARA ◽  
Kimiro MEGURO

2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Yong Shi ◽  
Wei Dai ◽  
Wen Long ◽  
Bo Li

The liquidity risk factor of a securities market plays an important role in the formulation of trading strategies: a more liquid stock market means that securities can be bought or sold more easily. As a sound indicator of market liquidity, the transaction duration is the focus of this study. We concentrate on estimating the probability density function p(Δt_{i+1} | G_i), where Δt_{i+1} is the duration of the (i+1)-th transaction and G_i is the historical information available when the (i+1)-th transaction occurs. In this paper, we propose a new ultrahigh-frequency (UHF) duration modelling framework that uses long short-term memory (LSTM) networks to extend the conditional mean equation of the classic autoregressive conditional duration (ACD) model while retaining its probabilistic inference ability. An attention mechanism is then leveraged to unveil the internal mechanism of the constructed model. To minimize the impact of manual parameter tuning, we adopt fixed hyperparameters during training. Experiments on a large-scale dataset demonstrate the superiority of the proposed hybrid models, and the added attention layer efficiently highlights the temporal positions in the input sequence that are most important for predicting the next duration.
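The conditional mean equation of the classic ACD(1,1) model that the framework extends is ψ_i = ω + α·Δt_{i−1} + β·ψ_{i−1}; a minimal sketch of this baseline recursion, with illustrative parameter values:

```python
def acd_conditional_means(durations, omega, alpha, beta):
    """Conditional expected durations psi_i under a classic ACD(1,1) model:
        psi_i = omega + alpha * dt_{i-1} + beta * psi_{i-1}.
    The paper replaces this linear recursion with an LSTM; this sketch
    shows only the baseline it extends."""
    # Initialize at the unconditional mean omega / (1 - alpha - beta),
    # which exists when alpha + beta < 1.
    psi = [omega / (1.0 - alpha - beta)]
    for dt in durations[:-1]:
        psi.append(omega + alpha * dt + beta * psi[-1])
    return psi
```

An LSTM-based conditional mean can condition on the whole history G_i nonlinearly, rather than on one lagged duration and one lagged mean.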


Author(s):  
X. Yuan ◽  
S. M. Azimi ◽  
C. Henry ◽  
V. Gstaiger ◽  
M. Codastefano ◽  
...  

Abstract. After a natural disaster or humanitarian crisis, rescue forces and relief organisations depend on fast, area-wide and accurate information on the damage caused to infrastructure and on the situation on the ground. This study assesses building damage levels on optical satellite imagery with a two-step ensemble model that performs building segmentation and damage classification, trained on a public dataset. We provide an extensive generalization study on pre- and post-disaster data from the passage of Cyclone Idai over Beira, Mozambique, in 2019 and the explosion in Beirut, Lebanon, in 2020. Critical challenges are addressed, including the detection of clustered buildings with uncommon visual appearances, the classification of damage levels by both humans and deep learning models, and the impact of varying imagery acquisition conditions. We show promising building damage assessment results and highlight the strong impact of data pre-processing on the generalization capability of deep convolutional models.
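The two-step structure can be sketched as segmentation followed by per-footprint classification (a schematic only; `segment` and `classify` stand in for the trained models described in the study):

```python
def assess_damage(pre_image, post_image, segment, classify):
    """Two-step building damage assessment (schematic).
    1) segment building footprints on the pre-disaster image,
    2) classify a damage level for each footprint from the pre/post pair.
    Returns {footprint_id: damage_level}."""
    footprints = segment(pre_image)
    return {fp: classify(pre_image, post_image, fp) for fp in footprints}
```

Decoupling the two steps lets the footprint detector rely on the cleaner pre-disaster image, while the classifier compares the pre- and post-disaster views of each building.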

