Distance Weighted Loss for Forest Trail Detection Using Semantic Line

Author(s):  
Shyam Prasad Adhikari ◽  
Hyongsuk Kim
2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Zhan-Ning Liu ◽  
Xiao-Yan Yu ◽  
Li-Feng Jia ◽  
Yuan-Sheng Wang ◽  
Yu-Chen Song ◽  
...  

Abstract: In order to study the influence of the distance weight on ore-grade estimation, the inverse distance weighted (IDW) method is used to estimate the Ni and MgO grades of serpentinite ore based on a three-dimensional ore body model and related block models. The Manhattan distance, Euclidean distance, Chebyshev distance, and multiple forms of the Minkowski distance are used to calculate the distance weight of IDW. The results show that using the Minkowski distance for the distance-weight calculation is feasible, and the relationship between the estimated results and the distance weight is described. The study expands the range of distance-weight calculation methods available in IDW and offers a new way to improve estimation accuracy; researchers can choose different weight calculation methods according to their needs. In this study, the estimation is best when the power of the Minkowski distance is 3 for a 10 m × 10 m × 10 m block model, and 9 for a 20 m × 20 m × 20 m block model.
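A minimal sketch of the idea, assuming hypothetical sample coordinates and Ni grades (the function names, block geometry, and values below are illustrative, not the study's data): a Minkowski distance of order p replaces the Euclidean distance inside the IDW weight calculation.

```python
import numpy as np

def minkowski_distance(a, b, p):
    """Minkowski distance of order p (p=1 Manhattan, p=2 Euclidean, large p approaches Chebyshev)."""
    return np.sum(np.abs(a - b) ** p, axis=-1) ** (1.0 / p)

def idw_estimate(block_center, sample_xyz, sample_grades, p=3, power=2, eps=1e-12):
    """Inverse-distance-weighted grade estimate for one block, using a Minkowski distance of order p."""
    d = minkowski_distance(sample_xyz, block_center, p)
    w = 1.0 / (d ** power + eps)              # inverse distance weights
    return np.sum(w * sample_grades) / np.sum(w)

# Hypothetical drill-hole samples (x, y, z in metres) and Ni grades (%)
samples = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 5.0], [0.0, 10.0, 5.0], [10.0, 10.0, 10.0]])
grades = np.array([0.31, 0.28, 0.35, 0.30])

block = np.array([5.0, 5.0, 5.0])             # centre of a 10 m x 10 m x 10 m block
print(idw_estimate(block, samples, grades, p=3))
```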


2017 ◽  
Vol 51 (2) ◽  
pp. 85-96 ◽  
Author(s):  
Gang Xiong ◽  
Jiming Lan ◽  
Haiyan Zhang ◽  
Tian-Huai Ding

2016 ◽  
Vol 55 (2) ◽  
pp. 283-296 ◽  
Author(s):  
Yongxin Deng ◽  
Brendan Wallace ◽  
Derek Maassen ◽  
Johnathan Werner

Abstract: A geographical information system (GIS) perspective is taken to examine conceptual and methodological complications present in tornado density and probability mapping. Tornado density is defined as the inverse-distance-weighted count of tornado touchdown points or tornado-affected cells within a neighborhood area. The paper first adds a few geographic elements into the tornado definition and then characterizes tornado density as a density field in GIS that depends on predefined, modifiable areas to exist. Tornado density is therefore conceptually distinguished from both individual tornadoes and tornado probability. Three factors are identified to be vital in tornado density mapping: the neighborhood size, the distance decay function, and the choice of tornado properties. Correspondingly, 12 neighborhood sizes ranging from 20 to 360 km are tested, four distance decay functions are compared, and two tornado properties—tornado touchdown locations and pathlengths—are separately incorporated in mapping. GIS interpretations, clarifications, and demonstrations are provided for these factors to reach a thorough understanding of how the factors function and affect the resultant tornado density maps. Historical tornado data of the eastern half of the United States from 1973 to 2013 are used in these demonstrations. Uncertainty and propagation analyses are recommended for future tornado density and probability mapping, and a Monte Carlo simulation using tornado pathlength data is conducted as an example of uncertainty modeling. In all, tornado density mapping is diagnosed as a largely subjective activity, and the mapper needs to make multiple choices according to the mapping purpose, scale, and the involved tornado record data.
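As an illustration of the distance-weighted count described above, the sketch below computes a density value at a single grid cell from hypothetical touchdown coordinates, with one assumed neighborhood radius and an assumed inverse-distance decay function; the paper itself tests 12 neighborhood sizes and four decay functions.

```python
import numpy as np

def tornado_density(cell_xy, touchdowns_xy, neighborhood_km=180.0,
                    decay=lambda d: 1.0 / (1.0 + d)):
    """Distance-weighted count of touchdown points inside a circular neighborhood of the cell."""
    d = np.hypot(*(touchdowns_xy - cell_xy).T)      # Euclidean distances in km
    inside = d <= neighborhood_km
    return np.sum(decay(d[inside]))

# Hypothetical touchdown coordinates (km, in a projected plane) around a cell at the origin
touchdowns = np.array([[12.0, -30.0], [95.0, 40.0], [160.0, 5.0], [400.0, 0.0]])
cell = np.array([0.0, 0.0])
print(tornado_density(cell, touchdowns))            # the last point falls outside the 180 km radius
```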


2020 ◽  
Vol 2020 ◽  
pp. 1-22 ◽
Author(s):  
Byung-Kwon Son ◽  
Do-Jin An ◽  
Joon-Ho Lee

In this paper, the passive localization of an emitter from noisy angle-of-arrival (AOA) measurements, using the Brown DWLS (Distance Weighted Least Squares) algorithm, is considered. The accuracy of AOA-based localization is quantified by the mean-squared error. Various estimates of the AOA-localization algorithm have been derived previously (Doğançay and Hmam, 2008). The explicit expression for the location estimate from that study is used to obtain an analytic expression for the mean-squared error (MSE) of one of those estimates. To validate the derived expression, the MSE from a Monte Carlo simulation is compared with the analytically derived MSE.
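The sketch below illustrates only the Monte Carlo side of such a validation, using a generic pseudolinear least-squares AOA estimator with hypothetical sensor and emitter positions; it is not the Brown DWLS estimator or the analytic MSE expression derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

sensors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])   # hypothetical sensor positions (m)
emitter = np.array([60.0, 40.0])                                # hypothetical true emitter position (m)
sigma = np.deg2rad(1.0)                                         # bearing noise standard deviation (rad)

def aoa_ls_estimate(bearings):
    """Pseudolinear least-squares position fix from noisy bearings (a generic estimator)."""
    A = np.column_stack([np.sin(bearings), -np.cos(bearings)])
    b = np.sin(bearings) * sensors[:, 0] - np.cos(bearings) * sensors[:, 1]
    return np.linalg.lstsq(A, b, rcond=None)[0]

true_bearings = np.arctan2(emitter[1] - sensors[:, 1], emitter[0] - sensors[:, 0])
errors = []
for _ in range(10_000):                                         # Monte Carlo runs
    est = aoa_ls_estimate(true_bearings + sigma * rng.standard_normal(3))
    errors.append(np.sum((est - emitter) ** 2))
print("Monte Carlo MSE:", np.mean(errors))
```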


2017 ◽  
Vol 23 (3) ◽  
pp. 493-508 ◽  
Author(s):  
Italo Oliveira Ferreira ◽  
Dalto Domingos Rodrigues ◽  
Gérson Rodrigues dos Santos ◽  
Lidiane Maria Ferraz Rosa

Abstract: The representation of submerged relief is very important in many areas of knowledge, such as projects to build or reassess ports, the installation of moles, ducts, marinas, bridges and tunnels, mineral prospecting, waterways, dredging, and the control of silting in rivers and lakes, among others. The depths of water bodies, indispensable for such representations, are obtained through bathymetric surveys. However, the result of a bathymetric survey is a set of sampled points that, by itself, cannot directly generate the Digital Model of Depth (DMD); interpolators are required. Currently, there are more than 40 scientific interpolation methods available, each with its own particularities and characteristics. This study aims to compare the efficiency of Universal Kriging (UK) and of the Inverse Distance Weighted (IDW) method in the computational representation of bathymetric surfaces while progressively decreasing the number of sample points. The results show the superiority of the Universal Kriging interpolator in creating DMDs from bathymetric survey data.
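The IDW side of such a comparison can be sketched in a few lines (a Universal Kriging surface would normally be fitted with a dedicated geostatistics library); the soundings and coordinates below are hypothetical placeholders, not survey data.

```python
import numpy as np

def idw_depth(query_xy, sample_xy, sample_depths, power=2.0, eps=1e-12):
    """Inverse-distance-weighted depth at a query point from scattered bathymetric soundings."""
    d = np.hypot(*(sample_xy - query_xy).T)        # horizontal distances to each sounding
    w = 1.0 / (d ** power + eps)                   # inverse distance weights
    return np.sum(w * sample_depths) / np.sum(w)

# Hypothetical soundings: planimetric coordinates (m) and depths (m)
xy = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
depths = np.array([5.2, 6.1, 4.8, 5.9])
print(idw_depth(np.array([25.0, 25.0]), xy, depths))   # interpolated depth at a grid node
```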


10.29007/5gzr ◽  
2018 ◽  
Author(s):  
Cezary Kaliszyk ◽  
Josef Urban

Two complementary AI methods are used to improve the strength of the AI/ATP service for proving conjectures over the HOL Light and Flyspeck corpora. First, several schemes for frequency-based feature weighting are explored in combination with a distance-weighted k-nearest-neighbor classifier. This results in a 16% improvement (39.0% to 45.5% of Flyspeck problems solved) of the overall strength of the service when using 14 CPUs and 30 seconds. The best premise-selection/ATP combination is improved from 24.2% to 31.4%, i.e., by 30%. A smaller improvement is obtained by evolving targeted E prover strategies on two particular premise selections, using the Blind Strategymaker (BliStr) system. This raises the performance of the best AI/ATP method from 31.4% to 34.9%, i.e., by 11%, and raises the current 14-CPU power of the service to 46.9%.
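A rough sketch of combining frequency-based feature weighting with a distance-weighted k-nearest-neighbor ranking, using random binary feature vectors as stand-ins for the HOL Light fact characterizations; this illustrates the general idea only, not the service's actual implementation.

```python
import numpy as np

def idf_weights(feature_matrix):
    """IDF-style frequency weighting: rare features count more (one of several possible schemes)."""
    df = feature_matrix.sum(axis=0)                      # how many facts contain each feature
    return np.log(feature_matrix.shape[0] / (1.0 + df))

def knn_premise_scores(query, feature_matrix, k=40):
    """Distance-weighted k-NN: closer library facts contribute more to the relevance ranking."""
    w = idf_weights(feature_matrix)
    d = np.linalg.norm((feature_matrix - query) * w, axis=1)
    nearest = np.argsort(d)[:k]
    return nearest, 1.0 / (1.0 + d[nearest])             # premise indices and their weights

# Hypothetical binary feature vectors (rows: library facts, columns: symbol/term features)
facts = np.random.default_rng(1).integers(0, 2, size=(200, 50)).astype(float)
conjecture = facts[0]                                    # pretend the conjecture shares fact 0's features
idx, scores = knn_premise_scores(conjecture, facts)
print(idx[:5], scores[:5])
```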


2019 ◽  
Vol 8 (4) ◽  
pp. 418-427 ◽
Author(s):  
Eko Siswanto ◽  
Hasbi Yasin ◽  
Sudarno Sudarno

In many applications, several time series are recorded simultaneously at a number of locations. Time series from nearby locations are often related in both space and time; such data are called spatial time series data. The Generalized Space Time Autoregressive (GSTAR) model is one of the space-time models used for modeling and forecasting spatial time series data. This study applies the GSTAR model to rainfall volumes at four locations: Jepara Regency, Kudus Regency, Pati Regency, and Grobogan Regency. Based on the smallest mean RMSE of the forecasting results, the best model chosen by this study is GSTAR(11)-I(1)12 with inverse distance weighting. Under this model, the spatial relationships show that rainfall in Pati Regency is influenced by rainfall in the other regencies.
Keywords: GSTAR, RMSE, Rainfall
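A brief sketch of how the inverse distance weights used by such a GSTAR model can be assembled into a row-normalized spatial weight matrix; the pairwise distances below are hypothetical placeholders, not the study's actual inter-regency distances.

```python
import numpy as np

# Hypothetical pairwise distances (km) between Jepara, Kudus, Pati, and Grobogan regencies
D = np.array([[0.0, 25.0, 55.0, 80.0],
              [25.0, 0.0, 30.0, 55.0],
              [55.0, 30.0, 0.0, 60.0],
              [80.0, 55.0, 60.0, 0.0]])

# Inverse-distance spatial weights: zero on the diagonal, then row-normalized so each row sums to 1
W = np.where(D > 0, 1.0 / np.where(D > 0, D, 1.0), 0.0)
W = W / W.sum(axis=1, keepdims=True)
print(np.round(W, 3))
```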

