Kernel-Based Topographic Map Formation by Local Density Modeling

2002 · Vol 14 (7) · pp. 1561-1573 · Author(s): Marc M. Van Hulle

We introduce a new learning algorithm for kernel-based topographic map formation. The algorithm generates a gaussian mixture density model by individually adapting the gaussian kernels' centers and radii to the assumed gaussian local input densities.
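To make the adaptation scheme concrete, here is a minimal sketch (an illustrative winner-take-all rule of our own, not the paper's exact update equations): each gaussian kernel keeps a center and a radius, the winning kernel's center moves toward the current input, and its radius tracks the spread of the inputs that kernel wins, so each kernel ends up modeling the assumed gaussian local input density around it. The function name and the learning rate `eta` are assumptions for illustration.

```python
import numpy as np

def fit_topographic_kernels(X, n_kernels=10, eta=0.05, n_epochs=20, seed=0):
    """Toy sketch: individually adapt gaussian kernel centers and radii to the
    local input density (illustrative winner-take-all rule, not the paper's
    exact algorithm)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    centers = X[rng.choice(n, n_kernels, replace=False)].astype(float)
    radii = np.full(n_kernels, X.std())            # one radius per kernel

    for _ in range(n_epochs):
        for x in X[rng.permutation(n)]:
            i = np.argmin(np.linalg.norm(centers - x, axis=1))  # winning kernel
            centers[i] += eta * (x - centers[i])                # center -> local mean
            # the radius tracks the per-dimension RMS distance of the inputs
            # this kernel wins, i.e. the spread of the assumed local gaussian
            dist = np.linalg.norm(x - centers[i]) / np.sqrt(d)
            radii[i] += eta * (dist - radii[i])
    return centers, radii
```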

2005 · Vol 17 (8) · pp. 1706-1714 · Author(s): Marc M. Van Hulle

Instead of increasing the order of the Edgeworth expansion of a single gaussian kernel, we suggest using mixtures of Edgeworth-expanded gaussian kernels of moderate order. We introduce a simple closed-form solution for estimating the kernel parameters based on weighted moment matching. Furthermore, we formulate the extension to the multivariate case, which is not always feasible with algebraic density approximation procedures.
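For orientation, this is the standard univariate Edgeworth expansion about a gaussian kernel, truncated at moderate (fourth) order; the textbook form is shown, with standardized cumulants κ₃, κ₄ and probabilists' Hermite polynomials, which is not necessarily the exact parameterization used in the paper:

```latex
% Fourth-order Edgeworth expansion about a standard gaussian \varphi(x)
f(x) \;\approx\; \varphi(x)\left[\,1
  + \frac{\kappa_3}{3!}\,\mathrm{He}_3(x)
  + \frac{\kappa_4}{4!}\,\mathrm{He}_4(x)
  + \frac{\kappa_3^2}{72}\,\mathrm{He}_6(x)\right],
\qquad
\mathrm{He}_3(x) = x^3 - 3x,\quad
\mathrm{He}_4(x) = x^4 - 6x^2 + 3.
```

The paper's point is that a mixture of such kernels keeps each expansion at moderate order, with the mixture rather than higher-order terms supplying the extra modeling flexibility.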


2002 · Vol 14 (8) · pp. 1887-1906 · Author(s): Marc M. Van Hulle

A new learning algorithm for kernel-based topographic map formation is introduced. The kernel parameters are adjusted individually so as to maximize the joint entropy of the kernel outputs. This is done by maximizing the differential entropies of the individual kernel outputs, given that the map's output redundancy, due to the kernel overlap, needs to be minimized. The latter is achieved by minimizing the mutual information between the kernel outputs. As a kernel, the (radial) incomplete gamma distribution is taken since, for a gaussian input density, the differential entropy of the kernel output will be maximal. Since the theoretically optimal joint entropy performance can be derived for the case of nonoverlapping gaussian mixture densities, a new clustering algorithm is suggested that uses this optimum as its “null” distribution. Finally, it is shown that the learning algorithm is similar to one that performs stochastic gradient descent on the Kullback-Leibler divergence for a heteroskedastic gaussian mixture density model.
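A small numerical sketch of why this kernel choice yields maximal output entropy for a gaussian input (an illustration under our own assumptions, not code from the paper): for a d-dimensional gaussian input centered on the kernel, ||x − w||² / (2σ²) is Gamma(d/2, 1)-distributed, so passing it through the regularized incomplete gamma function is a probability-integral transform. The kernel output is then uniform on [0, 1], the entropy-maximizing distribution on that interval.

```python
import numpy as np
from scipy.special import gammainc  # regularized lower incomplete gamma P(a, x)

rng = np.random.default_rng(0)
d, sigma = 5, 1.3
w = np.zeros(d)                                  # kernel center

# gaussian input density centered on the kernel
x = rng.normal(loc=w, scale=sigma, size=(100_000, d))

# (radial) incomplete-gamma kernel output: ||x - w||^2 / (2 sigma^2) is
# Gamma(d/2, 1)-distributed here, so applying its CDF is a
# probability-integral transform
r2 = np.sum((x - w) ** 2, axis=1) / (2 * sigma ** 2)
y = gammainc(d / 2, r2)

# the output is ~uniform on [0, 1]: mean ~0.5, variance ~1/12 (max entropy)
print(y.mean(), y.var())
```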


1996 · Vol 5 (9) · pp. 1293-1302 · Author(s): Xinhua Zhuang, Yan Huang, K. Palaniappan, Yunxin Zhao

Entropy · 2021 · Vol 23 (5) · pp. 518 · Author(s): Osamu Komori, Shinto Eguchi

Clustering is a major unsupervised learning task, widely applied in data mining and statistical data analysis. Typical examples include k-means, fuzzy c-means, and Gaussian mixture models, which are categorized as hard, soft, and model-based clusterings, respectively. We propose a new clustering method, called Pareto clustering, based on the Kolmogorov–Nagumo average defined by the survival function of the Pareto distribution. The proposed method subsumes all of the aforementioned clusterings, as well as maximum-entropy clustering. We introduce a probabilistic framework for the method and discuss the underlying distribution that guarantees consistency. We derive a minorize-maximization algorithm to estimate the parameters in Pareto clustering, and we compare its performance with existing methods in simulation studies and benchmark dataset analyses to demonstrate its practical utility.
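To make the central construction concrete, here is a minimal sketch of a Kolmogorov–Nagumo (KN) average built from a Pareto-type survival function. The generator and its parameterization (`gamma`) are our own illustrative choices; the paper's exact formulation may differ. A KN average applies φ⁻¹ to the ordinary mean of φ(zᵢ); with a Pareto survival generator it behaves as a tunable soft minimum over, for example, squared distances to cluster centers.

```python
import numpy as np

def pareto_survival(z, gamma):
    """Survival function of a Pareto-type distribution:
    S(z) = (1 + gamma*z)^(-1/gamma), z >= 0, gamma > 0."""
    return (1.0 + gamma * z) ** (-1.0 / gamma)

def pareto_survival_inv(u, gamma):
    """Inverse of the survival function above."""
    return (u ** (-gamma) - 1.0) / gamma

def kn_average(z, gamma):
    """Kolmogorov-Nagumo average with the Pareto survival function as
    generator: phi^{-1}( mean( phi(z_i) ) ). As gamma -> 0 the generator
    tends to exp(-z) and this reduces to the log-sum-exp soft minimum."""
    return pareto_survival_inv(np.mean(pareto_survival(z, gamma)), gamma)

# toy usage: soft-min over squared distances from a point to cluster centers;
# smaller gamma weights the closest center more heavily
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
x = np.array([0.5, 0.2])
d2 = np.sum((centers - x) ** 2, axis=1)
for gamma in (0.01, 0.5, 5.0):
    print(gamma, kn_average(d2, gamma))
```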


2008 · Vol 19 (1) · pp. 91-113 · Author(s): Vivek S. Borkar, Jervis Pinto, Tarun Prabhu

2017 · Vol 17 (20) · pp. 12269-12302 · Author(s): William T. Ball, Justin Alsing, Daniel J. Mortlock, Eugene V. Rozanov, Fiona Tummon, …

Observations of stratospheric ozone from multiple instruments now span three decades; combining these into composite datasets allows long-term ozone trends to be estimated. Recently, several ozone composites have been published, but trends disagree by latitude and altitude, even between composites built upon the same instrument data. We confirm that the main causes of differences in decadal trend estimates lie in (i) steps in the composite time series when the instrument source data change and (ii) artificial sub-decadal trends in the underlying instrument data. These artefacts introduce features that can alias with regressors in multiple linear regression (MLR) analysis; both can lead to inaccurate trend estimates. Here, we aim to remove these artefacts using Bayesian methods to infer the underlying ozone time series from a set of composites by building a joint-likelihood function using a Gaussian-mixture density to model outliers introduced by data artefacts, together with a data-driven prior on ozone variability that incorporates knowledge of problems during instrument operation. We apply this Bayesian self-calibration approach to stratospheric ozone in 10° bands from 60° S to 60° N and from 46 to 1 hPa (∼ 21–48 km) for 1985–2012. There are two main outcomes: (i) we independently identify and confirm many of the data problems previously identified, but that remain unaccounted for in existing composites; (ii) we construct an ozone composite, with uncertainties, that is free from most of these problems – we call this the BAyeSian Integrated and Consolidated (BASIC) composite. To analyse the new BASIC composite, we use dynamical linear modelling (DLM), which provides a more robust estimate of long-term changes through Bayesian inference than MLR. BASIC and DLM, together, provide a step forward in improving estimates of decadal trends. Our results indicate a significant recovery of ozone since 1998 in the upper stratosphere, at both northern and southern midlatitudes, in all four composites analysed, and particularly in the BASIC composite. The BASIC results also show no hemispheric difference in the recovery at midlatitudes, in contrast to an apparent feature that is present, but not consistent, in the four composites. Our overall conclusion is that it is possible to effectively combine different ozone composites and account for artefacts and drifts, and that this leads to a clear and significant result: upper stratospheric ozone levels have increased since 1998, following an earlier decline.
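The outlier treatment in such a joint likelihood can be illustrated with a standard two-component Gaussian mixture. This is a generic robust-likelihood sketch under our own assumed parameterization (the function name, mixing weight `eps`, and component widths `sigma_good`/`sigma_bad` are illustrative), not the paper's actual model: each observation is explained either by a narrow "good data" gaussian around the latent ozone value or by a broad "artefact" gaussian.

```python
import numpy as np
from scipy.stats import norm

def robust_loglike(obs, latent, sigma_good, sigma_bad, eps):
    """Gaussian-mixture ('good data' vs. 'artefact') log-likelihood of
    composite observations given a latent ozone time series:
      p(obs | latent) = (1 - eps) * N(obs; latent, sigma_good^2)
                      + eps       * N(obs; latent, sigma_bad^2)
    The broad component downweights outliers introduced by data artefacts."""
    good = norm.logpdf(obs, latent, sigma_good) + np.log1p(-eps)
    bad = norm.logpdf(obs, latent, sigma_bad) + np.log(eps)
    # log-sum-exp over the two mixture components, summed over time
    return np.sum(np.logaddexp(good, bad))

# toy usage: a latent series plus noise, with a step artefact in the data
t = np.arange(120)
latent = 5.0 + 0.01 * t
obs = latent + 0.1 * np.random.default_rng(1).normal(size=t.size)
obs[60:] += 0.8                     # instrument-change step artefact
print(robust_loglike(obs, latent, sigma_good=0.1, sigma_bad=1.0, eps=0.05))
```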


Author(s): Lina Fu, Jie Fang, Yunjie Lyu, Huahui Xie

Freeway control has been increasingly used as an innovative approach to ease traffic congestion, improve traffic safety, and reduce exhaust emissions. As an important predictive model used in freeway control, METANET's predictive performance strongly influences the effectiveness of control. This paper focuses on modifying the METANET model by modeling the critical density. First, a critical density model is derived based on catastrophe theory. Then, the perturbation wave and the traveling wave, obtained from macroscopic and microscopic data respectively, are used to refine the proposed critical density model. Finally, a numerical simulation based on field data from a real motorway network is carried out to evaluate the effectiveness of the modified METANET model. The results show that, overall, the predictions of the modified METANET model are closer to the field data than those of the original model.
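For context, the critical density enters METANET through the standard stationary speed-density relation of each segment; the paper's modification concerns how ρ_cr itself is modeled, not this equation. A minimal sketch with illustrative parameter values:

```python
import numpy as np

def desired_speed(rho, v_free=110.0, rho_crit=33.5, a=1.87):
    """Stationary speed-density relation used in METANET:
        V(rho) = v_free * exp(-(1/a) * (rho / rho_crit)^a)
    with rho in veh/km/lane and speeds in km/h (parameter values are
    illustrative, not from the paper)."""
    return v_free * np.exp(-(1.0 / a) * (rho / rho_crit) ** a)

def flow(rho, **kwargs):
    """Flow (veh/h/lane) of the fundamental diagram; it peaks exactly at
    rho_crit, which is why an accurate critical-density model matters."""
    return rho * desired_speed(rho, **kwargs)

rho = np.linspace(1, 100, 100)
q = flow(rho)
print("flow-maximizing density ~", rho[np.argmax(q)], "veh/km/lane")
```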

