multivariate systems
Recently Published Documents

TOTAL DOCUMENTS: 103 (FIVE YEARS: 22)
H-INDEX: 14 (FIVE YEARS: 1)

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 994
Author(s):  
Heba Elsegai

Detecting causal interrelationships in multivariate systems, in terms of the Granger-causality concept, is of major interest for applications in many fields. In practice, however, it is almost impossible to observe all the relevant components of a system, which conflicts with a core assumption of Granger causality. Failing to observe some components can, in turn, lead to misleading results, particularly if the missing components are the most influential and important ones in the system under investigation. In networks, the importance of a node depends on the number of nodes connected to it, and degree centrality is the most commonly used measure for identifying important nodes. Degree centrality comes in two forms, in-degree and out-degree; this manuscript focuses on finding the nodes with the highest out-degree in order to identify the most influential ones. Inferring the existence of unobserved important components is critical in many multivariate interacting systems, and the implications of such a situation are discussed within the Granger-causality framework. To this end, two of the most recent Granger-causality techniques, renormalized partial directed coherence and directed partial correlation, were employed and compared in terms of how well they can infer the existence of unobserved important components. Sub-network analysis was conducted to aid both techniques in this inference, as evidenced by the results. Comparing the two techniques shows that renormalized partial directed coherence outperforms directed partial correlation in inferring the existence of unobserved important components that were not included in the analysis. These results highlight the broad applicability of this Granger-causality measure combined with sub-network analysis when hidden, unobserved important components are present.
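As a minimal illustration of the out-degree idea used above, the following hedged Python sketch (using the networkx library; the graph and its edge list are hypothetical, not taken from the paper) identifies the node with the highest out-degree in a directed causal network:

```python
import networkx as nx

# Hypothetical directed network of inferred Granger-causal links:
# an edge (a, b) means "a Granger-causes b".
edges = [("X1", "X2"), ("X1", "X3"), ("X1", "X4"), ("X2", "X3"), ("X4", "X2")]
G = nx.DiGraph(edges)

# Out-degree counts how many nodes each node drives; the node with the
# highest out-degree is taken as the most influential component.
out_degrees = dict(G.out_degree())
most_influential = max(out_degrees, key=out_degrees.get)

print(out_degrees)        # {'X1': 3, 'X2': 1, 'X3': 0, 'X4': 1}
print(most_influential)   # 'X1'
```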


Energies ◽  
2021 ◽  
Vol 14 (3) ◽  
pp. 619
Author(s):  
En-Jui Liu ◽  
Yi-Hsuan Hung ◽  
Che-Wun Hong

As carriers of green energy, proton exchange membrane fuel cells (PEMFCs) and photovoltaic (PV) cells are complex, nonlinear multivariate systems. For simulation analysis, optimization control, efficiency prediction, and fault diagnosis, it is crucial to rapidly and accurately establish reliable models and extract parameters from the system modules. This study employed three variants of particle swarm optimization (PSO), namely inertia-weight PSO, constriction PSO, and momentum PSO, to find the optimal parameters of two energy models by minimizing the sum of squared errors (SSE) and the root mean squared error (RMSE). The results obtained with these three algorithms were compared with those obtained using algorithms from other relevant studies. The study found that momentum PSO converges rapidly (in fewer than 30 iterations), produces the most accurate models, and yields the most stable parameter extraction (SSE of 2.0656 for the PEMFC model and RMSE of 8.839 × 10⁻⁴ for the PV cell model). In summary, momentum PSO is the most suitable algorithm for identifying the parameters of high-dimensional systems with complex models.
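The abstract does not give the paper's model equations or PSO settings; the following hedged Python sketch shows a generic momentum-style PSO minimizing a sum of squared errors on a hypothetical two-parameter model, purely to illustrate the kind of parameter extraction described (the exact momentum formulation in the paper may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measured data from a two-parameter model y = a*x + b*x**2
x_data = np.linspace(0.1, 2.0, 20)
y_data = 1.3 * x_data + 0.7 * x_data**2 + rng.normal(0, 0.01, x_data.size)

def sse(params):
    """Sum of squared errors between model prediction and data."""
    a, b = params
    return np.sum((a * x_data + b * x_data**2 - y_data) ** 2)

def momentum_pso(cost, dim=2, n_particles=30, n_iter=100,
                 beta=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Generic PSO with a momentum-style velocity term (illustrative only)."""
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # beta * vel carries momentum from the previous velocity update.
        vel = beta * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()

    return gbest, pbest_cost.min()

best_params, best_sse = momentum_pso(sse)
print(best_params, best_sse)   # should recover roughly (1.3, 0.7)
```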


2021 ◽  
Vol 54 (7) ◽  
pp. 345-350
Author(s):  
Mengyuan Fang ◽  
Miguel Galrinho ◽  
Håkan Hjalmarsson

Author(s):  
Gokce Ozdes ◽  
Yakup Kutlu

Iron and steel production begins with melting scrap in electric arc furnaces or iron ore in basic oxygen furnaces. The proportions of the alloying elements in the liquid steel obtained by melting scrap are of great importance for producing steel of the desired quality. In steel production, the carbon content must be reduced to the desired level, the proportions of manganese, silicon, and other chemical elements must be brought to the values specified in the recipe, and sulfur must be removed from the liquid steel as far as possible. Alloys are therefore added (FeSiMnPOTP, AltelPOTP, GrnKrbnPOTP, FeMnOrtCPOTP, KirecPOTP, FeSiPOTP, AlPOTP, FlşptPOTP, etc.), each acting on a specific chemical element. For example, if the aluminum ratio of the liquid steel needs to be changed, the AltelPOTP alloy is added, and the subsequent analysis results show that the aluminum ratio has changed. The liquid steel transferred to the ladle furnace is analyzed at regular intervals, and chemical alloys continue to be added until the required ratios are reached. The alloys added to the liquid steel should be neither less nor more than required, both for cost reasons and to meet quality standards, since these alloys are serious cost items purchased in dollars over long terms; the ratios must therefore be adjusted very accurately. All of these metallurgical processes are complex, multivariate systems. Observations show that the alloy additions to the liquid steel in the ladle furnace are trialled an average of 4 times per casting, with this cycle repeated at least 2 and at most 6 times. Taking a sample from the liquid steel in the ladle furnace, sending it for chemical analysis, obtaining the result, and repeating these steps when the desired quality standards are not met takes 45 minutes on average. These periods cause a serious waste of time, so the next casting has to start later than planned, which in turn delays the subsequent processes (pouring the liquid steel into molds in continuous casting, forming in the rolling mill, passing quality tests, etc.). Today, with the advancement of technology, the use of artificial intelligence in the iron and steel industry will become a necessary approach to minimize the number of trials and to minimize the loss of material, time, and labor.


2020 ◽  
Vol 16 (12) ◽  
pp. e1008289
Author(s):  
Fernando E. Rosas ◽  
Pedro A. M. Mediano ◽  
Henrik J. Jensen ◽  
Anil K. Seth ◽  
Adam B. Barrett ◽  
...  

The broad concept of emergence is instrumental in several of the most challenging open scientific questions, yet few quantitative theories of what constitutes an emergent phenomenon have been proposed. This article introduces a formal theory of causal emergence in multivariate systems, which studies the relationship between the dynamics of parts of a system and macroscopic features of interest. Our theory provides a quantitative definition of downward causation and introduces a complementary modality of emergent behaviour, which we refer to as causal decoupling. Moreover, the theory yields practical criteria that can be efficiently computed in large systems, making our framework applicable in a range of scenarios of practical interest. We illustrate our findings in a number of case studies, including Conway's Game of Life, Reynolds' flocking model, and neural activity as measured by electrocorticography.
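The abstract does not spell out the practical criteria; as a hedged sketch only, the example below assumes a mutual-information-based form Ψ = I(V_t; V_{t+1}) − Σ_j I(X_{j,t}; V_{t+1}), where V is a macroscopic feature and the X_j are the parts (this specific form is an assumption, not quoted from the paper), and estimates it on a toy binary system in Python:

```python
import numpy as np
from sklearn.metrics import mutual_info_score  # mutual information for discrete labels (nats)

rng = np.random.default_rng(1)

# Toy system: three binary micro variables X1..X3; the macro feature V is their parity.
T = 20000
X = rng.integers(0, 2, size=(T, 3))
V = X.sum(axis=1) % 2

# Hypothetical dynamics: the future macro value copies the current parity with
# probability 0.9, so V predicts its own future while no single part does.
flip = rng.random(T - 1) < 0.1
V_next = np.where(flip, 1 - V[:-1], V[:-1])

# Assumed practical criterion: Psi > 0 suggests causal emergence of V.
psi = mutual_info_score(V[:-1], V_next) - sum(
    mutual_info_score(X[:-1, j], V_next) for j in range(3)
)
print(f"Psi = {psi:.4f}")  # positive here, since parity is invisible to any single part
```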


2020 ◽  
Vol 59 (47) ◽  
pp. 20767-20778
Author(s):  
Ling Yan ◽  
Xin Peng ◽  
Chudong Tong ◽  
Lijia Luo

Author(s):  
Maryam Khosroshahi ◽  
Javad Poshtan

Control system performance assessment is important, especially in practical applications. One of the most important benchmarks for assessing control system performance is minimum variance. Calculating the minimum variance index in multivariate systems, however, requires prior knowledge of the system parameters and models, which is an obstacle in practical applications. In this article, an index is proposed for the performance assessment of multivariate control loops that evaluates the system performance against the minimum variance criterion using only the system's routine operating data. This index quantifies performance without requiring either prior knowledge of the system parameters or data from the system's optimal operation. The proposed index is based on the Hurst exponent, a parameter that measures long-range correlations in time series data. In this article, detrended fluctuation analysis and rescaled range analysis are used to estimate the Hurst exponents of the system outputs, and a combination of these Hurst exponents defines an index for the performance assessment of multivariate systems. The results of simulation examples illustrate that the proposed index assesses performance efficiently.
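The abstract does not give the estimator details; as a hedged illustration, the following Python sketch estimates a Hurst exponent with a basic rescaled range (R/S) analysis on a hypothetical output signal (the window sizes and the white-noise test signal are assumptions, not the paper's setup):

```python
import numpy as np

def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent of a 1-D series via rescaled range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())   # mean-adjusted cumulative sum
            r = z.max() - z.min()         # range of the cumulative deviations
            s = w.std(ddof=1)             # standard deviation of the window
            if s > 0:
                rs_values.append(r / s)
        if rs_values:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_values)))
    # The slope of log(R/S) versus log(n) estimates the Hurst exponent.
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(42)
white_noise = rng.normal(size=4096)       # theoretical H = 0.5 for uncorrelated noise
print(f"H ≈ {hurst_rs(white_noise):.2f}")
```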


Entropy ◽  
2020 ◽  
Vol 22 (10) ◽  
pp. 1139
Author(s):  
Catherine Kyrtsou ◽  
Christina Mikropoulou ◽  
Angeliki Papana

In financial markets, information is a crucial factor in the evolution of the system, while the presence of heterogeneous investors ensures its flow among financial products. When nonlinear trading strategies prevail, the diffusion mechanism reacts accordingly. Under these conditions, information encompasses behavioral traces of traders' decisions and represents their actions. The resulting endogenization of information leads traders to revise their positions and affects the connectivity among assets. To investigate the computational dimensions of this effect, we first simulate multivariate systems under several noise-term scenarios, and then apply direct causality tests to analyze the information flow among their variables. Finally, empirical evidence is provided from real financial data.
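The abstract does not name the specific causality tests used; as a hedged sketch of the general workflow (simulating a multivariate system with noise and testing for directed information flow), the Python example below uses a simple bivariate autoregressive simulation and the Granger causality test from statsmodels, with illustrative coefficients and lag choice:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(7)
T = 1000

# Hypothetical bivariate system: y depends on past x, so x should "Granger-cause" y.
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal(scale=1.0)
    y[t] = 0.3 * y[t - 1] + 0.6 * x[t - 1] + rng.normal(scale=1.0)

# grangercausalitytests expects a two-column array and tests whether the second
# column Granger-causes the first, for all lags up to maxlag.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=2)
# results[lag][0]['ssr_ftest'] holds the F-statistic and p-value for each lag.
```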

