Stochastic analysis of the mean interference for the RTS/CTS mechanism

Author(s):  
Yi Zhong ◽  
Wenyi Zhang ◽  
Martin Haenggi


1988 ◽  
Vol 110 (1) ◽  
pp. 23-29 ◽  
Author(s):  
Da Yu Tzou

The stochastic temperature distribution in a solid medium with random heat conductivity is investigated by the method of perturbation. The intrinsic randomness of the thermal conductivity k(x) is modeled as a distribution function with random amplitude in the solid, and several typical stochastic processes are considered in the numerical examples. The formulation used in the present analysis describes a situation in which the statistical orders of the random response of the system are the same as those of the intrinsic random excitations, which is characteristic of problems with extrinsic randomness. The maximum standard deviation of the temperature distribution from its mean value in the solid medium reveals the amount of unexpected energy experienced by the solid continuum, which should be carefully inspected in the thermal-failure design of structures with intrinsic randomness.
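The perturbation formulation itself is paper-specific, but the effect it quantifies — the spread of the temperature field induced by a random conductivity — can be illustrated with a brute-force Monte Carlo stand-in for steady one-dimensional conduction under constant heat flux. All function names, parameters, and values below are illustrative assumptions, not taken from the paper:

```python
import math
import random

def temperature_profile(k_values, q=1.0, t0=0.0, dx=0.01):
    """Steady 1-D conduction with constant heat flux q:
    T(x) = T0 + q * sum(dx / k), accumulated cell by cell."""
    temps = [t0]
    for k in k_values:
        temps.append(temps[-1] + q * dx / k)
    return temps

def monte_carlo_std(n_samples=2000, n_cells=100, k_mean=1.0, k_spread=0.3, seed=1):
    """Sample random conductivity fields and return the maximum standard
    deviation of the temperature anywhere along the rod."""
    rng = random.Random(seed)
    profiles = [
        temperature_profile(
            [rng.uniform(k_mean - k_spread, k_mean + k_spread) for _ in range(n_cells)]
        )
        for _ in range(n_samples)
    ]
    stds = []
    for i in range(n_cells + 1):
        vals = [p[i] for p in profiles]
        mean = sum(vals) / n_samples
        stds.append(math.sqrt(sum((v - mean) ** 2 for v in vals) / n_samples))
    return max(stds)
```

Because the temperature uncertainty accumulates along the flux path, the maximum standard deviation occurs at the far end of the rod — the quantity the abstract flags for thermal-failure design.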


2012 ◽  
Vol 16 (3) ◽  
pp. 641-648 ◽  
Author(s):  
C.-M. Chang ◽  
H.-D. Yeh

Abstract. Owing to the analogy between the solute and heat transport processes, it can be expected that the rate of growth of the spatial second moments of the heat flux in a heterogeneous aquifer over relatively large space scales is greater than that predicted by the classical heat transport model. The motivation for a stochastic analysis of heat transport at the field scale is therefore to quantify the enhanced growth of the field-scale second moments caused by the spatially varying specific discharge field. Within the framework of stochastic theory, an effective advection-dispersion equation containing effective parameters (namely, the macrodispersion coefficients) is developed to model the mean temperature field. The rate of growth of the field-scale spatial second moments of the mean temperature field in the principal coordinate directions is described by the macrodispersion coefficients. The variance of the temperature field is also derived to characterize the reliability to be anticipated in applying the mean heat transport model. It is found that the heterogeneity of the medium and the correlation length of the log hydraulic conductivity are important in enhancing the field-scale heat advection, while the effective thermal conductivity plays a role in reducing it.
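As a rough illustration of why an enhanced macrodispersion coefficient matters, the sketch below integrates a one-dimensional advection-dispersion equation dT/dt = -v dT/dx + D d2T/dx2 with an explicit upwind/central finite-difference scheme, once with a small local dispersion coefficient and once with a larger macrodispersion-style one. The scheme, grid, and coefficient values are illustrative assumptions, not the paper's model:

```python
def advect_disperse(n=200, dx=1.0, dt=0.1, steps=500, v=1.0,
                    d_local=0.1, d_macro=1.0):
    """Upwind (advection) / central (dispersion) explicit finite differences
    for dT/dt = -v dT/dx + D d2T/dx2, run for two dispersion coefficients."""
    def run(d):
        t = [0.0] * n
        t[0] = 1.0  # fixed-temperature inlet boundary
        for _ in range(steps):
            new = t[:]
            for i in range(1, n - 1):
                adv = -v * (t[i] - t[i - 1]) / dx
                dif = d * (t[i + 1] - 2.0 * t[i] + t[i - 1]) / dx ** 2
                new[i] = t[i] + dt * (adv + dif)
            t = new
        return t
    return run(d_local), run(d_macro)
```

With these values the thermal front advects about 50 cells; the run with the enhanced coefficient carries noticeably more heat ahead of the front, which is exactly the enhanced growth of the spatial second moments the abstract describes.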


Author(s):  
Dan Trietsch

Crashing stochastic activities implies changing their distributions to reduce the mean; this can change the variance too. Therefore, crashing can alter not only the expected duration of a project but also the necessary size of its safety buffer. We consider optimal crashing of serial projects where the objective is to minimize total cost, including the crashing cost and the expected delay penalty. As part of the solution we determine optimal safety buffers. They allow for activities that are statistically dependent because they share an error element (e.g., when all durations have been estimated by one person, or when weather or general economic conditions influence many activities). We show that under plausible conditions the problem is convex, so it can be solved by standard numerical search procedures. The purpose of the paper is to encourage the development of software that includes valid stochastic analysis for scheduling and crashing, using current estimates and historical performance records.
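To make the trade-off concrete, here is a minimal single-activity sketch: the duration is assumed normal, crashing shifts its mean at a linear cost, and the objective (crash cost plus expected delay penalty beyond a due date) is convex, so a golden-section search finds the optimum. All distributions, rates, and numbers are illustrative assumptions; the paper's safety buffers and statistically dependent activities are not modeled here:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_overrun(mean, sigma, due):
    """E[(D - due)+] for D ~ Normal(mean, sigma): the expected delay."""
    z = (due - mean) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return sigma * (pdf - z * (1.0 - normal_cdf(z)))

def total_cost(c, base_mean=100.0, sigma=10.0, crash_rate=2.0,
               penalty=20.0, due=105.0):
    """Crashing by c shifts the mean duration down at a linear cost;
    the objective adds the expected delay penalty beyond the due date."""
    return crash_rate * c + penalty * expected_overrun(base_mean - c, sigma, due)

def minimize_1d(f, lo, hi, tol=1e-6):
    """Golden-section search, valid here because the objective is convex."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    while b - a > tol:
        x1 = b - phi * (b - a)
        x2 = a + phi * (b - a)
        if f(x1) < f(x2):
            b = x2
        else:
            a = x1
    return 0.5 * (a + b)

best_c = minimize_1d(total_cost, 0.0, 50.0)  # optimal crash amount
```

At the optimum, the marginal crash cost balances the marginal reduction in expected penalty, penalty * P(D > due) — the first-order condition that convexity makes sufficient.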


2011 ◽  
Vol 8 (6) ◽  
pp. 10311-10331
Author(s):  
C.-M. Chang ◽  
H.-D. Yeh

Abstract. Owing to the analogy between the solute and heat transport processes, it can be expected that the rate of growth of the spatial second moments of the heat flux in a heterogeneous aquifer over relatively large space scales is greater than that predicted by the classical heat transport model. The motivation for a stochastic analysis of heat transport at the field scale is therefore to quantify the enhanced growth of the field-scale second moments caused by the spatially varying specific discharge field. Within the framework of stochastic theory, an effective advection-dispersion equation containing effective parameters (namely, the macrodispersion coefficients) is developed to model the mean temperature field. The rate of growth of the field-scale spatial second moments of the mean temperature field in the principal coordinate directions is described by the macrodispersion coefficients. The variance of the temperature field is also derived to characterize the reliability to be anticipated in applying the mean heat transport model. It is found that the heterogeneity of the medium and the correlation length of the log hydraulic conductivity are important in enhancing the field-scale heat advection, while the effective thermal conductivity plays a role in reducing it.


2019 ◽  
Vol 2019 ◽  
pp. 1-15
Author(s):  
Tianqing Yang ◽  
Zuoliang Xiong ◽  
Cuiping Yang

This paper is concerned with the mean-square exponential input-to-state stability problem for a class of stochastic Cohen-Grossberg neural networks. Unlike prior works, neutral terms and mixed delays are included in our system. By employing the Lyapunov-Krasovskii functional method, the Itô formula, the Dynkin formula, and stochastic analysis theory, we obtain novel sufficient conditions ensuring that the addressed system is mean-square exponentially input-to-state stable. Moreover, two numerical examples and their simulations are given to illustrate the correctness of the theoretical results.
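The stability notion can be probed numerically. The hedged sketch below runs Euler-Maruyama on a scalar stochastic delay equation dx = (-a*x + w*tanh(x(t - tau))) dt + noise*x dW — a toy stand-in for a Cohen-Grossberg-type network, not the system analyzed in the paper — and estimates the mean-square state at the final time over many sample paths; all parameter values are illustrative assumptions:

```python
import math
import random

def simulate_ms_norm(steps=2000, dt=0.01, a=2.0, w=0.5, noise=0.2,
                     delay_steps=10, n_paths=200, seed=0):
    """Euler-Maruyama for dx = (-a*x + w*tanh(x(t - tau))) dt + noise*x dW.
    Returns the sample estimate of E[x^2] at the final time."""
    rng = random.Random(seed)
    final_sq = 0.0
    for _ in range(n_paths):
        hist = [1.0] * (delay_steps + 1)  # constant initial history x = 1
        for _ in range(steps):
            x = hist[-1]
            x_tau = hist[-1 - delay_steps]  # delayed state, tau = delay_steps*dt
            drift = -a * x + w * math.tanh(x_tau)
            dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
            hist.append(x + drift * dt + noise * x * dw)
        final_sq += hist[-1] ** 2
    return final_sq / n_paths
```

Because the linear decay rate a dominates both the delayed nonlinearity (tanh is 1-Lipschitz, |w| < a) and the multiplicative noise intensity, the mean-square state decays well below its initial value of 1 — consistent with the kind of mean-square stability condition the paper establishes analytically.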


2020 ◽  
Author(s):  
Leonardo dos Santos Lima

Abstract. We propose a stochastic model for the epidemic spreading of the novel coronavirus, based on data supplied by the Brazilian health agencies. Furthermore, we perform an analysis using the Fokker-Planck equation, estimating the new cases on day t as the mean half-width of the distribution of new cases P(N,t). Our results show that the model based on the Itô diffusion fits the results supplied by the Brazilian health agencies well, despite the large uncertainty in the official data and the low number of tests performed in the population.
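As an illustration of the Itô-diffusion modeling step (not the paper's fitted model), the sketch below simulates the geometric-type SDE dN = r*N dt + sigma*N dW with Euler-Maruyama and reports the mean and spread of the simulated case counts; the growth rate, volatility, and initial count are illustrative assumptions:

```python
import math
import random

def simulate_cases(days=60, dt=0.1, n0=100.0, r=0.08, sigma=0.15,
                   n_paths=500, seed=42):
    """Euler-Maruyama for the Ito diffusion dN = r*N dt + sigma*N dW,
    an illustrative multiplicative-noise growth model for case counts.
    Returns the sample mean and standard deviation of N at the final day."""
    rng = random.Random(seed)
    steps = int(days / dt)
    finals = []
    for _ in range(n_paths):
        n = n0
        for _ in range(steps):
            n += r * n * dt + sigma * n * rng.gauss(0.0, math.sqrt(dt))
        finals.append(n)
    mean = sum(finals) / n_paths
    var = sum((f - mean) ** 2 for f in finals) / n_paths
    return mean, math.sqrt(var)
```

The standard deviation of the simulated ensemble plays the role of the distribution width extracted from P(N,t) in the Fokker-Planck analysis.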


2012 ◽  
pp. 484-495
Author(s):  
Dan Trietsch

Crashing stochastic activities implies changing their distributions to reduce the mean; this can change the variance too. Therefore, crashing can alter not only the expected duration of a project but also the necessary size of its safety buffer. We consider optimal crashing of serial projects where the objective is to minimize total cost, including the crashing cost and the expected delay penalty. As part of the solution we determine optimal safety buffers. They allow for activities that are statistically dependent because they share an error element (e.g., when all durations have been estimated by one person, or when weather or general economic conditions influence many activities). We show that under plausible conditions the problem is convex, so it can be solved by standard numerical search procedures. The purpose of the paper is to encourage the development of software that includes valid stochastic analysis for scheduling and crashing, using current estimates and historical performance records.



2011 ◽  
Vol 255-260 ◽  
pp. 1023-1028
Author(s):  
Hua Guan ◽  
De Wei Chen

Deviations between the finished bridge and the design requirements of a prestressed concrete (PC) cable-stayed bridge arise from errors in the construction process. Moreover, the randomness of construction errors causes variability of the structural system in the construction state. However, most analyses that consider construction errors are based on deterministic methods, which account only for the effect of the mean errors; this is clearly inadequate. In this paper, the main construction errors of a PC cable-stayed bridge and their corresponding distributions are outlined first, and several common stochastic analysis methods are introduced. Then, using the Monte Carlo method with the ANSYS finite element analysis (FEA) software, a stochastic analysis of a single-tower PC cable-stayed bridge considering construction errors is performed, and the probability characteristics of the structural response under dead load are obtained. Finally, from the reliability point of view, the influence of the randomness of construction errors on structural safety and the necessity of stochastic analysis for PC cable-stayed bridges are discussed.
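The Monte Carlo step can be sketched in miniature: sample random construction errors, propagate them through a response model (here a trivial linear surrogate rather than an ANSYS FEA model), and collect the probability characteristics of the response. All coefficients and error magnitudes are illustrative assumptions:

```python
import math
import random

def monte_carlo_response(n_samples=5000, seed=7):
    """Propagate random construction errors through a linear surrogate
    response model and return the mean and std dev of the response.
    Coefficients and error std devs are purely illustrative."""
    rng = random.Random(seed)
    # illustrative influence coefficients: response change per unit error
    # in cable tension, segment weight, and elastic modulus
    coeffs = [0.8, 1.2, -0.5]
    error_stds = [0.05, 0.03, 0.04]
    nominal = 10.0  # nominal response under dead load
    samples = []
    for _ in range(n_samples):
        errs = [rng.gauss(0.0, s) for s in error_stds]
        samples.append(nominal + sum(c * e for c, e in zip(coeffs, errs)))
    mean = sum(samples) / n_samples
    std = math.sqrt(sum((x - mean) ** 2 for x in samples) / n_samples)
    return mean, std
```

In the deterministic approach only the nominal (mean) response survives; the sampled standard deviation is precisely the extra information a reliability assessment needs.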


1966 ◽  
Vol 24 ◽  
pp. 170-180
Author(s):  
D. L. Crawford

Early in the 1950s Strömgren (1, 2, 3, 4, 5) introduced medium- to narrow-band interference filter photometry at the McDonald Observatory. He used six interference filters to obtain two parameters of astrophysical interest. These parameters he called l and c, for line and continuum hydrogen absorption. The first measured empirically the absorption line strength of Hβ by means of a filter of half width 35Å centered on Hβ and compared to the mean of two filters situated in the continuum near Hβ. The second index measured empirically the Balmer discontinuity by means of a filter situated below the Balmer discontinuity and two above it. He showed that these two indices could accurately predict the spectral type and luminosity of both B stars and A and F stars. He later derived (6) an index m from the same filters. This index was a measure of the relative line blanketing near 4100Å compared to two filters above 4500Å. These three indices confirmed earlier work by many people, including Lindblad and Becker. References to this earlier work and to the systems discussed today can be found in Strömgren's article in Basic Astronomical Data (7).
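The index construction described above — one filter compared against the mean of two flanking filters — is simple to express in magnitudes. The function names and the sign convention below are illustrative assumptions, not Strömgren's published definitions:

```python
def line_index(m_hbeta, m_cont1, m_cont2):
    """Line index: narrow Hbeta-filter magnitude compared to the mean of
    two nearby continuum-filter magnitudes."""
    return m_hbeta - 0.5 * (m_cont1 + m_cont2)

def balmer_index(m_below, m_above1, m_above2):
    """Balmer-discontinuity index: one filter below the discontinuity
    compared to the mean of two filters above it."""
    return m_below - 0.5 * (m_above1 + m_above2)
```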

