Two representations of a conditioned superprocess

Author(s):  
Steven N. Evans

Synopsis: We consider a class of measure-valued Markov processes constructed by taking a superprocess over some underlying Markov process and conditioning it to stay alive forever. We obtain two representations of such a process. The first is in terms of an “immortal particle” that moves around according to the underlying Markov process and throws off pieces of mass, which then proceed to evolve in the same way that mass evolves for the unconditioned superprocess. As a consequence of this representation, we show that the tail σ-field of the conditioned superprocess is trivial if the tail σ-field of the underlying process is trivial. The second representation is analogous to one obtained by Le Gall in the unconditioned case. It represents the conditioned superprocess in terms of a certain process taking values in the path space of the underlying process, and is useful for studying the “transience” and “recurrence” properties of the closed support process.

Author(s):  
Peter J. Sherman

Health condition monitoring often entails detecting changes in the structure of associated random processes. A common trigger for an alarm is the process amplitude exceeding a specified threshold for a certain period of time. A less common trigger is a significant change in the process bandwidth. This latter type of change occurs, for example, in EEG just prior to the onset of an epileptic seizure [Sherman (2008)]. One can monitor the process directly, or one can convert it to a 0/1 process where 0 denotes ‘within’ and 1 denotes ‘outside of’ a specified tolerance. Such a process goes by many names, among them binary process and Bernoulli process. If the underlying process is a Gauss–Markov process, then the associated 0/1 process is a Markov 0/1 process. The main parameters associated with such a process are the following probabilities: (i), (ii), and (iii). In this work we use the variance expressions for the estimators of these probabilities that were reported in Sherman (2011) in order to detect changes with specified false alarm probabilities. We demonstrate their value in detecting bandwidth change via zero-crossing estimates and amplitude change via threshold excursion estimates.
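As a minimal illustration of the conversion described above (not Sherman's estimators; the AR(1) model, threshold, and function names are assumptions for the example), one can binarize a simulated Gauss–Markov path at a tolerance threshold and estimate the excursion and transition probabilities empirically:

```python
import random

def binarize(x, threshold):
    """Convert a real-valued sample path into a 0/1 excursion process:
    0 = within tolerance (|x| <= threshold), 1 = outside of it."""
    return [1 if abs(v) > threshold else 0 for v in x]

def transition_counts(b):
    """Empirical one-step transition probabilities of the 0/1 process."""
    counts = {(i, j): 0 for i in (0, 1) for j in (0, 1)}
    for prev, cur in zip(b, b[1:]):
        counts[(prev, cur)] += 1
    probs = {}
    for i in (0, 1):
        total = counts[(i, 0)] + counts[(i, 1)]
        for j in (0, 1):
            probs[(i, j)] = counts[(i, j)] / total if total else float("nan")
    return probs

# Simulate a Gauss-Markov (AR(1)) process and binarize it.
random.seed(0)
x, phi = [0.0], 0.9
for _ in range(20000):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))
b = binarize(x, threshold=2.0)
p_out = sum(b) / len(b)       # marginal probability of being outside tolerance
probs = transition_counts(b)  # e.g. probs[(1, 1)]: stay-outside probability
```

Because the 0/1 process inherits the Markov property here, the sojourn time outside tolerance is geometric, so `probs[(1, 1)]` also determines the mean excursion duration `1 / (1 - probs[(1, 1)])`.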


1986 ◽  
Vol 23 (01) ◽  
pp. 208-214 ◽  
Author(s):  
Donald R. Fredkin ◽  
John A. Rice

A finite-state Markov process is aggregated into several groups. What can be learned about the underlying process from the aggregated one? We provide some partial answers to this question.
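One classical partial answer is the Kemeny–Snell lumpability criterion: the aggregated process is again Markov for every initial distribution if and only if states within each group have identical group-to-group transition probabilities. A minimal sketch (the matrices and function names are illustrative, not from the paper):

```python
def aggregate_rows(P, groups):
    """For each state, the probability of jumping into each group."""
    return [[sum(row[s] for s in g) for g in groups] for row in P]

def is_lumpable(P, groups, tol=1e-12):
    """Kemeny-Snell criterion: the aggregated chain is Markov (for every
    initial law) iff states in the same group have identical rows of
    group-jump probabilities."""
    agg = aggregate_rows(P, groups)
    for g in groups:
        first = agg[g[0]]
        if any(abs(agg[s][k] - first[k]) > tol
               for s in g for k in range(len(groups))):
            return False
    return True

# States 0 and 1 are merged into one group, state 2 forms the other.
groups = [[0, 1], [2]]
P_good = [[0.5, 0.3, 0.2],   # rows 0 and 1 both jump to the groups
          [0.2, 0.6, 0.2],   # with probabilities (0.8, 0.2): lumpable
          [0.4, 0.4, 0.2]]
P_bad  = [[0.5, 0.3, 0.2],   # row 1 now gives (0.7, 0.3): the aggregated
          [0.1, 0.6, 0.3],   # process is generally not Markov
          [0.4, 0.4, 0.2]]
```

When the criterion fails, the aggregated process is a hidden-Markov-type observation of the underlying chain, which is exactly the situation the question above concerns.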


Author(s):  
UWE FRANZ

We show how classical Markov processes can be obtained from quantum Lévy processes. It is shown that quantum Lévy processes are quantum Markov processes, and sufficient conditions for restrictions to subalgebras to remain quantum Markov processes are given. A classical Markov process (which has the same time-ordered moments as the quantum process in the vacuum state) exists whenever we can restrict to a commutative subalgebra without losing the quantum Markov property. Several examples, including the Azéma martingale, are presented with explicit calculations. In particular, the action of the generator of the classical Markov processes on polynomials or their moments is calculated using Hopf algebra duality.


2020 ◽  
Vol 57 (4) ◽  
pp. 1045-1069
Author(s):  
Matija Vidmar

Abstract: For a spectrally negative self-similar Markov process on $[0,\infty)$ with an a.s. finite overall supremum, we provide, in tractable detail, a kind of conditional Wiener–Hopf factorization at the maximum of the absorption time at zero, the conditioning being on the overall supremum and the jump at the overall supremum. In a companion result the Laplace transform of this absorption time (on the event that the process does not go above a given level) is identified under no other assumptions (such as the process admitting a recurrent extension and/or hitting zero continuously), generalizing some existing results in the literature.


1999 ◽  
Vol 36 (01) ◽  
pp. 48-59 ◽  
Author(s):  
George V. Moustakides

Let $\xi_0,\xi_1,\xi_2,\dots$ be a homogeneous Markov process and let $S_n$ denote the partial sum $S_n = \theta(\xi_1) + \dots + \theta(\xi_n)$, where $\theta(\xi)$ is a scalar nonlinearity. If $N$ is a stopping time with $\mathbb{E}N < \infty$ and the Markov process satisfies certain ergodicity properties, we then show that $\mathbb{E}S_N = [\lim_{n\to\infty}\mathbb{E}\theta(\xi_n)]\,\mathbb{E}N + \mathbb{E}\omega(\xi_0) - \mathbb{E}\omega(\xi_N)$. The function $\omega(\xi)$ is a well-defined scalar nonlinearity directly related to $\theta(\xi)$ through a Poisson integral equation, with the characteristic that $\omega(\xi)$ vanishes in the i.i.d. case. Consequently our result constitutes an extension of Wald's first lemma to the case of Markov processes. We also show that, as $\mathbb{E}N \to \infty$, the correction term is negligible compared to $\mathbb{E}N$ in the sense that $\mathbb{E}\omega(\xi_0) - \mathbb{E}\omega(\xi_N) = o(\mathbb{E}N)$.
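For a finite-state chain the identity can be checked directly: solve the Poisson equation for ω by summing its geometrically convergent series, then compare both sides for a deterministic stopping time. The transition matrix, θ, and N below are illustrative assumptions, not from the paper:

```python
def mat_vec(P, v):
    """Apply a matrix to a column vector: (P v)_i."""
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

def vec_mat(mu, P):
    """Propagate a row vector (distribution): (mu P)_j."""
    n = len(mu)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

P = [[0.7, 0.2, 0.1],
     [0.3, 0.4, 0.3],
     [0.2, 0.3, 0.5]]
theta = [1.0, -2.0, 3.0]
mu0 = [1.0, 0.0, 0.0]          # start in state 0

# Stationary distribution by power iteration (the chain is ergodic).
pi = mu0[:]
for _ in range(500):
    pi = vec_mat(pi, P)
mean_theta = dot(pi, theta)    # lim E[theta(xi_n)]

# omega from the Poisson equation, via its convergent series
# omega = sum_{k>=1} P^k (theta - mean_theta * 1).
centered = [t - mean_theta for t in theta]
omega, term = [0.0, 0.0, 0.0], centered[:]
for _ in range(500):
    term = mat_vec(P, term)
    omega = [o + t for o, t in zip(omega, term)]

# Check E[S_N] = mean_theta * E[N] + E[omega(xi_0)] - E[omega(xi_N)]
# for the deterministic stopping time N = 10.
n, lhs, mu = 10, 0.0, mu0[:]
for _ in range(n):
    mu = vec_mat(mu, P)        # law of xi_k
    lhs += dot(mu, theta)      # accumulate E[theta(xi_k)]
rhs = mean_theta * n + dot(mu0, omega) - dot(mu, omega)
```

For a deterministic N the identity is a telescoping consequence of the Poisson equation; the content of the result above is that it persists for general stopping times with $\mathbb{E}N < \infty$.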


1970 ◽  
Vol 7 (2) ◽  
pp. 400-410 ◽  
Author(s):  
Tore Schweder

Many phenomena studied in the social sciences and elsewhere are complexes of more or less independent characteristics which develop simultaneously. Such phenomena may often be realistically described by time-continuous finite Markov processes. In order to define such a model which will take care of all the relevant a priori information, there ought to be a way of defining a Markov process as a vector of components representing the various characteristics constituting the phenomenon such that the dependences between the characteristics are represented by explicit requirements on the Markov process, preferably on its infinitesimal generator.
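In the finite-state, continuous-time setting, independence of the component characteristics is itself an explicit requirement on the generator: it is the Kronecker sum of the component generators, and dependence between characteristics would show up as deviations from this structure. A minimal sketch (the two component generators are illustrative):

```python
def kron_sum(Q1, Q2):
    """Generator of a Markov process whose two coordinates evolve
    independently with generators Q1 and Q2: the Kronecker sum
    Q = Q1 (x) I + I (x) Q2, acting on the product state space."""
    n, m = len(Q1), len(Q2)
    Q = [[0.0] * (n * m) for _ in range(n * m)]
    for i in range(n):
        for j in range(m):
            row = i * m + j
            for k in range(n):
                Q[row][k * m + j] += Q1[i][k]   # first coordinate jumps
            for l in range(m):
                Q[row][i * m + l] += Q2[j][l]   # second coordinate jumps
    return Q

# Two illustrative 2-state generators (rows sum to zero).
Q1 = [[-1.0, 1.0], [2.0, -2.0]]
Q2 = [[-0.5, 0.5], [3.0, -3.0]]
Q = kron_sum(Q1, Q2)   # 4x4 generator on the product space
```

Because the coordinates never jump simultaneously under this generator, allowing a characteristic's jump rates to depend on the other coordinates amounts to replacing the repeated `Q1`/`Q2` blocks with coordinate-dependent ones, which is exactly the kind of explicit requirement envisaged above.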


1993 ◽  
Vol 6 (4) ◽  
pp. 385-406 ◽  
Author(s):  
N. U. Ahmed ◽  
Xinhong Ding

We consider a nonlinear (in the sense of McKean) Markov process described by a stochastic differential equation in $\mathbb{R}^d$. We prove the existence and uniqueness of invariant measures for such a process.
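A standard way to approximate such a nonlinear process, and to probe its invariant measure numerically, is an interacting-particle Euler scheme in which the law of $X_t$ is replaced by the empirical measure of the particles. The sketch below uses an illustrative linear McKean SDE $dX_t = -(X_t - a\,\mathbb{E}[X_t])\,dt + \sigma\,dW_t$, not the equation studied by the authors:

```python
import random

def mckean_particles(n_particles=500, dt=0.02, n_steps=2000,
                     a=0.5, sigma=1.0, seed=1):
    """Euler scheme for the interacting-particle approximation of the
    illustrative McKean SDE dX = -(X - a*E[X]) dt + sigma dW.
    E[X_t] is replaced by the particles' empirical mean."""
    random.seed(seed)
    x = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    sq = dt ** 0.5
    for _ in range(n_steps):
        m = sum(x) / n_particles            # empirical mean ~ E[X_t]
        x = [xi - (xi - a * m) * dt + sigma * sq * random.gauss(0.0, 1.0)
             for xi in x]
    return x

x = mckean_particles()
mean = sum(x) / len(x)
var = sum((xi - mean) ** 2 for xi in x) / len(x)
```

For this linear example the invariant measure is Gaussian with mean 0 and variance $\sigma^2/2$, which the empirical moments of the particle system approximate (propagation of chaos makes the particles asymptotically i.i.d. draws from it).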


1966 ◽  
Vol 3 (1) ◽  
pp. 48-54 ◽  
Author(s):  
William F. Massy

Most empirical work on Markov processes for brand choice has been based on aggregative data. This article explores the validity of the crucial assumption that underlies such analyses, i.e., that all the families in the sample follow a Markov process with the same or similar transition probability matrices. The results show that there is a great deal of diversity among families’ switching processes, and that many of them are of zero rather than first order.
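The heterogeneity question can be probed family by family: estimate each family's own transition matrix and ask whether its rows differ, i.e. whether the previous brand carries any information. A crude sketch in which a simple tolerance rule stands in for Massy's formal statistical tests:

```python
def family_order(purchases, tol=0.1):
    """Crude check of whether a family's brand-switching sequence looks
    zero order (choice independent of the previous brand) or first order.
    Illustrative only -- a stand-in for a formal test of order."""
    brands = sorted(set(purchases))
    counts = {b: {c: 0 for c in brands} for b in brands}
    for prev, cur in zip(purchases, purchases[1:]):
        counts[prev][cur] += 1
    rows = []
    for b in brands:
        total = sum(counts[b].values())
        if total:
            rows.append([counts[b][c] / total for c in brands])
    # If all rows of the transition matrix are (nearly) equal, the
    # previous brand carries no information: zero order.
    spread = max(abs(r[k] - rows[0][k])
                 for r in rows for k in range(len(brands)))
    return "zero" if spread < tol else "first"

# Strict alternation is first order; "AABB" cycling gives (nearly)
# identical rows and so looks zero order.
strict = family_order(list("ABABABABAB"))
cycled = family_order(list("AABB" * 10))
```

Running such a per-family classifier over a panel, rather than pooling switches into one aggregate matrix, is precisely how the diversity reported above becomes visible.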

