Markov Property
We go way back to the Part 1B short course Markov Chains. In the first lecture of that course, we met discrete time Markov chains. A definition was given, in terms of conditional single-period transition probabilities, and it was immediately shown that general transition probabilities are given by sums of products of entries in a so-called transition matrix. This is very useful for performing calculations. But the question will inevitably be asked: “Prove that this is a Markov process.” And the answer “because it obviously is” isn’t good enough.
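For concreteness, the calculation in question is the standard one: if $P=(p_{ij})$ is the transition matrix, then iterating the one-step definition shows that n-step transition probabilities are entries of matrix powers,

$\mathbb{P}(X_n=j\,|\,X_0=i)=\sum_{k_1,\dots,k_{n-1}}p_{ik_1}p_{k_1k_2}\cdots p_{k_{n-1}j}=(P^n)_{ij}$.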
The point is that all of the above is specific to that setting, whereas the ideal definition of a Markov process is something like the very general statement:
Conditional on the present, the past and the future are independent.
This opens up the possibility of a large class of processes being Markov processes. A technical definition would be that for $s<t$ and A a measurable subset of the state space,

$\mathbb{P}(X_t\in A\,|\,\mathcal{F}_s)=\mathbb{P}(X_t\in A\,|\,X_s)$.
It is easy to check that this is equivalent to the original definition in the discrete-time context.
Strong Markov Property
The SMP states that given a stopping time T, conditional on the event $\{T<\infty\}$:

$(X_{T+t}-X_T)_{t\geq 0}\stackrel{d}{=}(X_t-X_0)_{t\geq 0}$,

that is, the process restarted at time T has the same distribution as the original process started from 0 (in space as well as time). Furthermore, it is independent of $\mathcal{F}_T$, which requires a technical definition, but which informally is the sigma-field of events defined by what happens up to time T.
For a discrete time Markov chain, prove the SMP by pre-multiplying by the indicator function $1_{\{T=n\}}$, which reduces the SMP to the ordinary Markov property. Then take the countable sum over n (which is permissible) to get the SMP. For Brownian Motion in one dimension, make dyadic approximations to the stopping time from above. The SMP applies to these approximations, and measure-theoretic machinery and the (almost sure) continuity of paths allow the equality of distributions to pass to the limit. Independence follows by expressing $\mathcal{F}_T$ as the intersection of the sigma-fields corresponding to the approximations.
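To flesh out the discrete-time argument, here is a sketch. Write $g(x):=\mathbb{E}_x[f(X_0,X_1,\dots)]$ for a bounded measurable function f of the trajectory, and take $A\in\mathcal{F}_T$. Since $A\cap\{T=n\}\in\mathcal{F}_n$, the ordinary Markov property at each fixed time n gives

$\mathbb{E}\left[1_{A\cap\{T=n\}}f(X_n,X_{n+1},\dots)\right]=\mathbb{E}\left[1_{A\cap\{T=n\}}\,g(X_n)\right]$,

and summing over n (permissible, as everything is bounded):

$\mathbb{E}\left[1_Af(X_T,X_{T+1},\dots)\right]=\mathbb{E}\left[1_A\,g(X_T)\right]$ on $\{T<\infty\}$,

which is exactly the statement of the SMP.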
In both of these cases, an implicit countable substructure (discrete time, and the dyadic rationals together with path continuity, respectively) has been required to deduce the SMP. This suggests that there are Markov processes which do not have the SMP.
Motivating Counterexample
Take B to be a Brownian Motion in one dimension, and $X_0$ a random variable independent of B with $0<\mathbb{P}(X_0=0)<1$, so that its distribution has an atom at 0. Now define the process:

$X_t=X_0+B_t1_{\{X_0\neq 0\}}$,

so X is frozen at 0 if $X_0=0$, and is a Brownian Motion started from $X_0$ otherwise.
Then X is certainly not strong Markov, as we see by considering T, the hitting time of 0. The process restarted at T is either identically 0 or a standard BM, but which of the two holds is determined by time-0 information (whether $X_0=0$) rather than time-T information.
But X is Markov. Take $s<t$ and A Borel, and write $\theta_r(y):=\mathbb{P}(y+B_r\in A)$. Then:

$\mathbb{P}(X_t\in A\,|\,\mathcal{F}_s)=1_{\{X_0=0\}}1_{\{0\in A\}}+1_{\{X_0\neq 0\}}\theta_{t-s}(X_s)$.

Now $\{X_0=0\}=\{X_s=0\}$ a.s., so

$\mathbb{P}(X_t\in A\,|\,\mathcal{F}_s)=1_{\{X_s=0\}}1_{\{0\in A\}}+1_{\{X_s\neq 0\}}\theta_{t-s}(X_s)$,

and the right-hand side is a function of $X_s$ alone, which is precisely the Markov property.
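The almost sure equality $\{X_0=0\}=\{X_s=0\}$ used above deserves a line of justification (for fixed $s>0$): $B_s$ is independent of $X_0$ and has a density, so

$\mathbb{P}(X_0\neq 0,\ X_s=0)=\mathbb{E}\left[1_{\{X_0\neq 0\}}\,\mathbb{P}(B_s=-X_0\,|\,X_0)\right]=0$,

while on $\{X_0=0\}$ we have $X_s=0$ by construction.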
Feller Property
In general, it is hard to verify the Strong Markov Property for a given stochastic process. Instead, we consider a property which is stronger, but easier to check.
Given a Markov process $X(x,t)$ with initial distribution x, X has the Feller property if for all t,

whenever $x_n\to x$ (weakly) then

$X(x_n,t)\stackrel{d}{\to}X(x,t)$.
In other words, the distribution of the process at time t depends continuously on the initial distribution, under a suitable topology. We also demand that

$X(x,t)\stackrel{\mathbb{P}}{\to}x$ as $t\downarrow 0$,

which means that, starting from any point, paths are right-continuous at 0; indeed a Feller process admits a cadlag version. (Remember that convergence in distribution to a constant is equivalent to convergence in probability.)
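For the parenthetical fact, a one-line check: if $Y_t\stackrel{d}{\to}c$ for a constant c, then for any $\epsilon>0$,

$\mathbb{P}(|Y_t-c|>\epsilon)\leq\mathbb{P}(Y_t\leq c-\epsilon)+\mathbb{P}(Y_t\geq c+\epsilon)\to 0$,

since both sets are closed and carry no mass under the limit distribution $\delta_c$ (portmanteau lemma).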
Examples
1) The discrete time, finite state space Markov chain as initially discussed has the Feller property. Time-m distributions are given by repeated applications of the transition matrix:

$\lambda_m=\lambda_0P^m$.

But $\lambda_0\mapsto\lambda_0P^m$ is a finite-dimensional linear map, and so is certainly continuous.
2) Brownian Motion from a general starting point can be described as the sum of a standard BM and an independent random variable describing the starting point. The standard BM drops out of the criterion for the Feller property, since convergence in distribution is preserved under adding an independent term, and the remaining RV is time-independent, so has the property.
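To spell out example 2 a little: with $X(x,t)=x+B_t$ for a standard BM B, for bounded continuous f we have

$\mathbb{E}f(x+B_t)=\int_{\mathbb{R}}f(x+y)\frac{1}{\sqrt{2\pi t}}e^{-y^2/2t}\,dy$,

which is continuous in x by bounded convergence; and $\mathbb{P}(|B_t|>\epsilon)\leq t/\epsilon^2\to 0$ as $t\downarrow 0$ by Chebyshev, giving the second requirement.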
Feller implies Strong Markov
Define the transition operator:

$P_tf(x):=\mathbb{E}\left[f(X_t)\,|\,X_0=x\right]$.

The Feller property says that for all bounded continuous f, this is continuous as a function of x, including when x is replaced by a random variable. It must also be right-continuous as a function of t at t=0. We want to prove that for Z bounded and $\mathcal{F}_T$-measurable:

$\mathbb{E}\left[Zf(X_{T+t})\right]=\mathbb{E}\left[ZP_tf(X_T)\right]$.
This holds for $T_n$, the dyadic approximations to T from above, by the discrete-time SMP result. Then we want:

$\mathbb{E}\left[Zf(X_{T+t})\right]=\lim_n\mathbb{E}\left[Zf(X_{T_n+t})\right]=\lim_n\mathbb{E}\left[ZP_tf(X_{T_n})\right]=\mathbb{E}\left[ZP_tf(X_T)\right]$.
The first equality follows by right-continuity of paths and bounded convergence, the middle equality is the discrete-time result applied to each $T_n$, and the final equality follows by right-continuity and the Feller property.
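In more detail, for the first equality: $T_n+t\downarrow T+t$, so right-continuity of paths gives $f(X_{T_n+t})\to f(X_{T+t})$ pointwise for continuous f, and hence

$\left|\mathbb{E}\left[Zf(X_{T_n+t})\right]-\mathbb{E}\left[Zf(X_{T+t})\right]\right|\leq\|Z\|_\infty\,\mathbb{E}\left|f(X_{T_n+t})-f(X_{T+t})\right|\to 0$

by bounded convergence. The same two ingredients, with $P_tf$ continuous by the Feller property, handle the final equality.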
Feller = Strong Markov?
A final remark worth making is that the Feller property is genuinely stronger than the SMP. Consider a deterministic process defined for times $t\geq 0$, with some initial distribution for $X_0$ and evolution given by:

$X_t=X_0-t$ when $X_0<0$,

and

$X_t=X_0$ otherwise.
This is not Feller, because if the initial state moves from being fixed and negative to fixed at zero, the (deterministic) distribution at time 1 has a discontinuity. But it is strong Markov: the sign of the current state determines the entire future evolution, and this is equally true at stopping times.
However, it is reasonable to say that this example is quite contrived. In general, we aim to prove a process is Feller in order to use the Strong Markov Property.
References
Dynkin, Yushkevich – Strong Markov Processes (1955)
Breiman – Probability (Chapter 16)
Pitman – Notes, in particular the Markov but not Strong Markov counterexample.
http://almostsure.wordpress.com/2010/07/19/properties-of-feller-processes/