**Markov Property**

We go way back to the Part 1B short course *Markov Chains*. In the first lecture of this course, we met discrete-time Markov chains. A definition was given in terms of conditional single-period transition probabilities, and it was immediately proved that general transition probabilities are specified by sums of products of entries of a so-called transition matrix. This proves very useful for performing calculations. But the question will inevitably be asked: “Prove that this is a Markov process.” And the answer “because it obviously is” isn’t good enough.
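The “sums of products of entries” computation is exactly a matrix power. As a minimal numerical sketch (the two-state matrix `P` below is an arbitrary toy example of my own, not one from the course):

```python
import numpy as np

# One-step transition matrix of a toy two-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The n-step transition probabilities P(X_n = j | X_0 = i) are the
# entries of the n-th matrix power of P: sums of products of entries.
P3 = np.linalg.matrix_power(P, 3)

# Rows of a stochastic matrix sum to 1, and so do rows of its powers.
assert np.allclose(P3.sum(axis=1), 1.0)
print(P3)
```

Here `P3[i, j]` is the probability of going from state `i` to state `j` in exactly three steps.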

The point is that all of the above is relevant to the setting, but the ideal definition of a Markov process is something like the very general statement:

Conditional on the present, the past and the future are independent.

This opens up the possibility of a large class of processes being Markov processes. A technical definition would be that for *s<t* and *A* a measurable subset of the state space:

$$\mathbb{P}(X_t\in A\,|\,\mathcal{F}_s)=\mathbb{P}(X_t\in A\,|\,\sigma(X_s)).$$

It is easy to check that this is equivalent to the original definition in that context.

**Strong Markov Property**

SMP states that given a stopping time *T*, conditional on the event $\{T<\infty\}$:

$$(X_{T+t}-X_T,\ t\geq 0)\ \stackrel{d}{=}\ (X_t-X_0,\ t\geq 0),$$

that is, the process started at time *T* has the same distribution as the original process started from 0 (in space as well as time). Furthermore, it is independent of $\mathcal{F}_T$, which requires a technical definition, but which informally is the sigma-field of events determined by what happens up to time *T*.

For a discrete time Markov chain, prove SMP by pre-multiplying by the indicator function $\mathbf{1}_{\{T=n\}}$, which reduces SMP to the normal Markov property. Then take the countable sum over *n* (which is permissible) to get SMP. For Brownian Motion in one dimension, make dyadic approximations to the stopping time from above. SMP applies to these approximations, and measure-theoretic machinery and the (almost sure) continuity of paths allow the equality of distributions to hold in the limit. Independence follows by expressing $\mathcal{F}_T$ as the intersection of the sigma-fields corresponding to the approximations.
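The dyadic approximation step can be made concrete. The helper below (name and sample value are my own, purely illustrative) computes the standard level-$n$ discretisation $T_n=\lceil 2^nT\rceil/2^n$, the smallest dyadic rational of level $n$ that is at least $T$:

```python
import math

def dyadic_approximation(T, n):
    """Smallest dyadic rational of level n that is >= T."""
    return math.ceil(T * 2**n) / 2**n

T = 0.7133  # an illustrative sample value of the stopping time
approxs = [dyadic_approximation(T, n) for n in range(1, 12)]

# T_n decreases to T as n grows, and each T_n is itself a stopping time:
# {T_n = k/2^n} = {(k-1)/2^n < T <= k/2^n} is F_{k/2^n}-measurable.
assert all(a >= T for a in approxs)
assert all(a >= b for a, b in zip(approxs, approxs[1:]))
```

Since each $T_n$ takes countably many values, the discrete-chain argument applies to it, and path continuity carries the conclusion through the limit $T_n \downarrow T$.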

In both of these cases, an implicit countable substructure (discrete time and path continuity respectively) has been required to deduce SMP. This suggests that there are Markov processes which do not have SMP.

**Motivating Counterexample**

Take *B* to be a Brownian Motion in one dimension started from 0, and *Y* an independent random variable whose distribution contains 0 in its support. Now define the process:

$$X_t:=\begin{cases}Y+B_t & Y\neq 0,\\ 0 & Y=0.\end{cases}$$

Then *X* is certainly not Strong Markov, by considering *T*, the hitting time of 0. The process restarted at time *T* is either identically 0 or a standard BM, but which of the two holds is determined by time-0 information (whether *Y*=0) rather than by time-*T* information.

But *X* is Markov. Take *s<t* and *A* Borel. Then:

$$\mathbb{P}(X_t\in A\,|\,\mathcal{F}_s)=\mathbf{1}_{\{0\in A\}}\mathbf{1}_{\{Y=0\}}+\mathbb{P}(B_{t-s}+x\in A)\big|_{x=X_s}\mathbf{1}_{\{Y\neq 0\}}.$$

Now $\{Y=0\}=\{X_s=0\}$ a.s., so

$$\mathbb{P}(X_t\in A\,|\,\mathcal{F}_s)=\mathbf{1}_{\{0\in A\}}\mathbf{1}_{\{X_s=0\}}+\mathbb{P}(B_{t-s}+x\in A)\big|_{x=X_s}\mathbf{1}_{\{X_s\neq 0\}}=\mathbb{P}(X_t\in A\,|\,X_s),$$

which is precisely the Markov property.
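The failure of SMP at the hitting time of 0 can also be seen in simulation. The sketch below is my own illustration, assuming *Y* uniform on {0, 1} and a Gaussian-increment Euler discretisation of BM (function names are hypothetical):

```python
import random

def sample_path(rng, steps=20_000, dt=1e-3):
    """One discretised path of X: frozen at 0 if Y = 0, else BM from Y = 1."""
    Y = rng.choice([0.0, 1.0])
    if Y == 0.0:
        return Y, [0.0] * (steps + 1)
    path = [Y]
    for _ in range(steps):
        path.append(path[-1] + rng.gauss(0.0, dt ** 0.5))
    return Y, path

def moves_after_hitting_zero(path):
    """Does the path move again after first reaching level <= 0?
    Returns None if it never reaches 0 in the finite window."""
    hit = next((i for i, x in enumerate(path) if x <= 0.0), None)
    if hit is None:
        return None
    return any(abs(x - path[hit]) > 1e-12 for x in path[hit:])

for seed in range(10):
    rng = random.Random(seed)
    Y, path = sample_path(rng)
    moved = moves_after_hitting_zero(path)
    # After the hitting time of 0, the path is frozen exactly when Y = 0
    # and keeps diffusing when Y = 1: the future is decided by time-0
    # information, not by the state X_T = 0, so SMP fails at T.
    if Y == 0.0:
        assert moved is False
    elif moved is not None:
        assert moved is True
```

Both kinds of path are at 0 at the hitting time, yet have different futures, which is exactly what SMP forbids.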

**Feller Property**

In general, it is hard to verify the Strong Markov Property for a given stochastic process. Instead, we consider a property which is stronger, but easier to check.