Delayed Connectivity in Random Graphs

Aside

I presented a poster at the Oxford SIAM Student Chapter Conference on Friday. It was nice to win the prize for best poster, but mainly I enjoyed putting it together. For once it meant ignoring the technical details and anything requiring displayed formulae, and focusing only on aspects that could be conveyed with bullet points and images. Anyway, this is what I came up with. The real thing is sitting safely in a tube in my office, ready for the next time it is needed in a hurry!

Delayed Connectivity Poster

DBEs and stationary distributions

Aside

The most recent Applied Probability assignment sheet featured various aspects of Detailed Balance Equations for continuous-time Markov chains. We discussed the advantages and disadvantages of using DBEs rather than solving for an equilibrium distribution directly. The equations used in this second case are often called Full Balance Equations.
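
To fix notation (a reminder rather than anything quoted from the sheet): for a chain with Q-matrix (q_{ij}), a distribution \pi satisfies the Detailed Balance Equations if

\pi_i q_{ij}=\pi_j q_{ji}\quad\text{for all }i\neq j,

while the Full Balance Equations are just the components of \pi Q=0, that is

\sum_i \pi_i q_{ij}=0\quad\text{for all }j.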

Briefly, the advantage of DBEs is that they are easy to solve. After all, each equation involves only two components of the equilibrium distribution, so generally you can solve them one at a time. The disadvantage is that an equilibrium distribution might not satisfy the DBEs. The deductive structure is:

\text{Solves DBEs}\quad \stackrel{\Rightarrow}{\not\Leftarrow}\quad\text{Equilibrium distribution}

Usually, the chain will be irreducible, so the equilibrium distribution is unique. This means that if we can solve the DBEs, the result is the unique equilibrium distribution.
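
To illustrate the ‘solve one at a time’ point with a toy example of my own (not one from the sheet): for a birth-death chain with birth rates \lambda_i and death rates \mu_i, the only non-trivial DBEs link neighbouring states, giving

\pi_i\lambda_i=\pi_{i+1}\mu_{i+1}\quad\Rightarrow\quad \pi_{i+1}=\frac{\lambda_i}{\mu_{i+1}}\pi_i=\pi_0\prod_{k=0}^{i}\frac{\lambda_k}{\mu_{k+1}},

so every component is determined by \pi_0, which is then fixed by normalisation.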

The DBEs are soluble only if the chain is reversible. In practice, solubility of the DBEs is probably the best working definition of reversibility, but informally it means that the behaviour looks qualitatively the same if we reverse time. For example, consider the chain from Q1, with Q-matrix

Q=\begin{pmatrix}-1 &1&0\\ 0& -1&1\\1&0&-1\end{pmatrix}.

This has equilibrium distribution (\frac13,\frac13,\frac13), which does not satisfy the DBEs. The chain is not reversible because sample paths always go clockwise, so if we reversed time they would go anti-clockwise (or vice versa, depending on how you’ve drawn the diagram).
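
Concretely, checking the detailed balance equation between states 1 and 2 (with states labelled 1, 2, 3 in the order of the matrix):

\pi_1 q_{12}=\tfrac13\cdot 1=\tfrac13\neq 0=\tfrac13\cdot 0=\pi_2 q_{21},

so this equilibrium distribution fails the DBEs; indeed no distribution with positive mass on state 1 can satisfy them.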

What I wanted to say in the class, and made a mess of explaining, was this, concerning why it is inappropriate to use DBEs to find stationary distributions in Q3d):

Reversibility is not just a function of the chain. It is a function of the chain AND the initial distribution. In practice this only becomes a concern when the chain is reducible, but in that case it really can lead you astray. Let’s consider an example, such as

Q=\begin{pmatrix}-3&2&0&0&1&0\\ 0&-4&3&1&0&0\\ 0&1&-4&3&0&0\\ 0&3&1&-4&0&0\\ 0&0&0&0&-5&5\\ 0&0&0&0&5&-5\end{pmatrix}.

Then by solving as in the problem sheet, the invariant distributions are given by:

\lambda(0,\frac13,\frac13,\frac13,0,0)+\mu(0,0,0,0,\frac12,\frac12),\quad \lambda,\mu\geq 0,\ \lambda+\mu=1.

If you attempted to solve the DBEs, you would succeed, but the only solution would be

(0,0,0,0,\frac12,\frac12).

The explanation is fairly simple in the end. Reversibility is a class property, and only one of the communicating classes, \{5,6\} in this example, admits a reversible distribution, so to solve the DBEs we must place zero mass everywhere outside that class.
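
For anyone who wants to check this numerically, here is a quick sketch (my own, in Python with NumPy; not part of the sheet) verifying both claims for the matrix above:

import numpy as np

Q = np.array([[-3,  2,  0,  0,  1,  0],
              [ 0, -4,  3,  1,  0,  0],
              [ 0,  1, -4,  3,  0,  0],
              [ 0,  3,  1, -4,  0,  0],
              [ 0,  0,  0,  0, -5,  5],
              [ 0,  0,  0,  0,  5, -5]], dtype=float)

pi_a = np.array([0, 1/3, 1/3, 1/3, 0, 0])   # supported on the class {2,3,4}
pi_b = np.array([0, 0, 0, 0, 1/2, 1/2])     # supported on the class {5,6}

# Both are invariant: pi Q = 0.
print(np.allclose(pi_a @ Q, 0), np.allclose(pi_b @ Q, 0))   # True True

# rank(Q) = 4, so the space of invariant measures has dimension 6 - 4 = 2,
# one dimension per recurrent class.
print(np.linalg.matrix_rank(Q))                             # 4

def satisfies_dbe(pi, Q, tol=1e-9):
    """Check pi_i * q_ij == pi_j * q_ji for every pair i != j."""
    n = len(pi)
    return all(abs(pi[i] * Q[i, j] - pi[j] * Q[j, i]) < tol
               for i in range(n) for j in range(n) if i != j)

print(satisfies_dbe(pi_a, Q))   # False: {2,3,4} has a cyclic bias, so not reversible
print(satisfies_dbe(pi_b, Q))   # True: only {5,6} carries a reversible distribution

The computation only confirms the class-property argument above: any solution of the DBEs is forced to look like the second vector.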

Anyway, I hope that clears up any residual confusion from the class.

An obvious remark

Aside

An obvious remark:

If a sequence of independent random variables X_n converges almost surely to some limit X, this limit must be a constant (almost surely).

I’ve been thinking about the Central Limit Theorem and related Large Deviations results this afternoon, and wasted almost an hour worrying about situations which were effectively well-disguised special cases of the above.

Why is it true? Well, suppose each X_n is \mathcal{F}_n-measurable. By independence, we might as well take \mathcal{F}_n=\sigma(X_n). Now the limit X is (up to a null set) determined by (X_m)_{m>n} for every n, so it is independent of \sigma(X_1,\ldots,X_n) for every n, and hence independent of \mathcal{F}:=\sigma\left(\bigcup_n\mathcal{F}_n\right)\supseteq\sigma(X). If X is independent of itself, it must be almost surely constant.
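
To spell out the final step (a standard argument, recorded here for completeness): if X is independent of itself, then for every x

\mathbb{P}(X\le x)=\mathbb{P}(X\le x,\,X\le x)=\mathbb{P}(X\le x)^2\in\{0,1\},

so the distribution function of X takes only the values 0 and 1, and X is almost surely equal to the constant \inf\{x:\mathbb{P}(X\le x)=1\}.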

Advanced Probability Revision Summary

Aside

My plan to write a post a day about a likely piece of Advanced Probability bookwork has been more useful than even I anticipated. It’s surprising how often you come to write something down and realise you don’t understand half the steps. Anyway, here’s a list, in approximately chronological order through the course, of what I’ve covered:

Missing in Action?

Aside

Not much maths has been appearing here in the past few weeks. But I have been working…

As part of Part III, instead of sitting an extra exam paper I am writing an essay. I have chosen the topic of ‘Multiplicative Coalescence’ so have been hard at work reading various papers and articles. I’ve been writing some posts about the topic as practice for writing up the essay – in fact, it’s entirely possible that large chunks will end up featuring verbatim. As a result, to ensure I stay firmly on the correct side of the rules about plagiarism, I’m going to wait until after exam results are announced on June 20th before making these posts visible.