Simple Queues
A queue generally has the form of a countable state-space Markov chain. Here we assume that customers are served in the order they arrive (FIFO – First In First Out). The standard Kendall notation is M/M/C/K: the Ms stand for Markov (or memoryless) arrival and service times respectively, and are replaced by G if the process admits more general distributions; C is the number of servers, and K the capacity of the system.
The first example is an M/M/C/C queue, motivated by a telephone network. Here there are c lines, and an arriving call is lost if all lines are busy at the arrival time. We assume arrivals form a PP($\lambda$) process, and the call times are iid Exp($\mu$) RVs. We record the number of busy lines, which is a continuous-time Markov chain on state-space $\{0,1,\ldots,c\}$. As usual, we assume the system operates in equilibrium, and so in particular the chain starts from its equilibrium distribution. It is easy to find the equilibrium distribution. The only non-zero transition rates are

$q(i,i+1)=\lambda \quad (0\leq i<c), \qquad q(i,i-1)=i\mu \quad (1\leq i\leq c),$

and so we can check that the distribution satisfying the DBEs is

$\pi_i \propto \frac{\nu^i}{i!}, \qquad \text{where } \nu=\frac{\lambda}{\mu},$

sometimes called the traffic intensity. This gives

$\pi_i=\frac{\nu^i/i!}{\sum_{j=0}^c \nu^j/j!}, \qquad 0\leq i\leq c,$

and in particular the probability that an arriving call finds all lines busy is $\pi_c$; we define this to be $E(\nu,c)$, Erlang's formula.
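As a quick numerical sanity check (not from the original discussion), here is a short Python sketch computing this equilibrium distribution and Erlang's formula directly from the expressions above; the function names are my own.

```python
from math import factorial

def mmcc_equilibrium(lam, mu, c):
    """Equilibrium distribution pi_0, ..., pi_c of the M/M/c/c loss queue."""
    nu = lam / mu                                  # traffic intensity
    weights = [nu ** i / factorial(i) for i in range(c + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def erlang_b(nu, c):
    """Erlang's formula E(nu, c): probability an arriving call is lost."""
    weights = [nu ** i / factorial(i) for i in range(c + 1)]
    return weights[-1] / sum(weights)

# Example: c = 10 lines, calls arrive at rate 8 and last Exp(1) times.
pi = mmcc_equilibrium(8.0, 1.0, 10)
print(pi[10], erlang_b(8.0, 10))                   # both equal the blocking probability
```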
Note that if $(X_t)$ is a MC in equilibrium, then the time-reversed process $(X_{-t})$ is a MC with the same ED, modulo style of discontinuities (ie whether transitions are left-continuous or right-continuous). A birth-death chain such as a queue length process satisfies the DBEs, so the reversed process in fact has the same law as the original. Therefore, in any queue where all customers get served, eg an M/M/1 queue, for which $\lambda<\mu$ (otherwise the MC is transient or null recurrent, so no ED!), the departure process is the same (in distribution) as the arrival process: departures of the original chain are precisely arrivals of the reversal.
We can check that this holds for a series of M/M/1 queues in tandem, and that in equilibrium the sizes of the queues are independent. This is essentially an extension of the observation that future arrivals at a given queue are independent of its present state, and likewise (by reversibility) past departures are independent of its present state; since the length of the next queue in the series depends only on these past departures, the independence follows almost immediately.
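A rough simulation sketch (my own, not from the post) illustrating the single-queue version of this: in a stable M/M/1 queue the inter-departure times should look like Exp($\lambda$) rather than Exp($\mu$). The parameter values below are arbitrary.

```python
import random

def mm1_departures(lam, mu, n_customers, seed=0):
    """Simulate a FIFO M/M/1 queue and return the customers' departure times."""
    rng = random.Random(seed)
    t_arr, prev_dep = 0.0, 0.0
    departures = []
    for _ in range(n_customers):
        t_arr += rng.expovariate(lam)              # next arrival of the PP(lam)
        start = max(t_arr, prev_dep)               # service starts once the server is free
        prev_dep = start + rng.expovariate(mu)     # Exp(mu) service time
        departures.append(prev_dep)
    return departures

deps = mm1_departures(lam=0.7, mu=1.0, n_customers=200_000)
gaps = [b - a for a, b in zip(deps[20_000:], deps[20_001:])]   # discard a burn-in period
mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
print(mean, var)   # for a PP(0.7) these should be close to 1/0.7 and 1/0.7**2
```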
Migration Processes
We consider a closed migration process on J colonies, with populations described by a vector $n=(n_1,\ldots,n_J)$. We say that the instantaneous rate of movement of an individual from colony j to colony k is $\lambda_{jk}\phi_j(n_j)$, for some functions $\phi_j$. We can describe the ED of this system in terms of the distribution obtained when a single individual performs a random walk through the state space, with $\phi_j$ taken to be 1 for each j. We call the equilibrium equations for this case,

$\alpha_j\sum_k\lambda_{jk}=\sum_k\alpha_k\lambda_{kj},$

the traffic equations, with solution $(\alpha_1,\ldots,\alpha_J)$. Then, by checking the PBEs, it is clear that the equilibrium distribution of the original migration process satisfies

$\pi(n)=B_N\prod_{j=1}^J\frac{\alpha_j^{n_j}}{\prod_{r=1}^{n_j}\phi_j(r)},$

for a normalising constant $B_N$. This is important, as it would have been computationally difficult to solve the original equations for the ED directly.
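To make the computational point concrete, here is a small Python sketch (my own, with hypothetical function names) that solves the traffic equations for the single-individual chain and then assembles the product-form ED for a toy closed migration process, rather than solving the full balance equations on the whole state space.

```python
import itertools
import numpy as np

def traffic_solution(lam):
    """Solve the traffic equations alpha_j sum_k lam[j][k] = sum_k alpha_k lam[k][j]."""
    J = len(lam)
    Q = np.array(lam, dtype=float)
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))            # generator of the single-individual chain
    A = np.vstack([Q.T, np.ones(J)])               # alpha Q = 0, normalised to sum to 1
    b = np.zeros(J + 1)
    b[-1] = 1.0
    alpha, *_ = np.linalg.lstsq(A, b, rcond=None)
    return alpha

def closed_migration_ed(lam, phi, N):
    """Product-form ED pi(n) ~ prod_j alpha_j^{n_j} / prod_{r<=n_j} phi_j(r), with |n| = N."""
    J = len(lam)
    alpha = traffic_solution(lam)
    def weight(n):
        w = 1.0
        for j, nj in enumerate(n):
            for r in range(1, nj + 1):
                w *= alpha[j] / phi[j](r)
        return w
    states = [n for n in itertools.product(range(N + 1), repeat=J) if sum(n) == N]
    Z = sum(weight(n) for n in states)
    return {n: weight(n) / Z for n in states}

# Toy example: 3 colonies, 2 individuals, single-server colonies (phi_j(r) = 1).
lam = [[0, 1, 2], [2, 0, 1], [1, 1, 0]]
phi = [lambda r: 1.0] * 3
print(closed_migration_ed(lam, phi, 2))
```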
The same result holds for an open migration process, where individuals can also enter and leave the system, arriving at colony j at rate $\nu_j$, and leaving it at rate $\mu_j\phi_j(n_j)$. The traffic equations become

$\alpha_j\Big(\mu_j+\sum_k\lambda_{jk}\Big)=\nu_j+\sum_k\alpha_k\lambda_{kj},$

and the ED factorises over colonies, with $\pi_j(n_j)\propto \alpha_j^{n_j}\big/\prod_{r=1}^{n_j}\phi_j(r)$. Note that this has the same form as if each colony were fed by a PP($\alpha_j$) arrival stream, with departures at rate $\phi_j(n_j)$. But (obviously) this interpretation is not equivalent to the model.
The time reversal is also an OMP. One has to check that the transition rates have the correct form, and it follows that the exit process from each colony (in equilibrium, naturally) is PP($\alpha_j\mu_j$). Most importantly, given a communicating class, we can think of the restriction to this class as an OMP, so in particular the processes of transitions between classes are Poisson.
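A minimal sketch (my own naming, assuming the open traffic equations written above) of solving those equations numerically; with single-server colonies ($\phi_j\equiv 1$) the colony sizes in equilibrium are then independent, with $P(n_j=n)=(1-\alpha_j)\alpha_j^n$ whenever $\alpha_j<1$.

```python
import numpy as np

def open_traffic_solution(lam, nu, mu):
    """Solve alpha_j (mu_j + sum_k lam[j][k]) = nu_j + sum_k alpha_k lam[k][j]."""
    L = np.array(lam, dtype=float)
    np.fill_diagonal(L, 0.0)
    A = np.diag(np.array(mu, dtype=float) + L.sum(axis=1)) - L.T
    return np.linalg.solve(A, np.array(nu, dtype=float))

# Toy example: colony 1 feeds colony 2; both have external arrivals and departures.
alpha = open_traffic_solution(lam=[[0, 1.0], [0.5, 0]], nu=[1.0, 0.3], mu=[0.8, 1.2])
print(alpha)   # with phi_j = 1, P(n_j = n) = (1 - alpha_j) * alpha_j**n if alpha_j < 1
```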
Little’s Law
Informally, the mean number of customers in the system should be equal to the arrival rate multiplied by the mean sojourn time of a customer in the system. This is easiest to formalise by taking expectations up to a regeneration time T, the first time the system returns to its original state (assumed to be 0 customers), which is an a.s. finite stopping time.
Set

$L=\frac{1}{\mathbb{E}T}\,\mathbb{E}\int_0^T X_s\,ds,$

the average number of customers, and

$W=\frac{1}{\mathbb{E}N}\,\mathbb{E}\sum_{n=1}^N W_n,$

where N is the number of customers arriving in [0,T], and $W_n$ is the waiting time of the n-th customer. The arrival rate is then $\lambda=\mathbb{E}N/\mathbb{E}T$.
Little's Law asserts that $L=\lambda W$. Note that for a Markov chain in equilibrium, we can define L more simply as $L=\mathbb{E}_\pi X_0$, the expected number of customers under the ED, and similarly for W.
It is most easily proved by computing the area between the arrivals process and the departures process in two ways: integrating over height and over width. Note that working up to a regeneration time is convenient because at that time the two processes are equal.
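Here is a hedged Python sketch (my own, with arbitrary parameters) that measures exactly these quantities for a simulated M/M/1 queue: the area under the queue-length path is accumulated once as a sum of sojourn times (width) and once as a time-average occupancy (height), so the two returned numbers, estimating $L$ and $\lambda W$, should nearly coincide.

```python
import random

def little_check(lam, mu, n_customers=200_000, seed=1):
    """Estimate L and lam*W for a FIFO M/M/1 queue; Little's Law says they agree."""
    rng = random.Random(seed)
    t_arr, prev_dep = 0.0, 0.0
    arrivals, departures = [], []
    for _ in range(n_customers):
        t_arr += rng.expovariate(lam)
        start = max(t_arr, prev_dep)               # FIFO single server
        prev_dep = start + rng.expovariate(mu)
        arrivals.append(t_arr)
        departures.append(prev_dep)
    T = departures[-1]                             # observation window [0, T]
    sojourns = [d - a for a, d in zip(arrivals, departures)]
    W = sum(sojourns) / len(sojourns)              # mean sojourn time (width slices)
    L = sum(sojourns) / T                          # time-average queue length (height slices)
    rate = len(arrivals) / T                       # empirical arrival rate
    return L, rate * W

print(little_check(0.7, 1.0))                      # the two numbers should nearly agree
```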
The migration processes above are said to be linear if $\phi_j(n_j)=n_j$. In this case each individual moves independently as a Markov chain, and so, for an open linear migration process in equilibrium, even with a more general state space, the individuals present form a Poisson random measure: the numbers of individuals in disjoint subsets of the state-space are independent Poisson random variables.
Often though, we start with no individuals in the system, but still the distribution at time t is given by a (time-inhomogeneous) Poisson random measure. The mean is specified by

$\mathbb{E}\,M_t(E)=\int_0^t \nu\, p_u(E)\,du,$

where $\nu$ is the net arrival rate, and $p_u(E)$ is the probability that an individual is in E a time u after arriving.
As one would suspect, this is easiest to check through generating functions, since independence has a straightforward generating function analogue, and the generating function of a Poisson RV has a manageable form.
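As an illustration (my own, not from the post), consider the simplest linear case: a single colony with PP($\nu$) arrivals and Exp($\mu$) lifetimes (an M/M/$\infty$-type system) started empty. The Python sketch below compares the closed form $\frac{\nu}{\mu}(1-e^{-\mu t})$ with a numerical integral of the mean measure and with a direct simulation.

```python
import math
import random

def mean_via_integral(nu, mu, t, steps=10_000):
    """Numerically integrate nu * P(still present u after arriving) over [0, t]."""
    du = t / steps
    return sum(nu * math.exp(-mu * (u + 0.5) * du) for u in range(steps)) * du

def mean_via_simulation(nu, mu, t, reps=20_000, seed=2):
    """Monte Carlo: count individuals still present at time t, starting from empty."""
    rng = random.Random(seed)
    total = 0
    for _ in range(reps):
        s = 0.0
        while True:
            s += rng.expovariate(nu)               # arrival times of a PP(nu)
            if s > t:
                break
            if s + rng.expovariate(mu) > t:        # Exp(mu) lifetime reaches time t?
                total += 1
    return total / reps

nu, mu, t = 3.0, 1.0, 2.0
print((nu / mu) * (1 - math.exp(-mu * t)))         # closed form for the Poisson mean
print(mean_via_integral(nu, mu, t))
print(mean_via_simulation(nu, mu, t))
```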
Related articles
- One queue or many queues? (eventuallyalmosteverywhere.wordpress.com)