A transition matrix $P$ is said to be doubly stochastic if the sum over each column equals one, that is, $\sum_i P_{ij}=1\ \forall j$ (the rows already sum to one, since $P$ is a transition matrix). If such a chain is irreducible and aperiodic and consists of $M+1$ states $0,1,\dots,M$, show that the limiting probabilities are given by $$\pi_j=\frac{1}{M+1},\quad j=0,1,\dots,M.$$
I have no idea how to prove it, but here is what I know:
If a chain is irreducible then all states communicate, i.e., for every pair $i,j$ there exist $n,m\ge 0$ such that $$P^{(n)}_{ij}>0\ \text{and}\ P^{(m)}_{ji}>0.$$
If $d(i)$ denotes the period of state $i$ and the chain is irreducible and aperiodic, then $d(i)=1\ \forall i$.
If $P$ is an $(M+1)\times(M+1)$ matrix and $\pi$ is the stationary distribution, then $$\pi_j=\sum_i \pi_i P_{ij},\quad j=0,1,\dots,M,$$
but how can I get the claimed expression from this?
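As a quick numerical sanity check of the claim (the $3\times 3$ matrix below is made up for illustration), one can raise a doubly stochastic $P$ to a high power and watch the rows converge to the uniform distribution:

```python
import numpy as np

# A hypothetical 3x3 doubly stochastic transition matrix (M = 2):
# every row and every column sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])

assert np.allclose(P.sum(axis=0), 1) and np.allclose(P.sum(axis=1), 1)

# All entries are positive, so the chain is irreducible and aperiodic;
# P^n then converges to a matrix whose rows all equal the limiting
# distribution pi.
Pn = np.linalg.matrix_power(P, 50)
print(Pn[0])  # -> approximately [1/3, 1/3, 1/3], i.e. 1/(M+1)
```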
2 Answers
Proof:
We first must note that $\pi_j$ is the unique solution to $\pi_j=\sum \limits_{i=0}^{M} \pi_i P_{ij}$ and $\sum \limits_{i=0}^{M}\pi_i=1$ (uniqueness holds because the chain is finite, irreducible, and aperiodic).
Try $\pi_i=1$ for all $i$. From the doubly stochastic nature of the matrix, we have $$\pi_j=\sum_{i=0}^M \pi_iP_{ij}=\sum_{i=0}^M P_{ij}=1.$$ Hence $\pi_i=1$ is a valid solution to the first set of equations, and to make it satisfy the normalization $\sum_{i=0}^M \pi_i=1$ we must divide by $M+1$.
Then by uniqueness as mentioned above, $\pi_j=\dfrac{1}{M+1}$. $$ \blacksquare$$
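A minimal sketch of this verification in code, assuming we generate an arbitrary doubly stochastic matrix by Sinkhorn-style alternating normalization (the construction is illustrative, not part of the proof):

```python
import numpy as np

# Sketch: check that the uniform vector solves pi = pi P for an
# arbitrary doubly stochastic matrix.
rng = np.random.default_rng(0)
M = 3  # states 0, 1, ..., M
A = rng.random((M + 1, M + 1))
# Alternately normalize rows and columns so A becomes (approximately)
# doubly stochastic.
for _ in range(1000):
    A /= A.sum(axis=1, keepdims=True)  # rows sum to 1
    A /= A.sum(axis=0, keepdims=True)  # columns sum to 1

pi = np.full(M + 1, 1 / (M + 1))
assert np.allclose(pi @ A, pi)  # pi P = pi: uniform is stationary
```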
Note: To understand this proof, one must recall the definition of a stationary distribution.
A vector $\mathbf{\pi}$ is called a stationary distribution vector of a Markov process if the elements of $\mathbf{\pi}$ satisfy: $$ \mathbf{\pi} = \mathbf{\pi} \cdot \mathbf{P}, \quad \sum_{i \in S} \pi_{i} = 1 \text{ , and } \pi_{i} \geq 0\text{ }\forall \text{ } i \in S. $$ Note that a stationary distribution may not exist, and may not be unique.
We first must note that $\pi_j$ is the unique solution to $\pi_j=\sum \limits_{i=0}^M \pi_i P_{ij}$ and $\sum \limits_{i=0}^M\pi_i=1$ ...$(\star)$
Again, since doubly stochastic, $\sum \limits_{i=0}^M P_{ij}=1$.
From $\pi_j=\sum \limits_{i=0}^M \pi_i P_{ij}$ and $\sum \limits_{i=0}^M P_{ij}=1$, subtracting the first from the second gives $1-\pi_j=\sum \limits_{i=0}^M (1-\pi_i) P_{ij}$, or, dividing by $M$, $\dfrac{1-\pi_j}{M}=\sum \limits_{i=0}^M \dfrac{(1-\pi_i)}{M} P_{ij}$. Moreover $\sum \limits_{i=0}^M \dfrac{1-\pi_i}{M}=\dfrac{(M+1)-1}{M}=1$, so this new vector also satisfies the normalization. This implies $\Big( \dfrac{1-\pi_0^*}{M},\dfrac{1-\pi_1^*}{M},\dots,\dfrac{1-\pi_M^*}{M}\Big)$ is also a solution of $(\star)$ whenever $(\pi_0^*,\pi_1^*,\dots, \pi_M^*)$ is. Because of uniqueness, we get $$\dfrac{1-\pi_j}{M}=\pi_j\Rightarrow \pi_j=\dfrac{1}{M+1},\quad\forall j.$$
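The uniqueness appealed to here can also be checked numerically: solving the stationary equations $\pi = \pi P$, $\sum_i \pi_i = 1$ as a linear system recovers the uniform distribution (the example matrix below is made up; its chain is irreducible and aperiodic since it admits cycles of lengths 2 and 3):

```python
import numpy as np

# Sketch: solve the stationary equations pi = pi P, sum(pi) = 1
# directly as a linear system for a small doubly stochastic example.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])  # doubly stochastic, M = 2

n = P.shape[0]
# Stack (P^T - I) pi = 0 with the normalization row sum(pi) = 1 and
# solve the overdetermined system by least squares.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # -> approximately [1/3, 1/3, 1/3]
```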