What is the mutual information $I(X;X)$?


Let $X$ be a random variable with a normal distribution, and let $Y=X$. What is the mutual information $I(X;Y)$?

I guess that $h(Y|X)=0$, since when $X$ is known, $Y$ is completely known, so $$I(X;Y)=h(Y)-h(Y|X)=h(Y)=\frac{1}{2}\log 2\pi e\sigma^2$$ nats.

But I was told I was wrong, and a numerical computation also shows that $$I(X;Y) \neq \frac{1}{2}\log 2\pi e\sigma^2.$$ Where is my mistake? Please help me out of this problem, thanks a lot! (Please note that $X$ and $Y$ are both continuous.)
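One way to see what such a numerical computation is doing (a minimal sketch, not from the original post; the helper `mi_binned`, the sample size, and the bin counts are all my own choices): estimate $I(X;Y)$ with a histogram plug-in estimator. Because $Y=X$, all the joint mass sits on the diagonal, and the estimate keeps growing as the bins get finer instead of settling at $\frac{1}{2}\log 2\pi e\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=100_000)
y = x  # Y = X exactly

def mi_binned(x, y, bins):
    """Plug-in mutual information (in nats) of the binned pair (X, Y)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                 # joint probabilities
    px = pxy.sum(axis=1)                  # marginal of X
    py = pxy.sum(axis=0)                  # marginal of Y
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (np.outer(px, py)[nz]))))

# The estimate increases with the number of bins: it diverges, it does
# not converge to (1/2) log(2*pi*e*sigma^2) ~= 1.42 nats.
for bins in (8, 32, 128):
    print(bins, mi_binned(x, y, bins))
```

This is consistent with $I(X;X)=\infty$ for a continuous $X$: the binned estimate equals the entropy of the binned marginal, which grows without bound as the discretization is refined.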


2 Answers


For any discrete r.v. $X$,

$$I(X;X)=H(X).$$

To see this, put $Y=X$ in $I(X;Y)=H(X)-H(X|Y)$ and use

$$H(X|X)=0.~~ (*)$$

In summary, the mutual information of $X$ with itself is just its self-information $H(X)$, since $H(X|X)$, i.e. the residual average information carried by $X$ conditioned on $X$, is zero, i.e. $(*)$ holds. Note that $(*)$ is a property of discrete (Shannon) entropy; it does not carry over to differential entropy. For a continuous $X$, the conditional distribution of $X$ given $X$ is a point mass, so $h(X|X)=-\infty$ rather than $0$, and $I(X;X)=+\infty$.
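The discrete identity above is easy to check numerically (a small sketch with a hypothetical four-symbol distribution of my choosing): build the joint distribution of $(X,X)$, which puts all its mass on the diagonal, and compare $I(X;X)$ to $H(X)$.

```python
import numpy as np

# Hypothetical discrete distribution over 4 symbols (dyadic, so the
# entropy comes out exactly in bits).
p = np.array([0.5, 0.25, 0.125, 0.125])

# Entropy H(X) in bits.
H = -np.sum(p * np.log2(p))

# Joint distribution of (X, X): all mass on the diagonal.
pxy = np.diag(p)
nz = pxy > 0
I = np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(p, p)[nz]))

print(H, I)  # H and I both equal 1.75 bits
```

Each diagonal term contributes $p_i \log_2\!\big(p_i/p_i^2\big) = p_i \log_2(1/p_i)$, which is exactly the corresponding term of $H(X)$, so the two sums agree term by term.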


If $(X_1,X_2)$ is a Gaussian vector, each component with variance $\sigma^2$ and with covariance matrix $K$, then $$h(X_1,X_2)=\frac{1}{2}\log((2\pi e)^2|K|)=\frac{1}{2}\log((2\pi e)^2 \sigma^4(1-\rho^2)),$$ where $\rho$ is the correlation coefficient.

Now, since $h(X_1)=h(X_2)$, $$I(X_1;X_2)=h(X_1)+h(X_2)-h(X_1,X_2)=2h(X_1)-h(X_1,X_2)=-\frac{1}{2}\log(1-\rho^2).$$

When $X_1=X_2$, $\rho=1$, and hence $I(X;X)=\infty$.
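A quick numerical illustration of the divergence (a sketch; the helper `mi_gaussian` and the sample values of $\rho$ are my own): evaluate $-\frac{1}{2}\log(1-\rho^2)$ as $\rho \to 1$.

```python
import numpy as np

def mi_gaussian(rho):
    """Mutual information (in nats) of a bivariate Gaussian with
    correlation coefficient rho, via I = -(1/2) log(1 - rho^2)."""
    return -0.5 * np.log(1.0 - rho**2)

# The mutual information blows up as rho approaches 1.
for rho in (0.9, 0.99, 0.9999, 1.0 - 1e-12):
    print(rho, mi_gaussian(rho))
```

At $\rho=0$ the formula gives $0$ nats (independence), and it increases without bound as $\rho \to 1$, matching the conclusion $I(X;X)=\infty$.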

