I wish to use the computational formula for the variance, $\operatorname{Var}(X) = E(X^2) - E(X)^2$, to calculate the variance of a normally distributed random variable. For this, I need the expected value of $X$ as well as that of $X^2$. Intuitively, I would have assumed that $E(X^2)$ is always equal to $E(X)^2$; in fact, I cannot imagine how they could differ.
Could you explain how this is possible, e.g. with an example?
69 Answers
Assume $X$ is a random variable that is 0 half the time and 1 half the time. Then $$EX = 0.5 \times 0 + 0.5 \times 1 = 0.5$$ so that $$(EX)^2 = 0.25,$$ whereas on the other hand $$E(X^2) = 0.5 \times 0^2 + 0.5 \times 1^2 = 0.5.$$ By the way, since $Var(X) = E[(X - \mu)^2] = \sum_x (x - \mu)^2 P(x)$, the only way the variance could ever be 0 in the discrete case is when $X$ is constant.
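The two expectations in this example are easy to check mechanically; here is a minimal Python sketch of the same computation:

```python
# 0/1 coin from the answer above: P(X=0) = P(X=1) = 0.5
outcomes = [0, 1]
probs = [0.5, 0.5]

ex = sum(p * x for x, p in zip(outcomes, probs))        # E[X]   = 0.5
ex2 = sum(p * x ** 2 for x, p in zip(outcomes, probs))  # E[X^2] = 0.5

print(ex ** 2, ex2)  # 0.25 0.5 -- not equal
```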
Let $EX=\mu$ and $E(X-\mu)^2=\sigma^2$. Then
$$ EX^2 = E[(X-\mu)+\mu]^2 = E(X-\mu)^2+2E[(X-\mu)\mu]+E(\mu^2) = \sigma^2+2\mu E(X-\mu)+\mu^2 = \sigma^2+\mu^2, $$
since $E(X-\mu)=EX-\mu=0$.
So $EX^2 =\sigma^2+\mu^2$, no matter the distribution, and $EX^2\ne(EX)^2$ unless the variance equals zero.
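Since the identity $EX^2=\sigma^2+\mu^2$ holds for any distribution, it can be spot-checked on an arbitrary discrete one; the values and probabilities below are made up purely for illustration:

```python
# An arbitrary small discrete distribution (values chosen for illustration)
xs = [1, 2, 5]
ps = [0.2, 0.5, 0.3]

mu = sum(p * x for x, p in zip(xs, ps))               # E[X]
ex2 = sum(p * x * x for x, p in zip(xs, ps))          # E[X^2]
var = sum(p * (x - mu) ** 2 for x, p in zip(xs, ps))  # E[(X - mu)^2]

# The identity E[X^2] = sigma^2 + mu^2 holds exactly
print(ex2, var + mu ** 2)
```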
One is an average of squares, the other a square of an average. In general, when you reverse two procedures (mix cookies, bake cookies), you have no right to expect the same outcome.
Note that your logic, applied to a discrete uniform distribution on $x_1,\dots,x_n$, would give $$(x_1+x_2+\cdots+x_n)^2=n({x_1}^2+{x_2}^2+\cdots+{x_n}^2),$$ which is clearly not true in general.
Let us take for example $X$ the standard normal, or any normal with mean $0$. Then $E(X)=0$.
But $X^2$ is nonnegative and not identically zero, so clearly its mean must be positive.
This shows that (in this case) $E(X^2)\ne (E(X))^2$.
In fact, when the expectations exist, $E(X^2)>(E(X))^2$ except when $X$ is constant with probability $1$.
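For the mean-zero normal case above, a quick Monte Carlo sketch (plain `random.gauss` with a fixed seed; the sample size is arbitrary) shows $E(X)\approx 0$ while $E(X^2)\approx 1$:

```python
import random

random.seed(0)

# Sample a standard normal: E[X] = 0, but E[X^2] = Var(X) = 1
samples = [random.gauss(0, 1) for _ in range(200_000)]

ex = sum(samples) / len(samples)
ex2 = sum(x * x for x in samples) / len(samples)

print(ex, ex2)  # close to 0.0 and 1.0, so E(X)^2 is near 0 while E(X^2) is near 1
```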
Say you have a fair coin that says $X=1$ on one side and $X=3$ on the other side. You flip the coin. Clearly, $E(X)=\frac12(1+3) = 2$.
If you are counting $X^2$ instead of $X$, then one side of the coin is worth $1^2=1$ and the other side is worth $3^2=9$, so $E(X^2) = \frac12(1+9)=5$.
$5\ne 2^2$.
My turn:
Let $X$ be uniformly distributed on $[0,1]$. Then $E X =\int_{t=0}^1 t \, dt = \frac{1}{2}$, but $E X^2 =\int_{t=0}^1 t^2 \, dt = \frac{1}{3}$.
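The two integrals can be approximated numerically, e.g. with a simple midpoint Riemann sum (the grid size below is arbitrary):

```python
# Midpoint Riemann sums for E[X] and E[X^2], where X ~ Uniform(0, 1)
n = 100_000
mids = [(i + 0.5) / n for i in range(n)]  # midpoints of n equal subintervals

ex = sum(mids) / n                  # ~ integral of t   dt over [0,1] = 1/2
ex2 = sum(t * t for t in mids) / n  # ~ integral of t^2 dt over [0,1] = 1/3

print(ex, ex2)  # ~0.5 and ~0.3333, and 0.5**2 = 0.25 != 1/3
```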
May as well chime in :)
Expectation is linear pretty much by definition, so $E(aX + b) = aE(X) + b$. The function $f(x) = ax$ is also linear, and if we take a look at $f(x^2)$, we get
$f(x^2) = a(x^2) \not= (ax)^2 = f(x)^2$ (unless $a$ is $0$ or $1$, or $x=0$).
Linearity gives a map no reason to commute with squaring, so there is no reason to expect $E(X^2) = E(X)^2$ :)
Assuming $X$ is a discrete random variable, $E(X)=\sum x_ip_i$. Therefore $E(X^2)=\sum x_i^2p_i$, while $[E(X)]^2=\left(\sum x_ip_i\right)^2$. Now, as Robert Mastragostino says, equality would imply that $(x+y+z+\cdots)^2=x^2+y^2+z^2+\cdots$, which is not true unless $X$ is constant.