Suppose $X_n$ and $Y_n$ are sequences of random variables defined on a common probability space, such that $X_n-Y_n \to C$ in distribution as $n \to \infty$, where $C\equiv c$ is a constant random variable, and also $Y_n \to Y$ in distribution. Does it then hold, without any further assumptions on the random variables, that $X_n \to C+Y$ in distribution?
If not, what are the minimum requirements on $X_n$ and $Y_n$ for this to be true?
What can be said about the case where $C$ is replaced by an arbitrary random variable?
Lemma 1. If $\{A_n\}$ is a sequence of random variables that converges in distribution to a constant $c$, then it converges in probability to $c$.
Fix $\varepsilon>0$ and let $f$ be the piecewise linear function with $f(x)=1$ if $|x-c|\leqslant\varepsilon$ and $f(x)=0$ if $|x-c|\geqslant 2\varepsilon$. It is a bounded continuous function, so $$\int f(A_n)\,dP\to 1.$$ As $f\leqslant\mathbf 1_{\{|x-c|<2\varepsilon\}}$, we have $$\int f(A_n)\,dP\leqslant P(|A_n-c|<2\varepsilon),$$ hence $$P(|A_n-c|\geqslant 2\varepsilon)\leqslant 1-\int f(A_n)\,dP\to 0.$$ Since $\varepsilon>0$ was arbitrary, this proves convergence in probability.
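As a numerical sanity check (an illustration, not part of the proof), here is a Python sketch with the hypothetical choice $A_n = c + N(0,1/n)$, a sequence converging in distribution to the constant $c$; the estimated $P(|A_n-c|>\varepsilon)$ should shrink toward $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
c, eps = 2.0, 0.1

def prob_far(n, samples=100_000):
    """Empirical P(|A_n - c| > eps) for A_n = c + N(0, 1/n) noise,
    a toy sequence converging in distribution to the constant c."""
    a_n = c + rng.normal(0.0, 1.0 / np.sqrt(n), size=samples)
    return np.mean(np.abs(a_n - c) > eps)

probs = [prob_far(n) for n in (10, 100, 1000, 10_000)]
print(probs)  # decreases toward 0 as n grows
```

The choice of noise, sample sizes, and seed is arbitrary; any sequence converging in distribution to a constant would exhibit the same behavior.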
Lemma 2. If $\{X_n\}$ converges in distribution to $X$, and $\{Y_n\}$ converges in probability to a constant $c$, then $\{X_n+Y_n\}$ converges in distribution to $X+c$.
Indeed, by the portmanteau theorem, it is enough to check that $\int f(X_n+Y_n)\,dP\to \int f(X+c)\,dP$ for all bounded, uniformly continuous $f$. Fix $\varepsilon>0$ and pick $\delta$ as in the definition of uniform continuity. Splitting on the event $\{|Y_n-c|\geqslant\delta\}$, we get $$\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant 2\sup |f|\cdot P(|Y_n-c|\geqslant \delta)+\varepsilon+\left|\int (f(X_n+c)-f(X+c))\,dP\right|.$$ As $f(c+\cdot)$ is continuous and bounded, the first and third terms vanish as $n\to+\infty$, so $$\limsup_{n\to +\infty}\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant \varepsilon.$$ Since $\varepsilon>0$ was arbitrary, this proves convergence in law of $\{X_n+Y_n\}$ to $X+c$.
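Lemma 2 is the additive half of Slutsky's theorem, and it can be illustrated numerically. In this sketch (the particular sequences are my own arbitrary choices), $X_n$ is a CLT-normalized sum of coin flips, so $X_n \to N(0,1)$ in distribution, and $Y_n \to c$ in probability; the empirical CDF of $X_n+Y_n$ should then be close to that of $N(c,1)$:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
samples, n = 200_000, 10_000
c = 3.0

# X_n: CLT-normalized sum of n fair +/-1 coin flips -> N(0,1) in distribution
x_n = (2.0 * rng.binomial(n, 0.5, size=samples) - n) / np.sqrt(n)

# Y_n: constant c plus noise of size O(1/n) -> c in probability
y_n = c + rng.normal(0.0, 1.0, size=samples) / n

s = x_n + y_n

def phi(t):
    """CDF of the standard normal distribution."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Compare the empirical CDF of X_n + Y_n with the CDF of N(c, 1) at a few points
errs = [abs(np.mean(s <= t) - phi(t - c)) for t in (2.0, 3.0, 4.0)]
print(errs)  # all small
```

Again, this is only a plausibility check; the proof above is what establishes the result.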
EDIT: My previous claim that this is true when $C$ is constant wasn't completely thought out.
If $C$ is not constant, this is false. Basically, we can replace $C$ by any random variable with the same distribution without affecting the hypothesis $X_n - Y_n \to C$ in distribution, but doing so may change the distribution of $C+Y$. For example, let $Y,Z$ be i.i.d. with any nonconstant distribution (coin flips will work), and set $X_n = 0$, $Y_n = Y$, $C = -Z$. Then $X_n - Y_n = -Y \to C$ in distribution, since $-Y$ and $-Z$ have the same law; but $X_n \equiv 0$ while $C+Y = Y-Z$ has a nondegenerate distribution, so we don't have $X_n \to C+Y$.
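The counterexample can be checked empirically; the coin-flip distribution, sample size, and seed below are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(2)
samples = 200_000

# Y, Z i.i.d. fair coin flips in {0, 1}; set C = -Z, X_n = 0, Y_n = Y for every n
y = rng.integers(0, 2, size=samples)
z = rng.integers(0, 2, size=samples)
c = -z

# X_n - Y_n = -Y has the same distribution as C = -Z ...
print(np.mean(-y == -1), np.mean(c == -1))  # both approx. 0.5

# ... but X_n is identically 0, while C + Y = Y - Z is nonconstant:
print(np.mean((c + y) == 0))  # approx. 0.5, not 1
```

So $X_n - Y_n \to C$ in distribution holds, yet $X_n \equiv 0$ cannot converge in distribution to the nondegenerate $C+Y$.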