
Let $(X_n)_{n\in\mathbb N}$ be a sequence of independent random variables with the same distribution. The common distribution $\mu$ is such that it is symmetric, that is, $\mu((-\infty,x])=\mu([-x,\infty))$ for every $x\in\mathbb R$, and it has no first moment, that is, $$\int_{-\infty}^{\infty}|x|\,\mathrm d\mu(x)=\infty.$$

Are there any well-known results on whether $$\frac{X_1+\cdots+X_n}{n}$$ converges in law as $n\to\infty$? My preliminary experimentation (together with the fact that the average of independent Cauchy random variables is again Cauchy) suggests that the answer may be positive, with a Cauchy limit distribution, but I'm not sure. Any thoughts would be appreciated.
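For the Cauchy case the claim is immediate on the characteristic-function side: if $\varphi(t)=e^{-|t|}$ is the characteristic function of the standard Cauchy distribution, then the sample mean has characteristic function $\varphi(t/n)^n=e^{-|t|}$ for every $n$, exactly. A minimal numerical sketch of this identity (plain Python, standard Cauchy assumed):

```python
import math

def cf_cauchy(t):
    """Characteristic function of the standard Cauchy distribution."""
    return math.exp(-abs(t))

def cf_mean(t, n):
    """Characteristic function of (X_1 + ... + X_n)/n, i.e. phi(t/n)^n."""
    return cf_cauchy(t / n) ** n

# The sample mean of n standard Cauchy variables is again standard Cauchy:
for n in (2, 10, 1000):
    assert abs(cf_mean(1.7, n) - cf_cauchy(1.7)) < 1e-12
```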


UPDATE: Let me refine my conjecture; my experimentation suggests that this refined version holds. Suppose, in addition, that the support of $\mu$ consists of countably infinitely many points on the real line: $$\cdots\qquad-x_3\qquad-x_2\qquad-x_1\qquad0\qquad x_1\qquad x_2\qquad x_3\qquad\cdots$$ where $0<x_1<x_2<x_3<\cdots$ are distinct positive numbers. The probability of the point $\{0\}$ is $p_0\geq0$, and the probability of each of the points $\{x_k\}$ and $\{-x_k\}$ is $p_k/2\geq 0$ for every $k\in\mathbb N$. Of course, $p_0+\sum_{k=1}^{\infty}p_k=1$ and, since the first moment does not exist, $$\sum_{k=1}^{\infty}p_kx_k=\infty.\tag{$\star$}$$

Now, the characteristic function of the distribution is $$t\mapsto p_0+\sum_{k=1}^{\infty}p_k\cos(tx_k),$$ so the characteristic function of the sample mean for any $n\in\mathbb N$ is $$t\mapsto \left[p_0+\sum_{k=1}^{\infty}p_k\cos\left(\frac{t}{n}x_k\right)\right]^n.$$

Under ($\star$), does there exist some $\gamma>0$ such that this expression converges to $$\exp(-\gamma|t|)$$ for every $t\in\mathbb R$ as $n\to\infty$?
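As one concrete instance (my own choice of example, not required by the question), take $x_k=k$, $p_0=0$, and $p_k=6/(\pi^2 k^2)$, so that $\sum_k p_k=1$ while $\sum_k p_k x_k=(6/\pi^2)\sum_k 1/k=\infty$, i.e. ($\star$) holds. For this law the classical Fourier identity $\sum_{k\geq1}\cos(sk)/k^2=\pi^2/6-\pi s/2+s^2/4$ for $0\leq s\leq 2\pi$ gives the closed form $\varphi(s)=1-\frac{3}{\pi}|s|+\frac{3}{2\pi^2}s^2$, so $\varphi(t/n)^n\to e^{-3|t|/\pi}$, suggesting $\gamma=3/\pi$ in this case. A sketch that evaluates the conjectured limit numerically:

```python
import math

# Hypothetical concrete instance of the conjecture: x_k = k, p_0 = 0,
# p_k = 6/(pi^2 k^2), so sum p_k = 1 and sum p_k x_k = (6/pi^2) sum 1/k = inf.
def phi(s):
    """Characteristic function p_0 + sum_k p_k cos(s x_k), evaluated via the
    closed form sum_{k>=1} cos(sk)/k^2 = pi^2/6 - pi s/2 + s^2/4, 0 <= s <= 2 pi."""
    s = abs(s)
    assert s <= 2 * math.pi
    return 1 - 3 * s / math.pi + 3 * s * s / (2 * math.pi ** 2)

def cf_mean(t, n):
    """Characteristic function of the sample mean: phi(t/n)^n."""
    return phi(t / n) ** n

gamma = 3 / math.pi  # conjectured Cauchy scale for this particular law
for n in (10, 100, 10000):
    print(n, cf_mean(1.0, n))  # approaches exp(-gamma) ~ 0.3848 as n grows
```

For $t=1$ the printed values increase toward $e^{-3/\pi}\approx0.3848$, consistent with a Cauchy limit of scale $3/\pi$ for this particular distribution; of course this is only one example, not a proof of the general conjecture.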

  • If you want the denominator to be $n$, then this is the law of large numbers, not the CLT. There are many posts on MSE about the infinite-mean case.
  • @KaviRamaMurthy Fair point, although I decided to use the CLT analogy because I expect convergence in law, not almost sure convergence as with the LLN.
  • If such convergence in law holds, then the limit must be a stable random variable. If, for example, $t^\alpha \mu((t,\infty)) \to c \in (0,\infty)$ for some $\alpha \in (0,1)$ (note that then $\mu$ has no first moment), we would get $n^{-\frac{1}{\alpha}}(X_1+\cdots+X_n) \to Z$, where $Z$ is a non-degenerate stable random variable. In particular, $n^{-1}(X_1+\cdots+X_n) = n^{\frac{1}{\alpha}-1} \cdot n^{-\frac{1}{\alpha}}(X_1+\cdots+X_n)$ cannot converge in law to any non-degenerate random variable. What you are looking for is a generalized version of the CLT for stable distributions.
  • But with stable distributions having heavier tails than the Cauchy distribution, the distribution of $\frac{X_1+\cdots+X_n}{n}$ is not stable as $n$ increases, and you have to divide by something growing faster than $n$.
  • Thank you all for your comments! What if the support of $\mu$ consists entirely of integers (in particular, the support is countable and discrete)? My experimentation suggests that the characteristic function of the mean does converge pointwise to a function of the form $t\mapsto\exp(-\gamma|t|)$ as $n\to\infty$ for some $\gamma>0$, which would mean the limiting law is Cauchy.
