Let $(X_n)_{n\in\mathbb N}$ be a sequence of independent, identically distributed random variables. The common distribution $\mu$ is symmetric, that is, $\mu((-\infty,x])=\mu([-x,\infty))$ for every $x\in\mathbb R$, and it has no first moment, that is, $$\int_{-\infty}^{\infty}|x|\,\mathrm d\mu(x)=\infty.$$
Are there any well-known results as to whether $$\frac{X_1+\cdots+X_n}{n}$$ converges in law as $n\to\infty$? My preliminary experimentation, together with the fact that the sample mean of i.i.d. Cauchy random variables is again Cauchy, suggests that the answer may be positive and that the limit distribution is Cauchy, but I'm not sure. Any thoughts would be appreciated.
UPDATE: Let me refine my conjecture; my experimentation suggests that this refined version holds. Suppose, in addition, that the support of $\mu$ consists of countably infinitely many points on the real line: $$\cdots\qquad-x_3\qquad-x_2\qquad-x_1\qquad0\qquad x_1\qquad x_2\qquad x_3\qquad\cdots$$ where $0<x_1<x_2<x_3<\cdots$. The distribution assigns probability $p_0\geq0$ to the point $\{0\}$ and probability $p_k/2\geq 0$ to each of the points $\{x_k\}$ and $\{-x_k\}$ for every $k\in\mathbb N$. Of course, $p_0+\sum_{k=1}^{\infty}p_k=1$ and, since the first moment does not exist, $$\sum_{k=1}^{\infty}p_kx_k=\infty.\tag{$\star$}$$ Now, the characteristic function of the distribution is $$t\mapsto p_0+\sum_{k=1}^{\infty}p_k\cos(tx_k),$$ so that, by independence, the characteristic function of the sample mean for any $n\in\mathbb N$ is $$t\mapsto \left[p_0+\sum_{k=1}^{\infty}p_k\cos\left(\frac{t}{n}x_k\right)\right]^n.$$
Under ($\star$), does there exist some $\gamma>0$ such that this expression converges to $$\exp(-\gamma|t|)$$ for every $t\in\mathbb R$ as $n\to\infty$?
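For what it's worth, here is a minimal numerical sketch (in Python) of the kind of experiment behind the conjecture. Everything in it is purely illustrative: I pick one concrete choice satisfying $(\star)$, namely $x_k=k$ and $p_k=(6/\pi^2)/k^2$ with $p_0=0$, truncate the series at $K$ terms, and check whether $-\log$ of the displayed expression, divided by $|t|$, settles to a constant $\gamma$ as $n$ grows.

```python
import numpy as np

# Minimal sketch of the experiment, for one concrete (purely illustrative)
# choice satisfying (star): x_k = k, p_k = (6/pi^2)/k^2, p_0 = 0, so that
# sum_k p_k * x_k = (6/pi^2) * sum_k 1/k diverges.
K = 10**6                                    # truncation point of the series
k = np.arange(1, K + 1, dtype=float)
x_k = k                                      # support points x_k
p_k = (6.0 / np.pi**2) / k**2                # weights p_k (sum to ~1)

def phi(t):
    """Characteristic function p_0 + sum_k p_k cos(t x_k), truncated at K terms."""
    return np.sum(p_k * np.cos(t * x_k))

def phi_mean(t, n):
    """Characteristic function of (X_1 + ... + X_n)/n, namely phi(t/n)**n."""
    return phi(t / n) ** n

# If the conjecture holds, -log(phi_mean(t, n)) / t should be roughly constant
# in t and stabilise (to gamma) as n grows.
for n in (10, 100, 1000):
    ratios = [-np.log(phi_mean(t, n)) / t for t in (0.5, 1.0, 2.0, 4.0)]
    print(n, np.round(ratios, 4))
```

For this particular choice, the Fourier series $\sum_{k\geq1}\cos(k\theta)/k^2=\pi^2/6-\pi\theta/2+\theta^2/4$ (valid for $0\leq\theta\leq2\pi$) gives $1-\varphi(s)=(3/\pi)s-3s^2/(2\pi^2)$ for small $s>0$, so if the conjecture is correct the printed ratios should approach $\gamma=3/\pi\approx0.955$.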