Consider i.i.d. random variables $X_m$ whose moments exist exactly up to order $2$ and no higher, i.e. $E|X_1|^p<\infty$ if and only if $p\le 2$. Such variables can be constructed in numerous ways; here is one. Let $(a_n)$ be a sequence of positive numbers tending to zero and let $(b_n)$ be a sequence of positive numbers such that $\sum_{n=1}^\infty \frac{b_n}{a_n}$ converges. (For example, $a_n=1/n$, $b_n=2^{-n}$ does the job, since $\sum_{n=1}^\infty n2^{-n}<\infty$.) Set $f_n(x)=(2+a_n) 1_{[1,\infty)}(x)\, x^{-3-a_n}$; each $f_n$ is a Pareto-type pdf whose moment of order $p$ is finite exactly when $p<2+a_n$. Now let $f(x)=\sum_{n=1}^\infty \frac{b_n}{\sum_{k=1}^\infty b_k} f_n(x)$. Then $f$ is the pdf of a random variable with no moments of order higher than $2$: for any $p>2$ there is some $n$ with $a_n<p-2$, and that component already lacks a $p$-th moment. The assumption on $(b_n)$ ensures that the second moment actually is finite, since the $n$-th component contributes $(2+a_n)/a_n$ to $E[X^2]$. (In effect the trick here is $\bigcap_{n=1}^\infty [0,2+a_n)=[0,2]$: each $f_n$ has finite moments of every order in $[0,2+a_n)$, and these half-open intervals intersect to a closed one.)
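To spell out the moment computation behind this construction: for $p<2+a_n$,
$$\int_1^\infty x^p\,(2+a_n)x^{-3-a_n}\,dx=\frac{2+a_n}{2+a_n-p},$$
and the integral diverges for $p\ge 2+a_n$. Taking $p=2$ gives $\frac{2+a_n}{a_n}$ for the $n$-th component, so
$$E[X^2]=\frac{1}{\sum_{k=1}^\infty b_k}\sum_{n=1}^\infty b_n\,\frac{2+a_n}{a_n},$$
which is finite if and only if $\sum_n b_n/a_n$ converges (since $\frac{2+a_n}{a_n}=\frac{2}{a_n}+1$ and $\sum_n b_n<\infty$).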
Despite this somewhat pathological moment structure, the classical central limit theorem still applies: with $\mu=E[X_1]$, $\sigma^2=\operatorname{Var}(X_1)$ and $\overline{X}_m$ the sample mean, $\frac{\overline{X}_m-\mu}{\sigma/\sqrt{m}}$ converges in distribution to a $N(0,1)$ random variable. However, the most common quantitative estimates of the convergence rate require a finite moment of order strictly greater than $2$; e.g. the classical Berry-Esseen theorem assumes $E|X_1|^3<\infty$, and its standard refinements still need a moment of order $2+\delta$ for some $\delta>0$. What can be said about the convergence rate in cases like the one in the previous paragraph?
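For a purely empirical look at the question, one can sample from $f$ directly: pick a component $n$ with probability $b_n/\sum_k b_k$, then invert the Pareto CDF $F_n(x)=1-x^{-(2+a_n)}$. Below is a minimal sketch (all numerical choices mine, with $a_n=1/n$, $b_n=2^{-n}$) that tracks the Kolmogorov distance between the standardized sample mean and $N(0,1)$; note that truncating the mixture at $N$ terms technically restores all moments, so this only illustrates the heavy-tail behaviour at moderate sample sizes.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Truncated mixture: a_n = 1/n, b_n = 2^{-n}, first N components.
N = 30
a = 1.0 / np.arange(1, N + 1)
b = 0.5 ** np.arange(1, N + 1)
w = b / b.sum()

# Exact component moments: E[X] = (2+a)/(1+a), E[X^2] = (2+a)/a.
mu = np.sum(w * (2 + a) / (1 + a))
sigma = sqrt(np.sum(w * (2 + a) / a) - mu ** 2)

def sample(size):
    # Pick a component, then invert its Pareto CDF F(x) = 1 - x^{-(2+a_n)}.
    comp = rng.choice(N, size=size, p=w)
    u = rng.uniform(size=size)
    return (1.0 - u) ** (-1.0 / (2.0 + a[comp]))

def kolmogorov_distance(z):
    # sup_x |F_emp(x) - Phi(x)|, evaluated at the sample points.
    z = np.sort(z)
    phi = np.array([0.5 * (1 + erf(x / sqrt(2))) for x in z])
    n = len(z)
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(np.abs(ecdf_hi - phi)), np.max(np.abs(ecdf_lo - phi)))

reps = 2000
for m in (10, 100, 1000):
    means = sample((reps, m)).mean(axis=1)
    z = (means - mu) / (sigma / sqrt(m))
    print(m, kolmogorov_distance(z))
```

The observed distance decays, but of course a Monte Carlo plot cannot distinguish, say, a $m^{-1/2}$ rate from a logarithmically slower one, which is exactly what the question asks about.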