I have the following pandas.core.series.Series:
x=pd.Series([17.39, 8.70, 2.90, 1.45, 1.45]) for t=0,1,2,3,4.
When I try x.ewm(span=2).mean() I get the following results:
17.39, 10.87, 5.35, 2.71, 1.86
My understanding is that .ewm().mean() uses the following explicit formula:
y[t] = (1 - alpha) * y[t-1] + alpha * x[t], where alpha = 2/(span+1)
y[0] = x[0]
Using the above formula:
EWMA[0] = x[0] = 17.39
EWMA[1] = (1 - 2/(2+1))*17.39 + (2/(2+1))*8.70 = 11.59, which is different from 10.87.
EWMA[2] = (1 - 2/(2+1))*10.87 + (2/(2+1))*2.90 = 5.55, which is different from 5.35.
EWMA[3] = (1 - 2/(2+1))*5.35 + (2/(2+1))*1.45 = 2.75, which is different from 2.71, etc.
Could you please help me understand where these differences are coming from? What am I missing or doing wrong here? Thank you.
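If I read the pandas docs correctly, x.ewm(span=2).mean() defaults to adjust=True, which computes a weighted average over all observations so far rather than the recursion above; the recursion corresponds to adjust=False. A minimal pure-Python sketch of both variants (the list names unadjusted/adjusted are mine) to compare against the reported output:

```python
# Reproduce pandas' two EWM variants by hand (pure Python, no pandas needed).
x = [17.39, 8.70, 2.90, 1.45, 1.45]
alpha = 2 / (2 + 1)  # span=2  ->  alpha = 2/3

# adjust=False: the recursion from the question, seeded with y[0] = x[0]
unadjusted = [x[0]]
for t in range(1, len(x)):
    unadjusted.append((1 - alpha) * unadjusted[-1] + alpha * x[t])

# adjust=True (the pandas default): a weighted average of ALL values so far,
# y[t] = sum_{i=0..t} (1-alpha)^i * x[t-i]  /  sum_{i=0..t} (1-alpha)^i
adjusted = []
for t in range(len(x)):
    num = sum((1 - alpha) ** i * x[t - i] for i in range(t + 1))
    den = sum((1 - alpha) ** i for i in range(t + 1))
    adjusted.append(num / den)

print([round(v, 2) for v in adjusted])    # should match x.ewm(span=2).mean()
print([round(v, 2) for v in unadjusted])  # should match x.ewm(span=2, adjust=False).mean()
```

Running this, the adjusted list starts 17.39, 10.87, 5.35, ... (the pandas output), while the unadjusted list starts 17.39, 11.60, ... (the hand calculation), so the discrepancy appears to come entirely from the adjust parameter.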
TTR::EMA(x, n=2), which simply computes the first non-missing value as the arithmetic mean of the first n values and then recursively uses y[t] = (1 - alpha) * y[t-1] + alpha * x[t] (for t > n), and the way x.ewm(span=2, adjust=False).mean() uses y[0] = x[0] and then recursively y[t] = (1 - alpha) * y[t-1] + alpha * x[t] (for t >= 1). stackoverflow.com/questions/12557697/…
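The TTR-style initialization described in that comment can be sketched in a few lines of Python (a hedged sketch of my understanding of TTR::EMA, not TTR's actual code; the name ema and the None padding for the leading NA values are my own choices):

```python
# Sketch of TTR::EMA(x, n=2)-style seeding: the first n-1 outputs are
# undefined (NA), output n-1 is the plain arithmetic mean of the first n
# values, and the usual recursion takes over from there.
x = [17.39, 8.70, 2.90, 1.45, 1.45]
n = 2
alpha = 2 / (n + 1)

ema = [None] * (n - 1)        # NA padding for the first n-1 positions
ema.append(sum(x[:n]) / n)    # seed: mean of the first n values
for t in range(n, len(x)):
    ema.append((1 - alpha) * ema[-1] + alpha * x[t])
```

With this seeding the series starts from (17.39 + 8.70)/2 = 13.045 instead of 17.39, which is a third way the same recursion can produce different numbers from the two pandas variants.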