I need a timer that fires every 25 ms. I've been comparing the default Timer implementation on Windows 10 and on Linux (Ubuntu Server 16.10 and 12.04), using both the .NET Core runtime and the latest Mono runtime.
There are some differences in timer precision that I don't quite understand.
I'm using the following piece of code to test the Timer:
// requires: using System; using System.Collections.Generic;
//           using System.Diagnostics; using System.Linq; using System.Threading;

// inside Main()
var s = new Stopwatch();
var offsets = new List<long>();
const int interval = 25;

// System.Threading.Timer: first callback fires immediately (dueTime 0), then every 25 ms
using (var t = new Timer((obj) =>
{
    offsets.Add(s.ElapsedMilliseconds); // elapsed time since the previous callback
    s.Restart();
}, null, 0, interval))
{
    s.Start();
    Thread.Sleep(5000); // let the timer run for ~5 seconds
}

foreach (var n in offsets)
{
    Console.WriteLine(n);
}

// mean absolute deviation from the 25 ms interval
Console.WriteLine(offsets.Average(n => Math.Abs(interval - n)));
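Note that Stopwatch.ElapsedMilliseconds truncates to whole milliseconds, so the averages below are themselves only millisecond-accurate. If you want fractional numbers, a small variant of the callback works (a sketch; it assumes offsets is declared as a List&lt;double&gt; instead):

// sketch: record fractional milliseconds instead of truncated longs
// assumes: var offsets = new List<double>();
offsets.Add(s.Elapsed.TotalMilliseconds); // TimeSpan-based, sub-millisecond resolution
s.Restart();

The final Average line works unchanged, since Math.Abs and Average both handle doubles.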
On Windows it's all over the place:
...
36
25
36
26
36
5,8875 # <-- average timing error
Using .NET Core on Linux, it's less all over the place:
...
25
30
27
28
27
2.59776536312849 # <-- average timing error
But the Mono Timer (on Linux) is very precise:
...
25
25
24
25
25
25
0.33 # <-- average timing error
Edit: Even on Windows, Mono still maintains its timing precision:
...
25
25
25
25
25
25
25
24
0.31
What is causing this difference? Is there a benefit to the way the .NET Core runtime does things compared to Mono that justifies the lost precision?
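For reference, a rough sketch of a possible workaround (not how either runtime implements Timer): a dedicated thread that sleeps until the next absolute 25 ms deadline, measured with a Stopwatch. Tick() is a placeholder for the real callback, and Thread.Sleep is still subject to the OS scheduler's granularity:

// sketch: dedicated thread targeting absolute deadlines instead of relative delays
var clock = Stopwatch.StartNew();
var deadline = TimeSpan.FromMilliseconds(interval);
var worker = new Thread(() =>
{
    while (true)
    {
        var remaining = deadline - clock.Elapsed;
        if (remaining > TimeSpan.Zero)
            Thread.Sleep(remaining);   // still limited by OS sleep granularity
        Tick();                        // placeholder for the actual work
        deadline += TimeSpan.FromMilliseconds(interval);
    }
});
worker.IsBackground = true; // don't keep the process alive
worker.Start();

Because each deadline is computed from the start of the run rather than from the previous wake-up, small per-iteration errors don't accumulate.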