I need a timer that fires every 25ms. I've been comparing the default Timer implementation between Windows 10 and Linux (Ubuntu Server 16.10 and 12.04) on both the dotnet core runtime and the latest mono-runtime.

There are some differences in the timer precision that I don't quite understand.

I'm using the following piece of code to test the Timer:

// inside Main(); requires using System.Diagnostics, System.Threading, System.Collections.Generic and System.Linq
        var s = new Stopwatch();
        var offsets = new List<long>();

        const int interval = 25;
        using (var t = new Timer((obj) =>
        {
            offsets.Add(s.ElapsedMilliseconds);
            s.Restart();
        }, null, 0, interval))
        {
            s.Start();
            Thread.Sleep(5000);
        }

        foreach(var n in offsets)
        {
            Console.WriteLine(n);
        }

        Console.WriteLine(offsets.Average(n => Math.Abs(interval - n)));

On Windows it's all over the place:

...
36
25
36
26
36
5,8875 # <-- average timing error

Using dotnet core on Linux, it's less all over the place:

...
25
30
27
28
27
2.59776536312849 # <-- average timing error

But the mono Timer is very precise:

...
25
25
24
25
25
25
0.33 # <-- average timing error

Edit: Even on Windows, mono still maintains its timing precision:

...
25
25
25
25
25
25
25
24
0.31

What is causing this difference? Is there a benefit to the way the dotnet core runtime does things compared to mono, that justifies the lost precision?

2 Comments
  • 2017 and still no actual solution. Why doesn't someone wrap up the native C++ multimedia timer on each platform? I don't know if there would be any problems with that. Commented Sep 24, 2017 at 14:52
  • @b.ben I believe it's actually possible to use the mono implementation of the Timer class on Windows, in the CLR (without using mono), and you'll get the same precision. The issue here is just the standard library implementation. Commented Sep 24, 2017 at 18:36

1 Answer


Unfortunately you cannot rely on the timers built into the .NET framework. The best of them has about 15 ms resolution, even if you ask it to fire every millisecond. But you can implement a high-resolution timer with microsecond precision, too.
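
To see that coarse granularity yourself, here is a minimal sketch in the spirit of the question's test (the 1 ms period and variable names are mine, just for illustration; the exact numbers depend on the machine and on the current system timer resolution):

// inside Main(); requires System.Diagnostics, System.Threading,
// System.Collections.Generic and System.Linq
var sw = Stopwatch.StartNew();
var gaps = new List<long>();
long last = 0;

// Ask for a 1 ms period; with the default Windows system timer
// the callback typically fires only about every 15 ms.
using (var t = new Timer(_ =>
{
    long now = sw.ElapsedMilliseconds;
    gaps.Add(now - last);
    last = now;
}, null, 0, 1))
{
    Thread.Sleep(2000);
}

// Skip the immediate first callback (dueTime 0), then print the average gap in ms.
Console.WriteLine(gaps.Skip(1).Average());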

Note: The high-resolution timer below works only when Stopwatch.IsHighResolution returns true. In Windows this is true starting with Windows XP; however, I did not test other frameworks.

using System;
using System.Diagnostics;
using System.Threading;

public class HiResTimer
{
    // The number of milliseconds represented by a single Stopwatch tick.
    private static readonly float tickFrequency = 1000f / Stopwatch.Frequency;

    public event EventHandler<HiResTimerElapsedEventArgs> Elapsed;

    private volatile float interval;
    private volatile bool isRunning;

    // Callbacks delayed by more than this many milliseconds are skipped and
    // reported as "fallouts" with the next raised event (infinity = never skip).
    private volatile float ignoreElapsedThreshold = Single.PositiveInfinity;

    public HiResTimer() : this(1f)
    {
    }

    public HiResTimer(float interval)
    {
        if (interval < 0f || Single.IsNaN(interval))
            throw new ArgumentOutOfRangeException(nameof(interval));
        this.interval = interval;
    }

    // The interval in milliseconds. Fractions are allowed so 0.001 is one microsecond.
    public float Interval
    {
        get { return interval; }
        set
        {
            if (value < 0f || Single.IsNaN(value))
                throw new ArgumentOutOfRangeException(nameof(value));
            interval = value;
        }
    }

    public bool Enabled
    {
        set
        {
            if (value)
                Start();
            else
                Stop();
        }
        get { return isRunning; }
    }

    public void Start()
    {
        if (isRunning)
            return;

        isRunning = true;
        Thread thread = new Thread(ExecuteTimer);
        thread.Priority = ThreadPriority.Highest;
        thread.Start();
    }

    public void Stop()
    {
        isRunning = false;
    }

    private void ExecuteTimer()
    {
        float nextTrigger = 0f;
        int fallouts = 0; // number of skipped (too delayed) callbacks since the last raised event

        Stopwatch stopwatch = new Stopwatch();
        stopwatch.Start();

        while (isRunning)
        {
            float intervalLocal = interval;
            nextTrigger += intervalLocal;
            float elapsed;

            while (true)
            {
                elapsed = ElapsedHiRes(stopwatch);
                float diff = nextTrigger - elapsed;
                if (diff <= 0f)
                    break;

                if (diff < 1f)
                    Thread.SpinWait(10);
                else if (diff < 10f)
                    Thread.SpinWait(100);
                else
                {
                    // By default Sleep(1) lasts about 15.5 ms (if not configured otherwise for the application by WinMM, for example)
                    // so not allowing sleeping under 16 ms. Not sleeping for more than 50 ms so interval changes/stopping can be detected.
                    if (diff >= 16f)
                        Thread.Sleep(diff >= 100f ? 50 : 1);
                    else
                    {
                        Thread.SpinWait(1000);
                        Thread.Sleep(0);
                    }

                    // if we have a larger time to wait, we check if the interval has been changed in the meantime
                    float newInterval = interval;

                    if (intervalLocal != newInterval)
                    {
                        nextTrigger += newInterval - intervalLocal;
                        intervalLocal = newInterval;
                    }
                }

                if (!isRunning)
                    return;
            }


            float delay = elapsed - nextTrigger;
            if (delay >= ignoreElapsedThreshold)
            {
                fallouts += 1;
                continue;
            }

            Elapsed?.Invoke(this, new HiResTimerElapsedEventArgs(delay, fallouts));
            fallouts = 0;

            // restarting the timer in every hour to prevent precision problems
            if (stopwatch.Elapsed.TotalHours >= 1d)
            {
                stopwatch.Restart();
                nextTrigger = 0f;
            }
        }

        stopwatch.Stop();
    }

    private static float ElapsedHiRes(Stopwatch stopwatch)
    {
        return stopwatch.ElapsedTicks * tickFrequency;
    }
}

public class HiResTimerElapsedEventArgs : EventArgs
{
    // How much later the event was raised than its scheduled trigger time, in milliseconds.
    public float Delay { get; }

    // The number of callbacks that were skipped because their delay exceeded the threshold.
    public int Fallouts { get; }

    internal HiResTimerElapsedEventArgs(float delay, int fallouts)
    {
        Delay = delay;
        Fallouts = fallouts;
    }
}
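
For completeness, a minimal usage sketch wiring the class above up to the question's 25 ms scenario (the measuring code here is just illustrative; e.Delay is how late each event fired, in milliseconds, relative to its scheduled trigger time):

// inside Main(); requires System.Collections.Generic, System.Linq and System.Threading
var delays = new List<float>();
var timer = new HiResTimer(25f); // 25 ms interval, as in the question

timer.Elapsed += (sender, e) => delays.Add(e.Delay);

timer.Start();
Thread.Sleep(5000);
timer.Stop();

// average lateness in milliseconds relative to the scheduled trigger times
Console.WriteLine(delays.Average());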

Edit 2021: The code above has been updated to the latest version, which does not have the issue @hankd mentions in the comments.


11 Comments

  • Then how does mono's implementation provide timers with a resolution of ~1 ms? And why is the normal .NET framework not using such a solution?
  • That's what I cannot tell. IMHO all of the timer implementations should provide that, too (System.Timers.Timer, System.Threading.Timer, System.Windows.Forms.Timer). But if you cannot rely on them beyond a specific precision, the question is not why but how you can accomplish your needs.
  • Heh. This is strange: you don't even need those tiered Sleep()s. I wrote a quick implementation that has the same precision as the mono timer on all platforms using just WaitOne: pastebin.com/58QbCQVZ
  • Well, interesting solution, yet a very unusual usage of WaitOne considering that your WaitHandle is never set (only in Dispose). Additionally, it has no sub-millisecond precision. As for the Sleeps: a simple SpinWait would be enough for the functionality, but it spares the CPU better if there is more time until the next event. Btw, I upvoted you because I found your question interesting.
  • @hankd: The diff < 15f line is not quite correct because by default Thread.Sleep(1) takes about 15.5 ms. I actually fixed this in my libraries recently, which are not released yet, but here is the actual version that should work correctly.