Swift: Is there a maximum timer speed?

For my question, I am referencing Timer in Swift, from the Apple Documentation:
Timer.scheduledTimer(timeInterval ti: TimeInterval,
                     target aTarget: Any,
                     selector aSelector: Selector,
                     userInfo: Any?,
                     repeats yesOrNo: Bool) -> Timer
My question is about the timeInterval parameter: is there a minimum value greater than 0 below which shortening the interval no longer makes any difference? I ask because in my tests, intervals of 0.00001 and 0.00000001 did not produce any noticeable difference in firing rate. I did not come across an answer in the documentation or in Google research.

In iOS, an NSTimer fires on the run loop of the main thread, so the maximum rate at which it can reliably fire is usually between 30 and 60 times per second (perhaps up to 120 times per second on a newer iPad Pro), tied to the display frame rate. Thus, the shortest reliable time interval is usually in the range of 8 to 33.3 milliseconds. Certainly not 10 µs.
A more reliable approach is CADisplayLink, which is locked to the display frame rate of 60 frames per second more dependably than an NSTimer, which seems to have lower priority and thus greater latency and fire-time jitter. For finer timing resolution, you might try a GCD dispatch timer, which supports a leeway parameter, on a dedicated queue, or even sit in a spin loop polling a mach timer.
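For illustration, a GCD dispatch timer with an explicit leeway might be set up like this (the queue label and the 1 ms interval are assumptions for the sketch, not values from the question):

```swift
import Dispatch

// Sketch: a repeating GCD timer on a dedicated queue. The leeway tells
// the system how far it may defer firing to coalesce wakeups; a small
// leeway requests tighter (but more power-hungry) timing.
let queue = DispatchQueue(label: "fast.timer", qos: .userInteractive)
let timer = DispatchSource.makeTimerSource(queue: queue)
timer.schedule(deadline: .now(),
               repeating: .milliseconds(1),
               leeway: .microseconds(100))
timer.setEventHandler {
    // High-frequency work goes here; keep it short.
}
timer.resume()
```

Even so, the actual firing granularity is still at the mercy of the scheduler, so measure rather than assume.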

Related

What happens to Flutter Ticker if the UI misses frames due to bad performances?

I read in the Ticker class documentation that:
"Ticker class calls its callback once per animation frame."
I am using createTicker(TickerCallback onTick) to implement a stopwatch, so I need the elapsed value passed to the TickerCallback to be extremely precise (i.e. I need that after 5 seconds, the value of elapsed is exactly 5 seconds).
Now my question is: what happens if I have a sluggish, badly coded UI that misses a lot of frames due to bad optimization? I can think of two cases:
The time of the stopwatch gets updated not at 60fps (because of my bad coding) but once it gets updated, the time being displayed is correct
The time displayed is wrong
Other?
Which is the case, and (most importantly) why? Also, considering the above, is it advisable to use a ticker for a stopwatch? Thanks
To answer your question(s):
1. The time of the stopwatch gets updated at less than 60fps (because of your bad coding), but once it gets updated, the time being displayed is correct.
(If the phone runs at 120 fps, that does not mean it will fast-forward time :) )
Flutter aims to provide 60 frames per second (fps) performance, or 120 fps performance on devices capable of 120Hz updates. For 60fps, frames need to render approximately every 16ms. Jank occurs when the UI doesn't render smoothly.
So you may use a ticker, and even if the animation is sluggish, it will still display the right time. Any frame delays are delays of the animation, not of the time passed. For example, if a one-second delay happens at second 3, the next screen update will show the correct elapsed time; the frame was late, but the timer kept running.
Also, considering the above, is it advisable to use a ticker for a stopwatch?
It is. In the worst case you will have dropped frames and jumping seconds, but the timer will be exact.

Millisecond timer?

Does anyone know how to make a timer with a time interval of 0.001? I read somewhere that a Timer can't have a time interval lower than 0.02, and my timer is behaving that way.
My timer:
timer = Timer.scheduledTimer(timeInterval: 0.001, target: self, selector: #selector(self.updateTimer), userInfo: nil, repeats: true)

@objc func updateTimer() {
    milliseconds += 0.001
}
Yes, you can have time intervals less than 0.02 seconds. E.g. 0.01 second interval is easily accomplished. But at 0.001 or 0.0001 seconds, you’re starting to be constrained by practical considerations of how much you could accomplish in that period of time.
But if your goal is to represent something in the UI, there’s rarely any point in exceeding the device’s maximum frames per second (usually 60 fps). If the screen can only be updated every 60th of a second, what’s the point in calculating more frequently than that? That’s just wasted CPU cycles. By the way, if we were trying to achieve optimal screen refresh rates, we’d use CADisplayLink, not Timer. It works much like a timer, but is timed to fire in step with the screen’s refresh rate.
For what it’s worth, the idea of incrementing milliseconds to capture elapsed time is a bit of a non-starter. We never update an “elapsed time counter” like this, because even in the best-case scenarios, the firing frequency is not guaranteed. (And because of the limitations of binary representations of fractional decimal values, we rarely want to be adding up floating-point values like this, either.)
Instead, if we want to have the fastest possible updates in our UI, we’d save the start time, start a CADisplayLink, and every time the timer handler is called, calculate the elapsed time as the difference between current time and the start time.
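A sketch of that pattern (the class and property names here are illustrative assumptions, not code from the question):

```swift
import QuartzCore  // CADisplayLink, CACurrentMediaTime
import Foundation

// Sketch: never accumulate elapsed time; derive it from a saved start time.
final class Stopwatch: NSObject {
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0

    func start() {
        startTime = CACurrentMediaTime()
        let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Exact no matter how many frames were dropped.
        let elapsed = CACurrentMediaTime() - startTime
        // Update the UI label from `elapsed` here.
        _ = elapsed
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}
```

Because elapsed is recomputed from the clock on every fire, dropped frames cost you display updates, not accuracy.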
Now, if you really need a very precise timer, it can be done, but you should see Technical Note TN2169, which shows how to create a high-precision timer. But this is more of an edge-case scenario.
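For illustration, a high-precision wait in the spirit of that technote can be built on mach_wait_until (a sketch assuming a Darwin platform; this is not the technote’s exact code):

```swift
import Darwin

// Sketch: convert a nanosecond delay into mach "absolute time" ticks
// using the timebase, then block until that deadline.
var timebase = mach_timebase_info_data_t()
mach_timebase_info(&timebase)

func nanosecondsToAbsolute(_ ns: UInt64) -> UInt64 {
    ns * UInt64(timebase.denom) / UInt64(timebase.numer)
}

// Wait 500 µs with sub-millisecond precision.
mach_wait_until(mach_absolute_time() + nanosecondsToAbsolute(500_000))
```

TN2169 additionally shows how to raise the thread’s scheduling policy so the wakeup is honored promptly.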
Short answer: you can't do that. On iOS and macOS, Timers are invoked by your app's run loop, and only fire when your app visits the main event loop. How often a short-interval timer actually fires depends on how much time your app spends servicing the event loop. As you say, the docs recommend a minimum interval of about 1/50th of a second.
You might look at using a CADisplayLink timer, which is tied to the display refresh. Handlers for those timers need to run fast in order not to interfere with the operation of your app.
Why do you think you need a timer that fires exactly every millisecond?

what is the granularity of the iPhone 4S timer?

Since the iPhone 4S is reported to have an 800 MHz clock, might the timer have a 200 MHz (5 ns) granularity? Does anyone know what the shortest loop period might be?
From the NSTimer reference page:
Because of the various input sources a typical run loop manages, the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds. If a timer’s firing time occurs during a long callout or while the run loop is in a mode that is not monitoring the timer, the timer does not fire until the next time the run loop checks the timer. Therefore, the actual time at which the timer fires potentially can be a significant period of time after the scheduled firing time.
Again, though, that's for NSTimer, which you generally use for triggering events that should happen at certain times or time intervals. If you're trying to measure elapsed time you can likely get a much more accurate result than 100ms, but I don't see a documented accuracy in the docs.
Try including mach/mach_time.h; mach_absolute_time() provides microsecond-resolution timing values or better, and doesn't have the non-monotonicity problems that NSDate does.
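A sketch of measuring an interval this way (assuming a Darwin platform):

```swift
import Darwin

// Sketch: mach_absolute_time() returns ticks on a monotonic clock.
// Multiply by the timebase fraction to convert ticks to nanoseconds.
var info = mach_timebase_info_data_t()
mach_timebase_info(&info)

let start = mach_absolute_time()
// ... code to be measured ...
let end = mach_absolute_time()

let nanos = (end - start) * UInt64(info.numer) / UInt64(info.denom)
print("elapsed: \(nanos) ns")
```

On many Apple devices the timebase is 1/1, but you should always apply the conversion rather than assume it.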

Will having two NSTimers in one class cause them to be less accurate?

I have one NSTimer which is my main game loop; it fires 60 times a second. I have another timer which is a countdown, which gets called every 0.001 seconds. Is it normal that this timer is not accurate, or should I fire the countdown timer fewer times per second?
NSTimers are not accurate. The time interval you specify is simply a goal; the NSTimer will try to hit it, but the more work you have running on that thread, the slower the cycle time will be. Your 0.001-second timer is probably firing far too fast to be useful and will suffer from accuracy problems. If you need real-time accuracy, you will have to track when the timer actually fires with an NSDate and compensate accordingly.
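A sketch of that compensation (the class name and the 50 ms tick are assumptions; any tick interval works, because the remaining time is recomputed from the clock rather than accumulated):

```swift
import Foundation

// Sketch: don't trust the nominal interval; recompute remaining time
// from the wall clock on every fire.
final class Countdown {
    private var start = Date()
    private var timer: Timer?

    func begin(duration: TimeInterval, onTick: @escaping (TimeInterval) -> Void) {
        start = Date()
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] t in
            guard let self = self else { return }
            let remaining = duration - Date().timeIntervalSince(self.start)
            onTick(max(0, remaining))
            if remaining <= 0 { t.invalidate() }
        }
    }
}
```

Late fires then only delay the display; they never make the countdown drift.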
If I'm reading your question correctly, the second timer (which I assume is fired every 0.001 seconds) is not going to be accurate.
NSTimer has a resolution of about 50-100 milliseconds (0.05s-0.1s) and this can be significantly impacted if your run loop is loaded.

What is the fastest I should run an NSTimer?

What is the fastest I can run an NSTimer and still get reliable results? I've read that approaching 30ms it STARTS to become useless, so where does it "start to start becoming useless"...40ms? 50ms?
The docs say:
the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds
Sounds like if you want to be safe, you shouldn't use timers below 0.1 sec. But why not try it in your own app and see how low you can go?
You won't find a guarantee on this. NSTimers are opportunistic by nature since they run with the event loop, and their effective finest granularity will depend on everything else going on in your app in addition to the limits of whatever the Cocoa timer dispatch mechanisms are.
What's your definition of reliable? A 16 ms error in a 1-second timer is under 2% error, but in a 30 ms timer it is over 50% error.
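The arithmetic behind that comparison, as a trivial sketch (the 16 ms figure is roughly one display frame of latency):

```swift
// Relative error introduced by a fixed firing latency at a given interval.
func relativeError(latency: Double, interval: Double) -> Double {
    latency / interval
}

let oneSecond = relativeError(latency: 0.016, interval: 1.0)    // 0.016, under 2%
let thirtyMs  = relativeError(latency: 0.016, interval: 0.030)  // ≈0.53, over 50%
```

The same absolute jitter matters more and more as the interval shrinks, which is why very short timer intervals are the first to become meaningless.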
NSTimers will wait for whatever is happening in the current run loop to finish, and any errors in time can accumulate. e.g. if you touch the display N times, all subsequent repeating NSTimer firings may be late by the cumulative time taken by 0 to N touch handlers (plus anything else that was running at the "wrong" time). etc.
CADisplayLink timers will attempt to quantize time to the frame rate, assuming that no set of foreground tasks takes as long as a frame time.
Depends on what kind of results you are trying to accomplish. With NSTimer, 0.5-1.0 seconds is a good place to start for reliable results.