What is the fastest I should run an NSTimer? - iPhone

What is the fastest I can run an NSTimer and still get reliable results? I've read that approaching 30 ms it STARTS to become useless, so where does it actually start becoming useless: 40 ms? 50 ms?

The docs say:
the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds
Sounds like if you want to be safe, you shouldn't use timers below 0.1 sec. But why not try it in your own app and see how low you can go?
You won't find a guarantee on this. NSTimers are opportunistic by nature since they run with the event loop, and their effective finest granularity will depend on everything else going on in your app in addition to the limits of whatever the Cocoa timer dispatch mechanisms are.
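For example, here's a quick way to try exactly that (a minimal Swift sketch; it assumes iOS 10+ for the block-based Timer API and a running main run loop, e.g. kicked off from viewDidLoad, and the 1 ms target is deliberately aggressive):

import Foundation

// Ask for a 1 ms repeating timer and log the interval actually
// achieved between firings.
var lastFire = Date()
_ = Timer.scheduledTimer(withTimeInterval: 0.001, repeats: true) { _ in
    let now = Date()
    print(String(format: "actual interval: %.1f ms", now.timeIntervalSince(lastFire) * 1000))
    lastFire = now
}

On a busy main thread you will typically see the logged intervals stretch well past the 1 ms you asked for.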

What's your definition of reliable? A 16 ms error in a 1 second timer is under 2% error, but in a 30 ms timer it is over 50% error.
NSTimers will wait for whatever is happening in the current run loop to finish, and timing errors can accumulate. For example, if you touch the display N times, all subsequent firings of a repeating NSTimer may be late by the cumulative time taken by up to N touch handlers (plus anything else that happened to run at the "wrong" time).
CADisplayLink timers will attempt to quantize time to the frame rate, assuming that no set of foreground tasks takes as long as a frame time.

Depends on what kind of results you are trying to accomplish. For NSTimer, an interval of 0.5-1.0 seconds is a good place to start for reliable results.

Related

Millisecond timer?

Does anyone know how to make a timer with a time interval of 0.001? I read somewhere that a Timer can't have a time interval lower than 0.02, and my timer is behaving that way.
My timer:
timer = Timer.scheduledTimer(timeInterval: 0.001, target: self, selector: #selector(self.updateTimer), userInfo: nil, repeats: true)
@objc func updateTimer() {
    milliseconds += 0.001
}
Yes, you can have time intervals less than 0.02 seconds. E.g. a 0.01 second interval is easily accomplished. But at 0.001 or 0.0001 seconds, you’re starting to be constrained by practical considerations of how much you could accomplish in that period of time.
But if your goal is to represent something in the UI, there’s rarely any point in exceeding the device’s maximum frame rate (usually 60 fps). If the screen can only be updated every 60th of a second, what’s the point in calculating more frequently than that? That’s just wasted CPU cycles. By the way, if we were trying to track the screen’s refresh, we’d use CADisplayLink, not Timer. It works much like a timer, but it fires in sync with the display’s refresh rate.
For what it’s worth, the idea of incrementing milliseconds to capture elapsed time is a bit of a non-starter. We never update an “elapsed time” counter like this, because even in the best case scenarios, the firing frequency is not guaranteed. (And because of the limitations of binary representations of fractional decimal values, we rarely want to accumulate floating point values like this, either.)
Instead, if we want to have the fastest possible updates in our UI, we’d save the start time, start a CADisplayLink, and every time the timer handler is called, calculate the elapsed time as the difference between current time and the start time.
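A minimal sketch of that pattern in Swift (the view controller and label are illustrative, and the label is assumed to be installed in the view hierarchy elsewhere):

import UIKit

class StopwatchViewController: UIViewController {
    private let elapsedLabel = UILabel()   // assumed to be added to the view hierarchy
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0

    func startStopwatch() {
        startTime = CACurrentMediaTime()   // save the start time once
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Derive elapsed time from the clock on every frame instead of
        // accumulating small floating-point increments.
        let elapsed = CACurrentMediaTime() - startTime
        elapsedLabel.text = String(format: "%.2f s", elapsed)
    }

    func stopStopwatch() {
        displayLink?.invalidate()
        displayLink = nil
    }
}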
Now, if you really needed a very precise timer, it can be done, but you should see Technical Note TN2169, which shows how to create a high-precision timer. But this is more of an edge-case scenario.
Short answer: You can't do that. On iOS and Mac OS, Timers are invoked by your app's run loop, and only get invoked when your app visits the main event loop. How often a short-interval timer actually fires will depend on how much time your app spends serving the event loop. As you say, the docs recommend a minimum interval of about 1/50th of a second.
You might look at using a CADisplayLink timer, which is tied to the display refresh. Handlers for those callbacks need to execute quickly in order not to interfere with the operation of your app.
Why do you think you need a timer that fires exactly every millisecond?

What is the granularity of the iPhone 4S timer?

Since the iPhone 4S is reported to have an 800 MHz clock, might the timer have a 200 MHz (5 ns) granularity? Does anyone know what the shortest loop period might be?
From the NSTimer reference page:
Because of the various input sources a typical run loop manages, the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds. If a timer’s firing time occurs during a long callout or while the run loop is in a mode that is not monitoring the timer, the timer does not fire until the next time the run loop checks the timer. Therefore, the actual time at which the timer fires potentially can be a significant period of time after the scheduled firing time.
Again, though, that's for NSTimer, which you generally use for triggering events that should happen at certain times or time intervals. If you're trying to measure elapsed time you can likely get a much more accurate result than 100ms, but I don't see a documented accuracy in the docs.
Try including mach/mach_time.h; mach_absolute_time() seems to provide microsecond-resolution timing values or better, and doesn't have the non-monotonicity problems that NSDate has.
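A minimal Swift sketch of using it (the timebase conversion is needed because mach_absolute_time() returns values in CPU-dependent tick units):

import Foundation

// Query the tick-to-nanosecond conversion ratio once.
var timebase = mach_timebase_info_data_t()
mach_timebase_info(&timebase)

let start = mach_absolute_time()
// ... the work being timed goes here ...
let end = mach_absolute_time()

// Convert elapsed ticks to nanoseconds, then report in milliseconds.
let elapsedNanos = (end - start) * UInt64(timebase.numer) / UInt64(timebase.denom)
print(String(format: "elapsed: %.3f ms", Double(elapsedNanos) / 1_000_000))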

Will having two NSTimers in one class cause them to be less accurate?

I have one NSTimer which is my main game loop; it fires 60 times a second. I have another timer which is a countdown, firing every 0.001 seconds. Is it normal that this timer is not accurate, or should I fire the countdown timer fewer times per second?
NSTimers are not accurate. The time interval you specify is simply a goal. The NSTimer will try to hit that goal but the more stuff you have running on that thread the slower the cycle time will be. Your 0.001 timer is probably going way too fast to be useful and will suffer from accuracy problems. If you need real time accuracy you will have to track what time the timer is actually firing with an NSDate and compensate accordingly.
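For example (a Swift sketch; the 60-second countdown and 0.05 s polling interval are illustrative, and it assumes a running main run loop):

import Foundation

// Derive the remaining time from the wall clock on every firing
// instead of accumulating the timer's nominal interval.
let countdownLength: TimeInterval = 60
let startDate = Date()
_ = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in
    let remaining = countdownLength - Date().timeIntervalSince(startDate)
    if remaining <= 0 {
        timer.invalidate()
        print("done")
    } else {
        print(String(format: "%.2f s left", remaining))
    }
}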
If I'm reading your question correctly, the second timer (which I assume is fired every 0.001 seconds) is not going to be accurate.
NSTimer has a resolution of about 50-100 milliseconds (0.05s-0.1s) and this can be significantly impacted if your run loop is loaded.

iOS - Speed Issues

Hey all, I've got a method of recording that writes the notes that a user plays to an array in real time. The only problem is that there is a slight delay, and each sequence is noticeably slowed down when playing back. I sped up playback by about 6 milliseconds, and it sounds right, but I was wondering if the delay would vary on other devices?
I've tested on an iPod touch (2nd gen); how would that perform on the 3rd and 4th gen, as well as iPhones? Do I need to test on all of them and find the optimal delay variation?
Any ideas?
More Info:
I use two NSThreads instead of timers, and fill an array with blank spots where no notes should play (I use integers; -1 is a blank). While recording, it adds a blank every 0.03 seconds. Every time the user hits a note, the most recent blank is replaced by a number 0-7. When playing back, the second thread is used (two threads because the second one has a shorter time interval), with an interval of 0.024 seconds. The 6 millisecond difference compensates for the delay between recording and playback.
I assume that either the recording or playing of notes takes longer than the other, and thus creates the delay.
What I want to know is if the delay will be different on other devices, and how I should compensate for it.
Exact Solution
I may not have explained it fully, that's why this solution wasn't provided, but for anyone with a similar problem...
I played each beat, similar to a MIDI file, like so:
while playing:
    play the beat
    create a date xyz seconds from now
    create a date for now
    while now is not past the date xyz seconds from now, wait
The obvious thing that I was missing was to create the two dates BEFORE playing the beat...
D'OH!
It seems more likely to me that the additional delay is caused by the playback of the note, or other compute overhead in the second thread. Grab the wallclock time in the second thread before playing each note, and check the time difference from the last one. You will need to reduce your following delay by any excess (likely 0.006 seconds!).
The delay will be different on different generations of the iphone, but by adapting to it dynamically like this, you will be safe as long as the processing overhead is less than 0.03 seconds.
You should do the same thing in the first thread as well.
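Sketched in Swift, the adaptive idea might look like this (playNote is a hypothetical stand-in for your actual playback call, and the 0.03 s step comes from your recording interval):

import Foundation

let step: TimeInterval = 0.03   // nominal gap between beats

func playNote(_ note: Int) {
    // hypothetical stand-in for the real playback call
}

func playBack(_ beats: [Int]) {
    for beat in beats {
        let before = Date()
        if beat != -1 {          // -1 marks a blank, per the question
            playNote(beat)
        }
        // Shorten the following delay by however long the playback
        // actually took, so the beat grid stays on time.
        let excess = Date().timeIntervalSince(before)
        Thread.sleep(forTimeInterval: max(0, step - excess))
    }
}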
Getting high resolution timestamps - there's a discussion on the Apple forums here, or this Stack Overflow question.

Most likely to be on time: +timerWithTimeInterval or -performSelector:withObject:afterDelay:

I would like to play a sound for only a short span of time, shorter than the duration of the sound file. Therefore, at the same time as I start playing, I would like to queue up a task that will stop the sound.
I see two choices for queuing up the stop method: NSTimer and performSelector:withObject:afterDelay:
Does anyone know which of the two is most likely to trigger on time, or have a higher priority? It is not imperative that I get called with millisecond accuracy, but accuracy to 0.1 second would be great.
Addendum: Does anyone know where I could find documentation on the priority of different timer and delay tasks? For example, how would the below rank:
NSTimer tasks
performSelector
the calling of a view's drawRect after setNeedsDisplay has been called
the calling of a layer's drawing routines after setNeedsDisplay has been called
any other delayed tasks
And would it be useful to try to do this in a different thread?
performSelector:withObject:afterDelay: will create a timer to know when to fire so both approaches should be equally (un-)reliable. If your code performs an expensive calculation on the main thread and doesn't return control to the run loop or if the run loop has to process lots of events, the timer won't fire on time.
Have an NSTimer, but with a much smaller time interval than the one your sound has to play for. When your timer fires (which will be several times while your sound plays), count how much time has passed since you started it. When that time exceeds a margin set by you, stop the sound.
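A minimal Swift sketch of that approach (player is assumed to be an already-configured AVAudioPlayer, and the 0.01 s polling interval is illustrative, comfortably below the 0.1 s accuracy target):

import AVFoundation

// Poll well below the required accuracy and stop the sound once the
// elapsed time crosses the cutoff.
func play(_ player: AVAudioPlayer, forDuration cutoff: TimeInterval) {
    let startDate = Date()
    player.play()
    _ = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { timer in
        if Date().timeIntervalSince(startDate) >= cutoff {
            player.stop()
            timer.invalidate()
        }
    }
}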
From my understanding, they should be pretty much the same. However, the best way to figure this out is to test it yourself. Run a bunch of tests and record the exact time, then see which is closer.