I'm trying to acquire the current location on an iOS device (specifically, my iPhone).
I'm using this Apple example.
The timeout before I call stopUpdatingLocation is 60 seconds.
When I set desiredAccuracy to kCLLocationAccuracyHundredMeters, only 3 newLocations arrive at didUpdateToLocation:fromLocation:. The first one is the cached one, from long ago. The next two ALSO have timestamps more than 15 seconds old, despite the fact that all 3 arrive within a 5-second interval. All three have poor horizontal accuracy.
On the other hand, when I set desiredAccuracy to kCLLocationAccuracyNearestTenMeters, newLocations continue to arrive until I get an appropriately accurate one.
My WiFi is off and I'm indoors.
My question is: why do I stop receiving updates early when using kCLLocationAccuracyHundredMeters?
Is it because updates stop once a returned location satisfies the requested accuracy? If that is not the case, can you show any logs or anything else to explain the situation, in case I have missed the point?
Related
I realize the answer will most likely vary based on the desired accuracy. I'm most interested in 3km accuracy (kCLLocationAccuracyThreeKilometers), but data for the other levels would also be useful.
I suppose I'm not exactly looking for the average time, but rather the point in time at which I should move on and assume I'm not going to get any more accurate locations. In my use case, the GPS coordinates are not essential to my app, but they are highly useful.
At that distance, it is unlikely GPS will be used, as the OS will opt for cell tower or WiFi triangulation. Therefore, the time is likely to be less than 42 seconds, which seems very high in its own right.
Although I have no specific data on this, I have observed, through testing our own app, that geolocation takes roughly ten to twenty seconds.
I want to be able to rank users based on how quickly they have completed each level. I want this to be an overall leaderboard, i.e. shortest overall time across all levels.
The problem here is that for each level completed, the total completion time goes up. But I want the leaderboard to take that into account, so that a user who has completed 10 levels ranks more highly than someone with only 1 completed level.
How can I create some kind of score based on this, before submitting the time to the leaderboard?
You could modulate the total time by the number of levels completed, then for each level completed reduce it by a set amount, so that people who complete all levels with a given average time will score better than people with the same average time but fewer levels completed.
My Preferred Method:
Or you could express it with a score value.
Level complete = 1,000 points.
Each level has a set time-limit bonus; the longer you take, the less bonus you get.
For example:
I complete the level in 102 secs; the goal time is 120 secs.
I get 1,000 points for completion and 1,500 points for each second
that I beat the goal time by.
This way I will get 1,000 + (18 * 1,500) = 28,000 points.
The next guy completes in 100 secs.
He gets 1,000 + (20 * 1,500) = 31,000 points.
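The scoring scheme above can be sketched in a few lines (Python for clarity; the 1,000-point completion bonus and 1,500-points-per-second values are the example numbers from the answer):

```python
COMPLETION_BONUS = 1000   # flat award for finishing the level
PER_SECOND_BONUS = 1500   # award per second under the goal time

def level_score(completion_secs, goal_secs):
    """Score a level: completion bonus plus a bonus for each second
    the player finished under the goal time (no penalty if over)."""
    seconds_under_goal = max(0, goal_secs - completion_secs)
    return COMPLETION_BONUS + seconds_under_goal * PER_SECOND_BONUS

# The two examples from the answer:
print(level_score(102, 120))  # 28000
print(level_score(100, 120))  # 31000
```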
I suggest adding a default amount of time to the total for each incomplete level. So, say, if a player beats a new level in 3 minutes, that replaces a 10 minute placeholder time, and they 'save' 7 minutes from the total.
Without that kind of trick, Game Center has no provision for multi-factor rankings.
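The placeholder idea can be sketched like this (Python for clarity; the 10-minute placeholder and 3-minute completion are the example values above, expressed in seconds):

```python
PLACEHOLDER_SECS = 600  # 10-minute default charged for each incomplete level

def leaderboard_total(completed_times, total_levels):
    """Total time submitted to the leaderboard: real times for completed
    levels plus the placeholder penalty for every incomplete one."""
    incomplete = total_levels - len(completed_times)
    return sum(completed_times) + incomplete * PLACEHOLDER_SECS

# Beating a new level in 3 minutes replaces a 10-minute placeholder,
# 'saving' 7 minutes (420 seconds) from the total:
before = leaderboard_total([], 10)    # 6000
after = leaderboard_total([180], 10)  # 5580
print(before - after)                 # 420
```

This keeps the leaderboard a single ascending number, which is all GameKit allows, while still rewarding progress.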
Leaderboard scores in GameKit have to be expressed as a single number (see this section of the GameKit Programming Guide), so that won't be possible.
Your best bet would be to just have a completion time leaderboard for people who have completed all the levels, and maybe another leaderboard (or a few) for people who have completed a smaller number of levels.
Hey all, I've got a recording method that writes the notes a user plays to an array in real time. The only problem is that there is a slight delay, and each sequence is noticeably slowed down when playing back. I sped up playback by about 6 milliseconds and it sounds right, but I was wondering whether the delay would vary on other devices?
I've tested on an iPod touch 2nd gen; how would that perform on the 3rd and 4th gen, as well as on iPhones? Do I need to test on all of them and find the optimal delay variation?
Any ideas?
More Info:
I use two NSThreads instead of timers, and fill an array with blanks where no notes should play (I use integers; -1 is a blank). While recording, it adds a blank every 0.03 seconds. Every time the user hits a note, the most recent blank is replaced by a number 0-7. When playing back, the second thread is used (two threads because the second one has a shorter time interval), with an interval of 0.024 seconds. The 6-millisecond difference compensates for the delay between recording and playback.
I assume that either the recording or playing of notes takes longer than the other, and thus creates the delay.
What I want to know is if the delay will be different on other devices, and how I should compensate for it.
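For reference, the recording scheme described above can be sketched in Python (the 0.03-second tick, the -1 blank marker, and the 0-7 note range come from the description; `record` is a hypothetical helper for illustration):

```python
TICK = 0.03  # recording resolution in seconds, from the description
BLANK = -1   # integer used for slots where no note should play

def record(events, duration_secs):
    """Build the note array: one slot per tick, BLANK by default;
    each (time, note) event overwrites its nearest slot (notes 0-7)."""
    slots = round(duration_secs / TICK)
    track = [BLANK] * slots
    for t, note in events:
        index = min(round(t / TICK), slots - 1)
        track[index] = note
    return track

# Two notes over a 0.15-second recording (five 0.03 s slots):
print(record([(0.00, 3), (0.09, 5)], 0.15))  # [3, -1, -1, 5, -1]
```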
Exact Solution
I may not have explained it fully, that's why this solution wasn't provided, but for anyone with a similar problem...
I played each beat similar to a midi file like so:
    while playing:
        do stuff to play the beat
        dateXYZ = new date, xyz seconds from now
        dateNow = new date, now
        while dateNow is not > dateXYZ: wait
The obvious thing that I was missing was to create the two dates BEFORE playing the beat...
D'OH!
It seems more likely to me that the additional delay is caused by the playback of the note, or other compute overhead in the second thread. Grab the wallclock time in the second thread before playing each note, and check the time difference from the last one. You will need to reduce your following delay by any excess (likely 0.006 seconds!).
The delay will be different on different generations of the iphone, but by adapting to it dynamically like this, you will be safe as long as the processing overhead is less than 0.03 seconds.
You should do the same thing in the first thread as well.
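The compensation described above can be sketched in Python (assumptions for illustration: `play_beat` is a hypothetical stand-in for whatever work actually plays the note, and 0.03 seconds is the interval from the question):

```python
import time

BEAT_INTERVAL = 0.03  # desired time between beats, in seconds

def play_beats(track, play_beat):
    """Play each beat, then sleep only for whatever remains of the
    interval after subtracting the time the beat itself took."""
    for note in track:
        start = time.monotonic()          # wallclock before playing
        play_beat(note)                   # may take a few milliseconds
        elapsed = time.monotonic() - start
        remaining = BEAT_INTERVAL - elapsed
        if remaining > 0:                 # skip the sleep if we overran
            time.sleep(remaining)
```

Because the sleep shrinks by however long the playback work took, the loop holds the target tempo regardless of how slow the device is, as long as the work itself stays under one interval.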
Getting high-resolution timestamps - there's a discussion on the Apple forums here, or this Stack Overflow question.
I want to know why the location manager on the iPhone gives wrong coordinates the first time the application runs. Because of this, my distance reads 100 meters at the start of the application, and my average speed is also affected.
Each location you receive will have a horizontal accuracy. If the accuracy is above some threshold, say 10 meters, then disregard it. It will take longer to get an accurate read. A negative accuracy means unknown and should also be discarded.
You could also keep your current logic, but reset all data the first time the accuracy is below your threshold. You will still be disregarding inaccurate data, but you can give the user some initial feedback the way map programs do.
Which approach to use depends on your application.
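The filtering logic can be sketched like this (Python for clarity; the 10-meter threshold is the example value above, and the staleness check is an extra assumption added because the cached first fix usually carries an old timestamp):

```python
ACCURACY_THRESHOLD = 10.0  # metres; tune for your application
MAX_AGE_SECS = 15.0        # assumed cutoff for stale (cached) fixes

def is_usable(horizontal_accuracy, age_secs):
    """Accept a fix only if its accuracy is known (non-negative),
    within the threshold, and the fix is recent enough."""
    if horizontal_accuracy < 0:     # negative accuracy means unknown
        return False
    if horizontal_accuracy > ACCURACY_THRESHOLD:
        return False
    return age_secs <= MAX_AGE_SECS

print(is_usable(8.0, 2.0))    # True
print(is_usable(-1.0, 2.0))   # False: unknown accuracy
print(is_usable(65.0, 2.0))   # False: too inaccurate
```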
I want to create a button that lets the user tap on it to set a beats-per-minute value. I will also use touches moved up and down on it to adjust faster and slower (I have already worked out this bit).
What are some appropriate ways to get the times the user has tapped the button, so I can compute an average time between presses and thereby work out a tempo?
Overall
You're best off using a C-level timer from time.h instead of an NSDate. At the rate of beats, the overhead of creating an NSDate could result in an important loss of precision.
Note, though, that time() and time_t only have one-second resolution, which is far too coarse for beats; use gettimeofday() (or mach_absolute_time()) to get sub-second timestamps.
Use the whole screen for this, don't just give the user 1 small button.
Two ideas
Post-process
Store all times in an array.
Trim the result. Remove elements from the start and end that are more than a threshold from the average.
Get the average from the remaining values. That's your speed.
If it's close to a common value, use that.
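The post-process approach can be sketched in Python (a slight variant for illustration: it trims any interval further than a threshold from the raw average, rather than only elements at the start and end):

```python
def trimmed_bpm(tap_times, threshold=0.25):
    """Estimate tempo from tap timestamps (in seconds): take the
    intervals, drop those further than `threshold` seconds from the
    raw average, then convert the trimmed average interval to BPM."""
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    avg = sum(intervals) / len(intervals)
    kept = [i for i in intervals if abs(i - avg) <= threshold]
    return 60.0 / (sum(kept) / len(kept))

# Taps roughly every 0.5 s, with one long outlier that gets trimmed:
print(round(trimmed_bpm([0.0, 0.5, 1.0, 1.5, 3.0, 3.5])))  # 120
```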
Adaptive
Use 2 variables. One is called speed and the other error.
After the first 2 beats calculate the estimated speed, set error to speed.
Create queue = Fifo(5) once up front (a first-in, first-out queue; try out different values for the length). Then, after each beat:
    currentBeat = now - timeOfLastBeat
    currentError = |speed - currentBeat|
    # adapt
    error = (error + currentError) / 2  # experiment with how much weight
                                        # currentError should have
    queue.push(currentBeat)  # push the newest interval onto the queue;
                             # the oldest is removed automatically
    speed = average(queue)
As soon as error gets smaller than a certain threshold you can stop and tell the user you've determined the speed.
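A runnable Python sketch of the adaptive scheme above (the window length of 5 and the seeding of error with the first interval follow the pseudocode; `TempoEstimator` is a hypothetical name):

```python
from collections import deque

class TempoEstimator:
    """Adaptive tempo estimator: keep the last few intervals in a FIFO,
    average them for the speed, and smooth an error term that shrinks
    as the taps settle onto a steady tempo."""
    def __init__(self, window=5):
        self.queue = deque(maxlen=window)  # drops the oldest automatically
        self.speed = None
        self.error = None

    def beat(self, interval_secs):
        if self.speed is None:             # first interval: seed the state
            self.error = interval_secs
        else:
            current_error = abs(self.speed - interval_secs)
            self.error = (self.error + current_error) / 2
        self.queue.append(interval_secs)
        self.speed = sum(self.queue) / len(self.queue)
        return self.speed, self.error

est = TempoEstimator()
for interval in [0.5, 0.5, 0.5, 0.5, 0.5]:
    speed, error = est.beat(interval)
print(speed, error < 0.1)  # steady taps: speed 0.5, error shrinking
```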
Go crazy with the interface. Make the screen flash whenever the user taps. Extra sparks for a tap that is nearly identical to the expected time.
Make the background color correspond to the error. Make it brighter the smaller the error gets.
Each time the button is pressed, store the current date/time (with [NSDate date]). Then, the next time it's pressed, you can calculate the difference with -[previousDate timeIntervalSinceNow] (negative because it's subtracting the current date from the previous), which will give you the number of seconds.
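In Python terms, the bookkeeping is just: remember the previous timestamp, and on each press report the difference (`TapTimer` is a hypothetical name for illustration):

```python
import time

class TapTimer:
    """Store the time of each press; on the next press, report the
    number of seconds since the previous one (None for the first tap)."""
    def __init__(self):
        self.previous = None

    def tap(self, now=None):
        now = time.monotonic() if now is None else now
        interval = None if self.previous is None else now - self.previous
        self.previous = now
        return interval

t = TapTimer()
print(t.tap(10.0))   # None: first tap has nothing to compare against
print(t.tap(10.5))   # 0.5 seconds between presses
```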