iOS sensor fusion is laggy and inaccurate, can I fix that?

I'm playing around with an app that overlays an image on top of an AV view, a sort-of-but-not-really AR app. You use it by holding the phone up in front of you, like you're taking a picture of friends. Currently, it just displays a compass rose on a SceneKit object.
It basically works, but on my 5S and 6 I'm finding it is:
1. laggy on startup, with the pointing vector off by as much as 180 degrees
2. not terribly accurate, with errors of something like 10 degrees even after it's been running for a while
3. unstable, in that the error in (2) changes over time, so its idea of "south" moves around
Ultimately I'm going to need the accuracy to be on the order of 5 degrees. Can anyone comment on whether this sort of accuracy is possible, if there's a way to test it, and perhaps comment on platform issues - is it better on the 7 for instance?
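For concreteness, the kind of Core Motion setup involved looks roughly like this (a minimal Swift sketch; the true-north reference frame, the 60 Hz rate, and the sign flip are assumptions, not a known-good recipe):

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startHeadingUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    // Fuses gyro + accelerometer + magnetometer against true north.
    // This frame needs location services (for magnetic declination)
    // and typically takes a few seconds to converge after startup,
    // which may account for some of the launch-time lag.
    motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical,
                                           to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // yaw is rotation about the vertical axis in radians; the
        // negation converts counterclockwise yaw to a clockwise
        // compass-style heading (sign convention is an assumption).
        let headingDegrees = -attitude.yaw * 180.0 / .pi
        print("heading: \(headingDegrees)")
    }
}
```

Setting motionManager.showsDeviceMovementDisplay = true also lets the system show the figure-eight calibration prompt when the magnetometer is badly off, which is worth watching for while testing accuracy.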

Related

How to get mobile yaw angle for Unity gyroscope?

I use Unity Remote 5 to connect my phone to the Editor. I am reading the values of Input.gyro.attitude.eulerAngles and am somewhat confused, because no matter how I orient my device, all the angles change. I was hoping to get a specific one that reflects the camera's yaw rotation: an angle that would change only if I rotate my device as in the image below, and stay constant if I rotate it differently.
P.S. When I observe Debug.Log(gyroInp.attitude.eulerAngles), the Z angle keeps growing even when the phone is lying still on the table: from 276 to 350, and it continues to grow. Is it because the earth is turning? :)
Unity Remote 5 (UR5) seems to be broken for reading gyro as indicated by this (more specific) issue in Unity's issue tracker: https://issuetracker.unity3d.com/issues/ios-13-dot-0-rotations-around-device-y-axis-does-not-work-when-rotating-a-device-and-using-remote
If I use UR5 to read gyro attitude/acceleration in the editor, it gives an initial value that is never updated, just repeated for every read after that. The actual build on my (iOS 14) device works as expected.
So Unity Remote 5 is unfortunately broken for all things gyro; it may be an iOS-specific issue.
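For what it's worth, if you verify in a native build, Core Motion can hand you yaw in isolation instead of full Euler angles. This is a Swift sketch rather than Unity C#, and the reference-frame choice is an assumption, but the idea carries over: with Z locked to gravity, yaw is exactly the rotation about the vertical axis and stays put under pitch and roll:

```swift
import CoreMotion

let motion = CMMotionManager()
motion.deviceMotionUpdateInterval = 1.0 / 30.0
// xArbitraryZVertical keeps the frame's Z axis aligned with gravity,
// so attitude.yaw only changes when the device turns about vertical.
motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { dm, _ in
    guard let attitude = dm?.attitude else { return }
    print("yaw (deg): \(attitude.yaw * 180.0 / .pi)")
}
```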

Possible to measure distance with an iPhone and laser pointer?

I want to make this phone app that can measure distance. I want to know how far away something is from the phone. So let's say I want to know how far away a wall is...I'd like my phone to tell me.
So how might this work? Well, I can shine an ordinary red laser pointer against the wall (in the dark) and have the phone's camera "see" the dot.
The further away the phone is from the dot, the smaller the dot will be. The picture below shows the dot from 1 foot, 5 feet, 10 feet, 20 feet, and 25 feet away. I think the app could then measure the size of the dot and figure out how far away the dot actually is from the phone. And then it could use a simple ratio or formula to determine distance for other sizes.
So my question is: Would this likely work for measuring distance?
As long as you can clearly identify the laser dot, I would say: yes. The problem I see is in doing the identifying.
The distance measuring is then just some maths and physics (I'm not sure exactly what to use, but there is probably some useful optics here... the intercept theorem?), or you can build an "algorithm" empirically by testing (but then accuracy could be a problem ;))
So, I think the dual laser pointer idea is superior to the single laser pointer idea.
But, because the camera is moving further and further away from the wall, I believe angling the laser pointers is a non-solution.
Instead, I think keeping the laser pointers parallel is the solution. This way, the further back you go, the closer together the dots will look in the photo, even though the beams physically remain the same distance apart. Then you can easily come up with a formula relating the dot separation in the photo to the distance, as sketched below.
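Under a pinhole-camera assumption that formula is a one-liner; everything below (names, the calibration value) is illustrative, not tested:

```swift
import Foundation

/// Pinhole-camera estimate: two parallel beams a fixed physical
/// distance apart project to dots whose pixel separation shrinks
/// linearly with distance to the wall.
/// - beamSpacingMeters: physical distance between the parallel lasers
/// - pixelSeparation: measured dot separation in the photo, in pixels
/// - focalLengthPixels: focal length expressed in pixels (calibrate
///   once by photographing the dots at a known distance)
func estimateDistance(beamSpacingMeters: Double,
                      pixelSeparation: Double,
                      focalLengthPixels: Double) -> Double {
    return focalLengthPixels * beamSpacingMeters / pixelSeparation
}

// Example: beams 10 cm apart, dots measured 42 px apart, f ≈ 3000 px
// gives roughly 7.1 m to the wall.
print(estimateDistance(beamSpacingMeters: 0.10,
                       pixelSeparation: 42,
                       focalLengthPixels: 3000))
```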

Gravity as frame of reference in accelerometer data in iOS

I'm working on an iPhone app for motorcyclists that will detect a crash after it has occurred. Currently we're in the data acquisition process, plotting graphs and looking at the data. What I need to log is the forward user acceleration and the tilt angle of the bike relative to the bike standing upright on the road.
I can get the magnitude of the user acceleration vector, i.e. how hard the rider is accelerating, by taking the sqrt of the squared x, y and z accelerometer values. But for the tilt angle I need a reference that is constant, so I thought: let's use the gravity vector. Now, I realize that the deviceMotion API has gravity and userAcceleration values; where do these values come from and what do they mean? If I take the sqrt of the squared x, y and z components of gravity, will that always give me my up direction? How can I use that to find the tilt angle of the bike relative to an upright bike on the road? Thanks.
Setting aside the "why" of doing this...
You need a very low-pass filter. Once the phone is put wherever-it-rides on the bike, you'll have various accelerations from maneuvers, plus the acceleration from gravity ever-present in the background. That gives you an ongoing vector for "down", and you can then interpret the accel data in that context... Forward accel would tip the bike the opposite way from braking, so I think you could sort out the forward direction in real time too.
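A minimal sketch of that filter (the 50 Hz rate and 0.98 smoothing factor are assumptions to tune; Core Motion's deviceMotion would hand you gravity/userAcceleration pre-split, but this shows the mechanism):

```swift
import CoreMotion

let motionManager = CMMotionManager()
var gravityEstimate = SIMD3<Double>(0, 0, -1)  // seeded guess for "down"
let alpha = 0.98  // closer to 1 = heavier smoothing

motionManager.accelerometerUpdateInterval = 1.0 / 50.0
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    let raw = SIMD3(a.x, a.y, a.z)
    // Very low-pass filter: maneuver accelerations average out over
    // time, leaving the ever-present 1 g gravity vector as "down".
    gravityEstimate = alpha * gravityEstimate + (1 - alpha) * raw
    // What's left after removing gravity is the maneuver acceleration.
    let maneuver = raw - gravityEstimate
    _ = maneuver
}
```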
Very interesting idea.
Assuming that it's not a "joke question", you will need a reference point to compare against, i.e. the position taken when the user clicks "start". Then you can use acos(currentGravity.z / |referenceGravity|) to get the tilt angle, with |referenceGravity| == 1 because Core Motion measures accelerations in g.
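In code, that is just the arccos of the dot product between the current and reference gravity vectors; a sketch (normalizing guards against the vectors not being exactly 1 g):

```swift
import Foundation
import CoreMotion
import simd

var referenceGravity: simd_double3?  // captured when the user taps "start"

/// Tilt angle (radians) between current gravity and the reference
/// captured while the bike was known to be upright. Core Motion
/// reports gravity in g, so both vectors are already ~unit length.
func tiltAngle(current: CMAcceleration) -> Double? {
    guard let ref = referenceGravity else { return nil }
    let cur = simd_double3(current.x, current.y, current.z)
    let cosTilt = dot(normalize(cur), normalize(ref))
    // Clamp against floating-point values just outside [-1, 1].
    return acos(max(-1.0, min(1.0, cosTilt)))
}
```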
But to be honest there are a couple of problems. For instance:
The device has to be in a fixed position when taking the reference frame; if you put it in a pocket and it's just moving around a little inside, your measurement is rubbish.
Hmm, the driver is dead but the device is alive? Chances are good that the iPhone won't survive either.
If an app goes to the background, Core Motion falls asleep and stops delivering values.
It would have to be an in-house app, because forget about getting App Store approval.
Or did we misunderstand you and it's just a game?
Since this is not a joke, I would like to address the mounting issue. How to interpret the data depends largely on how the iPhone is positioned, and some issues might not be apparent to those who don't actually ride motorcycles.
Particularly when it comes to going around curves/corners: in low-speed turns the motorcycle leans but the rider does not, or leans only slightly; in higher-speed turns both the rider and the motorcycle lean. This could present an issue if not addressed. I won't cover all scenarios, but...
For example, most modern textile motorcycle jackets have a cell phone pocket just inside on the left. If the user were to put their phone in this pocket, you could expect to see only 'accelerating' and 'braking' (~z) acceleration. In this scenario you would almost never see significant side-to-side (~x) acceleration, because the rider leans proportionally into the g-force of the turn. So while going around a curve one would expect to see an increase in (y) down from its general 1 g state. Essentially the rider's torso is indexed to gravity as far as (x) measurements go.
If the device were mounted to the bike you would have to adjust for what you would expect to see given that mounting point.
As far as the heuristics of the crash-detection algorithm go, that is very hard to define. Some crashes are like you see on television: the bike flips, ripping into a million pieces. That kind of crash should be extremely easy to detect. Huh, 3 g's measured? Crash! But what about simple downs (the bike lies on its side, oops, rider gets up, picks up the bike, rides away)? They might occur without any particularly remarkable g-forces (with the exception of about 1 g left or right on the x axis).
A couple more suggestions:
Sensitivity adjustment, maybe even with some sort of learn mode (where the user puts the device in this mode and rides, the device then records/learns average riding for that user)
An "I've stopped" or similar button; maybe the rider didn't crash, maybe he/she just broke down, it does happen and since you have some sort of ad-hoc network setup it should be easy to spread the news.

iOS: Get how fast user is moving

I want to figure out whether a user is not moving at all, walking, or running, using the iPhone. I'm not trying to implement a pedometer; I just want to know roughly whether someone is moving briskly, slowly, or not at all. I don't need mph or anything like that.
I think the accelerometer may be able to do this for me, but I was wondering if someone knows of any tutorials or example code that might be able to point me in the right direction?
Thanks to all that reply
The accelerometer won't do you any good here - it will only capture changes in velocity.
Just track the current location periodically and calculate the speed.
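A minimal sketch of that approach; the thresholds separating still/walking/running are made-up numbers to experiment with:

```swift
import CoreLocation

final class SpeedClassifier: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // speed is in m/s; negative means the fix couldn't provide one.
        guard let speed = locations.last?.speed, speed >= 0 else { return }
        switch speed {
        case ..<0.3: print("not moving")   // thresholds are assumptions
        case ..<2.0: print("walking")
        default:     print("running")
        }
    }
}
```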
There are no hard thresholds for walking vs. running motion, so you will have to experiment a bit. The AccelerometerGraph sample code should get you started on how to get and interpret accelerometer data.
The Accelerometer is good, but if the user has an iPhone 4 or iPad 2 you should use the gyroscope.
CMMotionManager and the Event Handling Guide - Motion Events
Apple Documentation is the best example you can get!
People have a different bounce in their step between walking and running, which can be measured with the accelerometer, but this differs between individuals (what shoes they are wearing, what surface they are on, where on the body the iPhone is carried, etc.), and the motion could probably be imitated by shaking the iPhone just right while standing still.
Experiment by recording the two types of acceleration profiles, and then use some sort of pattern matching to pick the most likely profile candidate from the current recorded acceleration data.
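A toy version of that matching step: summarize a short window of acceleration magnitudes by its variance and pick the nearest previously recorded profile. Every number below is a placeholder:

```swift
import Foundation

/// Pick the recorded profile whose variance is closest to the
/// variance of the current window of acceleration magnitudes.
func classify(window: [Double], profiles: [String: Double]) -> String? {
    guard !window.isEmpty else { return nil }
    let mean = window.reduce(0, +) / Double(window.count)
    let variance = window.map { ($0 - mean) * ($0 - mean) }
                         .reduce(0, +) / Double(window.count)
    return profiles.min {
        abs($0.value - variance) < abs($1.value - variance)
    }?.key
}

// Example per-user profiles (variance of |acceleration| in g², made up).
let profiles = ["still": 0.001, "walking": 0.05, "running": 0.25]
print(classify(window: [0.9, 1.2, 0.7, 1.4], profiles: profiles) ?? "?")
// -> "walking" for this sample window
```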

how would I use iphone motion detection for an egg shaking-like application?

I am hoping to build an application similar to those egg-shaker applications, to better understand how to detect motion on the iPhone. I've been looking at accelerometer methods and device motion methods, but can't seem to get what I want working.
The specifics of my need are as follows: I want to play one sound when the user shakes the phone away from them, and another when they shake it back towards them. The motion would be very similar to an egg shaker, with two different sounds played depending on whether they moved the device towards or away from their chest. It would also be good to measure the intensity with which they move it.
Any ideas?
I've searched Apple's sample code for a similar application, but there doesn't seem to be one.
Look at this game, which is open source and makes great use of the accelerometer. It's a good one for telling which direction you are going, but I haven't messed around much with intensity. I'm sure it's easy enough once you get into the details.
http://github.com/haqu/tweejump
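For the toward/away distinction itself, a rough Core Motion sketch; the axis choice and threshold are assumptions (with the screen facing the chest, +z points out of the screen toward the user):

```swift
import CoreMotion

let motionManager = CMMotionManager()
let shakeThreshold = 0.8  // in g; tune by experiment

motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdates(to: .main) { dm, _ in
    guard let u = dm?.userAcceleration else { return }
    // userAcceleration excludes gravity, so a resting phone reads ~0.
    // The sign of z picks the direction; its magnitude is the intensity.
    if u.z > shakeThreshold {
        print("shake toward chest, intensity \(u.z)")      // sound A
    } else if u.z < -shakeThreshold {
        print("shake away from chest, intensity \(-u.z)")  // sound B
    }
}
```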