Accelerometer reports acceleration while playing sound - iPhone

I am developing a game in which I want to move a UIImageView based on the accelerometer. When I rotate the iPhone from left to right or right to left, the UIImageView has to rotate by a corresponding angle. That part works, but the problem occurs when I play a background sound: because of that sound, the accelerometer reports spurious acceleration values even while the iPhone is idle, so my UIImageView moves when it should not. When I decrease the iPhone's volume it works very well. What can I do about this?
Also, does anyone know how to read acceleration only when the iPhone moves from left to right or right to left? It should not register tilting in the XZ or YZ plane.
If anybody knows the solution, please reply.

Have you applied any filtering to the input from the accelerometer? I would expect the noise the accelerometer picks up from the speaker to be vastly different in amplitude and frequency from the game-control motion.
There is a simple low-pass filter in Apple's AccelerometerGraph sample code.
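For illustration, here is a minimal Swift sketch of that kind of low-pass filter (the 0.1 factor mirrors the sample's kFilteringFactor; the use of CMMotionManager and the variable names are my own choices):

    import CoreMotion

    // Low-pass filter sketch: each new reading only nudges the filtered
    // value, so high-frequency speaker vibration is smoothed away while
    // slower hand motion still comes through.
    let kFilteringFactor = 0.1
    var filtered = (x: 0.0, y: 0.0, z: 0.0)

    let motionManager = CMMotionManager()
    motionManager.accelerometerUpdateInterval = 1.0 / 60.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        filtered.x = a.x * kFilteringFactor + filtered.x * (1.0 - kFilteringFactor)
        filtered.y = a.y * kFilteringFactor + filtered.y * (1.0 - kFilteringFactor)
        filtered.z = a.z * kFilteringFactor + filtered.z * (1.0 - kFilteringFactor)
        // Drive the UIImageView rotation from `filtered`, not the raw values.
    }

Raising the factor makes the filter more responsive but lets more noise through; lowering it does the opposite.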

Related

Adding up quaternions in a child-to-parent style

I have been struggling with some rotation maths for a feature in my project.
I am basically taking gyroscope input from a phone and combining it with touch input in order to recreate the same behaviour as the YouTube 360 video player (using Unity).
So, in other words, I'm trying to add the touch input (rotation on the X and Y axes only) to the gyroscope, which is free to rotate at all angles.
I tried building two quaternions, one for the gyro and one for the touch input. If I start the app and keep the phone pointed in the same direction, the two combine fine, but if I change the phone's orientation on the Y axis and then use the touch input, up and down becomes roll instead of yaw.
I tried changing the order of the quaternion multiplication, but it did not fix my issue.
After playing around with a minimal setup in Unity, I figured out that what I need to do is recreate the same relationship that a child and parent object have regarding rotation.
Sorry for the lack of captures and screenshots; I'm trying to find the best way to document the issue.
Thanks
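For what it's worth, the parent-child relation comes down to quaternion multiplication order; here is a minimal sketch in Swift with simd (the angles are arbitrary examples, and this only illustrates the ordering, not the poster's actual Unity code):

    import simd

    // "World = parent * local": composing the touch rotation after the
    // gyro attitude applies it in the gyro's local frame, the same way a
    // child object inherits its parent object's rotation.
    let gyroAttitude = simd_quatf(angle: .pi / 4, axis: SIMD3<Float>(0, 1, 0)) // example gyro yaw
    let touchPitch   = simd_quatf(angle: .pi / 8, axis: SIMD3<Float>(1, 0, 0)) // example touch drag

    let parentThenChild = gyroAttitude * touchPitch  // touch applied in the gyro's frame
    let childThenParent = touchPitch * gyroAttitude  // touch applied about world axes

With the second ordering the touch rotation happens about a fixed world axis, so once the phone has yawed away, a vertical drag shows up as roll; the first ordering keeps the drag axes aligned with the current view.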

Can the iPhone detect height position?

I have an idea for an iPhone game / app that needs to be able to track the height position of the iPhone. I am new to iPhone development, so I don't know how the accelerometer works. The idea is that the user places the iPhone on a flat surface (with the iPhone's back against the surface). The user then lowers and raises the surface periodically, and the iPhone should be able to track this movement. We can assume that the surface returns to its original position, so we only care about how much it was lowered / raised from that position during the movement.
The amount raised / lowered will be a few centimeters. Is this possible to track, and how would you go about solving it?
Thank you very much for your help!
Best regards,
Lukas
This is not possible to track directly. However, the accelerometer data can be used to approximate it. Acceleration is the time-derivative of velocity, which is the time-derivative of position. By integrating the acceleration twice, you can track position.
One caveat, though: this will probably not be very accurate, with significant drift errors.
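A minimal Swift sketch of that double integration (userAcceleration is gravity-free and reported in units of g; the update interval, the simple Euler integration, and the choice of the z-axis for a face-up phone are my assumptions):

    import CoreMotion

    // Integrate acceleration -> velocity -> position with simple Euler
    // steps. Expect visible drift within seconds, as the caveat says.
    let motionManager = CMMotionManager()
    var velocity = 0.0              // m/s along the device's z-axis
    var position = 0.0              // metres relative to the start
    var lastTimestamp: TimeInterval?

    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        if let last = lastTimestamp {
            let dt = motion.timestamp - last
            let az = motion.userAcceleration.z * 9.81   // g -> m/s^2
            velocity += az * dt        // first integration
            position += velocity * dt  // second integration
        }
        lastTimestamp = motion.timestamp
    }

Because bias errors accumulate quadratically in position, re-zeroing whenever the phone is known to be at rest (e.g. between raise/lower cycles) helps a lot.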
You can also track orientation with the magnetometer, and you can use the camera to "watch" the environment, which suggests the possibility of fixing the position by triangulation.
I don't expect that to be easy, though.

Is it possible to take a picture from my iPhone programmatically when there is movement in front of the camera?

I am trying to see whether it is possible to take a picture on iOS when there is any movement in front of the camera. For example, is it possible for an app to take a picture when somebody waves a hand or shows his/her face in front of the camera, or throws a ball in the air so that it crosses the camera's line of sight?
Check out Lucas-Kanade (LK) Optical Flow for the iPhone. It's an optical flow estimation algorithm, i.e. it estimates "the pattern of apparent motion of objects, surfaces and edges in a visual scene caused by the relative motion between an observer and the scene."
It might help you with the motion detection; then you would probably just have to respond to that event.
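If full optical flow is more than you need, plain frame differencing can serve as a first cut. A hedged Swift sketch (this delegate would hang off an AVCaptureVideoDataOutput configured for a biplanar YCbCr pixel format; the subsampling step and threshold are guesses to tune):

    import AVFoundation

    // Compares the luminance of successive frames; a large mean difference
    // is treated as motion, at which point you could trigger a still capture.
    final class MotionDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private var previousLuma: [UInt8]?
        var onMotion: (() -> Void)?

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
            defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

            // Plane 0 of a biplanar YCbCr buffer holds the luminance bytes.
            guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0) else { return }
            let count = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
                      * CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
            let bytes = UnsafeBufferPointer(start: base.assumingMemoryBound(to: UInt8.self),
                                            count: count)
            // Subsample every 16th byte to keep the comparison cheap.
            let luma = stride(from: 0, to: count, by: 16).map { bytes[$0] }

            if let prev = previousLuma, prev.count == luma.count {
                let totalDiff = zip(prev, luma).reduce(0) { $0 + abs(Int($1.0) - Int($1.1)) }
                if Double(totalDiff) / Double(luma.count) > 12 {  // assumed threshold
                    onMotion?()
                }
            }
            previousLuma = luma
        }
    }

This is far cruder than optical flow (it fires on lighting changes too), but it is often enough to decide "something moved" before handing off to the camera.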

How to get rotation around Y-axis relative to how the user holds the device?

I remember a WWDC talk that showed a teapot in OpenGL ES which rotated with the movement of the device, so that the teapot appeared to stand still in space.
When the app launched, the teapot started in a specific position. Then, when the device was rotated, the teapot rotated too, so as to keep appearing stationary in space.
In that talk, they mentioned that we must capture a "reference frame", e.g. upon app launch, which tells us how the user initially held the device.
For reference, these are the device's accelerometer axes (axis diagram omitted): X points right across the screen, Y points toward the top of the screen, and Z points out of the screen toward the viewer.
I want to know the rotation around the Y-axis, but relative to how the user holds the device.
So when the user holds the phone upright and rotates it around Y, I need to know that rotation value.
I think the key is removing gravity from the readings? I am targeting the iPhone 4 / 4S, which have gyros, but I think CoreMotion would sensor-fuse the readings automatically.
How could I figure out by how much the user has rotated the device around the Y-axis?
From your other question, Why is this CMDeviceMotionHandler never called by CoreMotion?, I know that you are working on iOS 4 - things have changed slightly in iOS 5. In general, gyro data - or, even better, the sensor fusion of accelerometer and gyro data done in DeviceMotion - is the best approach for getting proper results.
So once you have that up and running, you will need CMAttitude's multiplyByInverseOfAttitude method to express all CMDeviceMotion results relative to your reference frame. Just store the very first CMAttitude in a class member and call multiplyByInverseOfAttitude with it on all subsequent updates. Then all members of CMDeviceMotion.attitude will refer to this reference frame.
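A minimal Swift sketch of that bookkeeping (the update interval and variable names are mine; in the Objective-C API of that era the call is multiplyByInverseOfAttitude:):

    import CoreMotion

    let motionManager = CMMotionManager()
    var referenceAttitude: CMAttitude?

    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let attitude = motion.attitude
        if referenceAttitude == nil {
            // Capture the frame of reference on the very first callback.
            referenceAttitude = attitude.copy() as? CMAttitude
        }
        if let reference = referenceAttitude {
            // Mutates `attitude` in place, so roll/pitch/yaw are now
            // relative to how the user initially held the device.
            attitude.multiply(byInverseOf: reference)
        }
        print("rotation about Y relative to start: \(attitude.roll)")
    }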
For getting the rotation around the Y-axis, a first approach is to take the Euler angles, i.e. CMAttitude.roll. If you just need to track small motions, this might be fine. If motions are more extensive, you will run into trouble with gimbal lock. Then you need more advanced techniques, like quaternion operations, to get stable results, but that sounds like a question of its own.

How can I detect if an iPhone is rotating while lying face up on a table?

Is there a way to detect whether an iPhone lying face up on a table is rotating? I realize that this kind of movement is not reported by the accelerometer, and neither is it reported to the - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation method.
Is there a way to detect angle variations when the phone rotates this way? Thank you.
The gravity vector stays constant as the phone rotates on a flat table, so you won't see anything on the accelerometers. You could follow compass-heading changes to detect this rotation, but only on an iPhone 3GS. See CLLocationManager for details, and look at the heading methods.
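Something along these lines, sketched in Swift for illustration (the headingFilter sensitivity is an assumption):

    import CoreLocation

    // Watches compass-heading changes; for a phone flat on a table, a
    // change in heading is exactly the rotation we are after.
    final class HeadingWatcher: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()
        private var lastHeading: CLLocationDirection?

        override init() {
            super.init()
            manager.delegate = self
            manager.headingFilter = 2   // degrees of change before a callback
            manager.startUpdatingHeading()
        }

        func locationManager(_ manager: CLLocationManager,
                             didUpdateHeading newHeading: CLHeading) {
            if let last = lastHeading {
                print("rotated \(newHeading.magneticHeading - last) degrees")
            }
            lastHeading = newHeading.magneticHeading
        }
    }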
EDIT - With an iPhone 4 you can detect the rotation using the gyros: iOS 4 introduced a new class, CMMotionManager, for reading the rotation rate from the gyros.
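A quick Swift sketch of reading that rotation rate (for a phone flat on a table, the z-axis - perpendicular to the screen - is the rotation axis; the threshold is an arbitrary noise floor):

    import CoreMotion

    let motionManager = CMMotionManager()
    if motionManager.isGyroAvailable {
        motionManager.gyroUpdateInterval = 1.0 / 30.0
        motionManager.startGyroUpdates(to: .main) { data, _ in
            guard let rate = data?.rotationRate else { return }
            if abs(rate.z) > 0.1 {   // rad/s; ignore sensor noise
                print("rotating on the table at \(rate.z) rad/s")
            }
        }
    }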
When the phone is stationary, the magnitude of the acceleration vector should be 1 g. When the phone is rotating (assuming the sensor is off-center), the magnitude should be more than 1 and (hopefully) somewhat constant.
If you look at the decay of that curve, I wouldn't be surprised if its shape is distinctive enough to determine whether the phone is rotating or not.
(The graph referred to is from Apple's AccelerometerGraph sample app; screenshot omitted.)
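As a sketch, that magnitude test could look like this in Swift (the 5% margin is an assumed noise tolerance):

    import CoreMotion
    import Foundation

    // While the phone sits still, |a| is about 1 g; an off-center sensor
    // on a spinning phone adds centripetal acceleration, pushing it higher.
    let motionManager = CMMotionManager()
    motionManager.accelerometerUpdateInterval = 1.0 / 30.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        if magnitude > 1.05 {
            print("possible rotation, |a| = \(magnitude) g")
        }
    }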
I guess you could do it if the iPhone has a compass. Other than that, I don't think it will be possible or reliable.
This really depends on the location of the accelerometer on the device. I just tested this using the AccelerometerGraph sample application on a second-generation iPod touch, and you can see the initial acceleration on the x and y axes (so I guess that model does not have the accelerometer in the center of the device). In that sense you could detect the rotation; however, I think the challenge would be differentiating that acceleration from directional acceleration, and I'm sure the values would change if Apple placed the accelerometer in different locations on different models. There would definitely not be any way of doing it via shouldAutorotateToInterfaceOrientation. I recommend you load the AccelerometerGraph sample application from the SDK and experiment with the acceleration vectors to see if you can isolate a rotation vector reliably on multiple devices.