Hi, I want to get the acceleration of my iPhone, but I don't want the acceleration values to change when the iPhone is tilted.
I think the answer is userAcceleration,
but I don't know how to get the userAcceleration values.
I know that I have to use Core Motion and CMDeviceMotion, but I don't know how to initialize and set it up.
I know this question was asked a while ago, but I'm hoping I can provide some interesting perspective if you're still interested.
userAcceleration will provide processed (not raw) data, derived by fusing the accelerometer and gyroscope readings. You can get the raw acceleration data from CMMotionManager's accelerometerData property.
Unfortunately, the purpose of the accelerometer on an iOS device is to measure movement and orientation along three axes: X, Y, and Z. On its own the sensor doesn't differentiate between "tilting" and "movement" - they're one and the same to it. I don't know what purpose you have for separating the two, but that's what's laid out in the Core Motion framework for us.
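To answer the setup question directly, here is a minimal sketch of initializing CMMotionManager and reading userAcceleration (gravity already filtered out) through device-motion updates. The 1/60 s interval, the variable name, and the choice of the main queue are my own, not anything prescribed by the framework:

    #import <CoreMotion/CoreMotion.h>

    // Keep a strong reference to the manager (e.g. in a property);
    // updates stop if it is deallocated.
    CMMotionManager *motionManager = [[CMMotionManager alloc] init];
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (error) { return; }
        // userAcceleration is the device's acceleration minus gravity, in g,
        // so simply tilting the phone leaves these values near zero.
        CMAcceleration a = motion.userAcceleration;
        NSLog(@"user acceleration: x=%.3f y=%.3f z=%.3f", a.x, a.y, a.z);
    }];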
Related
I just want to know: when we call startGyroUpdates on CMMotionManager and set some updateInterval, say 1.0/60.0, is there a delegate method we have to implement to receive the gyro updates? If not, where/how do we get them?
Also, is there a useful code snippet to detect a change in the device's position, i.e. whether the device has moved up or down from some reference point?
Documentation says:
startGyroUpdates
Starts gyroscope updates without a handler.
- (void)startGyroUpdates
Discussion
You can get the latest gyroscope data through the gyroData property. You must call stopGyroUpdates when you no longer want your application to process gyroscope updates.
Availability
Available in iOS 4.0 and later.
See Also
– startGyroUpdatesToQueue:withHandler:
Declared In
CMMotionManager.h
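So there is no delegate: with startGyroUpdates you poll the gyroData property yourself (for example from a timer or display link), or you use startGyroUpdatesToQueue:withHandler: and receive each sample in a block. A minimal sketch of the handler variant (the manager variable and queue choice are mine):

    CMMotionManager *manager = [[CMMotionManager alloc] init];
    manager.gyroUpdateInterval = 1.0 / 60.0;

    [manager startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                         withHandler:^(CMGyroData *gyroData, NSError *error) {
        if (error) { return; }
        // rotationRate is in radians per second around each axis.
        CMRotationRate r = gyroData.rotationRate;
        NSLog(@"rotation rate: x=%.3f y=%.3f z=%.3f", r.x, r.y, r.z);
    }];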
Adding to xs2bush's correct answer: See the documentation links in Simple iPhone motion detect for more information.
Regarding the second point (movement from some reference point): definitely not. At the moment there is no way at all to determine displacement with acceptable precision. There are several questions and discussions about this, like
Getting displacement from accelerometer data with Core Motion or
Measuring time the vehicle takes to accelerate in iPhone (I don't believe the 3% ;-)
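To see why, consider the naive approach of double-integrating userAcceleration. A sketch of one integration step (my own helper, not anything from Core Motion) shows the problem: any constant sensor bias b turns into a position error of 0.5 * b * t^2, so the estimate diverges within seconds:

    // One Euler integration step; velocity and position are arrays of 3 doubles.
    // Assumes #import <CoreMotion/CoreMotion.h> for CMAcceleration.
    static void IntegrateStep(CMAcceleration a, double dt,
                              double velocity[3], double position[3]) {
        const double kGravity = 9.81;  // userAcceleration is in g; convert to m/s^2
        double accel[3] = { a.x * kGravity, a.y * kGravity, a.z * kGravity };
        for (int i = 0; i < 3; i++) {
            velocity[i] += accel[i] * dt;   // bias accumulates linearly here...
            position[i] += velocity[i] * dt; // ...and quadratically here
        }
    }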
I want to do something in an app so that if you throw the iPhone into the air, or you are airborne with it, some pattern changes.
I thought the accelerometer value would be 0 while airborne, but lately I've been having some doubts.
Or, if there isn't a fixed accelerometer value when the iPhone is airborne, what are the accelerometer values when the home button is at the bottom of the iPhone (like when you would normally hold it)?
Write a Logger App. Run it. Throw your phone into the air. Read logs when you catch it.
This is the link to the code that will show you the accelerometer values.
Just run it on your iPhone/iPod, throw the device in the air, and you will get the X, Y, and Z values.
From there you can perform the task you need.
EDIT: My guess: the raw accelerometer values won't be exactly 0, because you will rarely get a throw without spin, and the device's rotation in the air shows up in the readings (the sensor sits away from the centre of mass, so spinning adds a centripetal component). The spin might even influence the userAcceleration values, i.e. the acceleration delivered by Core Motion's fusion algorithm.
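A quick way to test this: in free fall the raw accelerometer magnitude drops toward 0 g (the sensor measures proper acceleration, which vanishes while falling), while at rest in portrait with the home button at the bottom it reads roughly (0, -1, 0) in g. A sketch of an airborne check; the 0.2 g threshold is an assumption you would tune empirically:

    CMMotionManager *manager = [[CMMotionManager alloc] init];
    manager.accelerometerUpdateInterval = 1.0 / 60.0;

    [manager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                  withHandler:^(CMAccelerometerData *data, NSError *error) {
        if (error) { return; }
        CMAcceleration a = data.acceleration;  // raw reading, in g
        double magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        if (magnitude < 0.2) {                 // near 0 g: probably airborne
            NSLog(@"Device appears to be in free fall");
        }
    }];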
What acceleration characteristics should be used to differentiate an intentional shake-to-do-something gesture from other typical random or unintentional device motions?
Shaking can be detected by the OS itself. There's no need to do this yourself.
More info in this StackOverflow question:
How to use Shake API in iPhone SDK 3.0?
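For reference, the built-in route is the responder chain. A minimal sketch in a UIViewController (the log message is just a placeholder for your own reaction):

    // The view controller must be first responder to receive motion events
    // (e.g. call becomeFirstResponder in viewDidAppear:).
    - (BOOL)canBecomeFirstResponder {
        return YES;
    }

    - (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
        if (motion == UIEventSubtypeMotionShake) {
            NSLog(@"Shake detected by the OS");  // react to the shake here
        }
    }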
Yeah, I'm currently wondering about this.
In my use case the devices will be 50 cm to 10 m apart and I'd like it to be accurate to at least 10 cm. (Therefore GPS is not an option.)
Two ways spring to mind:
Sound: I asked about this in the dev forums, and I'm in contact with laanlabs about the code of their sonar ruler.
Picture on one device + camera on the other: seems easier to set up, since my use case involves the user facing one device at 90 degrees anyway. But it would be more work for the user to point the camera in the right direction, and it would not react to a change in distance.
Now the question: is anyone aware of any code that does something like this already? Possibly a general non-iPhone C project?
Method with the camera: we already know the size of each device. You take a picture of the device, calculate its height/width ratio to determine the type of device (iPhone/iPod or iPad), then calculate the distance.
For example, if the device is an iPhone you know that its size is 115x58 mm. In the picture it is NxM pixels. Now you can calculate the distance (the smaller N and M are, the larger the distance).
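Concretely, this is the pinhole camera model: distance = real size * focal length / apparent size. A small sketch of the arithmetic; the focal length in pixels is an assumption that must come from calibrating the capturing camera, not a value any API in this thread provides:

    // Hypothetical distance estimate from apparent size (pinhole camera model).
    static double DistanceToDevice(double realHeightMM,
                                   double imageHeightPixels,
                                   double focalLengthPixels) {
        // Similar triangles: distance = realSize * focalLength / apparentSize.
        return realHeightMM * focalLengthPixels / imageHeightPixels;
    }

    // Example: an iPhone (115 mm tall) spanning 230 px with a 2000 px focal
    // length sits at roughly 115 * 2000 / 230 = 1000 mm, i.e. about 1 m away.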
If you were to use the sound method, one approach would be to have device A emit a sound; device B would be listening for this at all times and, on detection, echo back a secondary sound. This gives you a round-trip time from which you can calculate the distance; don't forget to compensate for the latency between detection and re-emission as well.
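In numbers: distance = speed of sound * (round trip - processing latency) / 2. A tiny helper to make the arithmetic concrete; the latency is something you would have to measure on the actual devices, not a given:

    static double DistanceFromRoundTrip(double roundTripSeconds,
                                        double processingLatencySeconds) {
        const double kSpeedOfSound = 343.0;  // m/s in air at roughly 20 degrees C
        double oneWaySeconds = (roundTripSeconds - processingLatencySeconds) / 2.0;
        return kSpeedOfSound * oneWaySeconds;
    }

    // Example: a 0.05 s round trip with 0.02 s of handling latency gives
    // 343 * (0.05 - 0.02) / 2, which is about 5.1 m.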
I'm not sure about this, but it's what I found in one of the answers to this previous SO question: How to measure distance between two iphone devices using bluetooth?
Using Bluetooth for localization is a very well-known research field. The short answer is: you can't. Signal strength isn't a good indicator of the distance between two connected Bluetooth devices, because it is too subject to environmental conditions (is there a person between the devices? How is the owner holding his/her device? Is there a wall? Are there any RF-reflecting surfaces?). Using Bluetooth you can at best obtain a distance resolution of a few meters, but you can't calculate the direction, not even roughly.
You may obtain better results by using multiple Bluetooth devices and triangulating the various signal strengths, but even then it's hard to be more accurate than a few meters in your estimates.
Anyone know if it is yet possible to detect the touch shape? Maybe through getting the raw touchscreen data?
I found this question/answer here: How to get raw touchscreen data?
That mentions GSEvent, but it is quite old.
I'd like to try to get a rough calculation of the pressure of the touch by its shape/area, but of course UITouch only gives a calculated point.
Yes, raw touch data is contained in the GSEventRecord object; in particular, what you are looking for is the pathMajorRadius property on GSPathInfo, which gives the major radius of the tap. This is a rough estimate of the pressure, but take into account that big/small fingers also give different measures.
Watch out for the pathPressure property, also in GSPathInfo: it does NOT contain the pressure. It always contains 1; capacitive screens (like the iPad's or iPhone's) do not measure pressure at all.
If you are planning on submitting your app to the App Store, you won't be able to if you include access to private frameworks (in this case, GSEvent.h in the GraphicsServices framework). What you can do instead is catch every UIEvent in the sendEvent: method of your subclassed UIApplication, then use the methods in
https://github.com/kennytm/iphone-private-frameworks/blob/master/GraphicsServices/GSEvent.h
to get the information of the GSEvent.
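A minimal sketch of that interception point; the subclass name is mine, and you also have to pass it to UIApplicationMain in main.m. Bridging the UIEvent to a GSEvent still uses private API, so this remains App Store-unsafe either way:

    @interface InterceptingApplication : UIApplication
    @end

    @implementation InterceptingApplication
    - (void)sendEvent:(UIEvent *)event {
        // Inspect the event here (e.g. bridge it to GSEvent via the private
        // header) before handing it on; always forward, or touch handling breaks.
        [super sendEvent:event];
    }
    @end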