How to use accelerometer / Core Motion data to locate a point in 3D space? - iphone

I am creating an application in which an iPhone will be attached to a golf club (racket); a separate cover is made for it. I want to get an array of points that traces the path of the racket's movement.
For example, I start collecting data when the racket is on the ground. Then the user prepares for the shot: he takes the racket back and then hits the shot by swinging it forward. I want to capture all these points in 3D and plot them on screen (as a 2D projection). I have looked at many similar questions and at the accelerometer and Core Motion framework documents, but could not find a way to do this.
I hope I have explained the question properly. Can you suggest a formula, or a way to process the data, to achieve this?
Thanks in advance.

You cannot track these movements in 3D space: getting position from the accelerometer means double-integrating the acceleration, and sensor noise accumulates so quickly that the position estimate becomes useless within seconds.
But you can track the orientation of the racket, and that should work well.
I have implemented a sensor fusion algorithm for the Shimmer platform, and it is not a trivial task. I would use Core Motion and would not try to create my own sensor fusion algorithm.
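As a minimal sketch of the Core Motion route (identifier names below are my own, not from the question):

```swift
import CoreMotion

// Minimal sketch: Core Motion's built-in sensor fusion combines the
// gyroscope, accelerometer, and magnetometer into a single attitude
// estimate, so there is no need to write your own fusion filter.
let motionManager = CMMotionManager()

func startTrackingOrientation() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0  // 100 Hz

    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // Orientation of the device (and hence the racket) as Euler angles.
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```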
Hope this helps, good luck!

I tried the sensor fusion algorithm developed by Madgwick, but on my device its output is similar to Core Motion's attitude output.
I don't have the possibility to test the attitude output on other iPhones, but in my case the problem is the yaw angle: even when the iPhone is lying still on a table, the yaw angle tends to drift, probably due to the distinct chip placement of the z-axis gyro.
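One hedged idea worth testing for the yaw drift: request a magnetometer-referenced attitude frame so the compass can correct yaw over time. This assumes the device has a calibrated magnetometer; the names below are mine:

```swift
import CoreMotion

// Sketch: with a gyro-only reference frame, yaw has no absolute anchor
// and slowly drifts. Asking for a magnetic-north reference frame lets
// the magnetometer pull yaw back (assuming a calibrated magnetometer).
let manager = CMMotionManager()

func logYaw() {
    guard CMMotionManager.availableAttitudeReferenceFrames()
        .contains(.xMagneticNorthZVertical) else { return }
    manager.deviceMotionUpdateInterval = 0.1
    manager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                     to: .main) { motion, _ in
        guard let yaw = motion?.attitude.yaw else { return }
        print(String(format: "yaw: %.3f rad", yaw))  // watch for drift over time
    }
}
```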

Related

How to move the game object according to the movement of a real-world object in the webcam in Unity?

I want to develop a tangram game in Unity with the concept of augmented reality. I want to make tangram figures using real tangrams in front of a webcam, according to the tangram figure on the screen. For that, I want to place the game object with respect to the real tangram in the camera frame. I also want to change the position and angle accordingly. Please suggest a way to achieve this. Thanks in advance!
With difficulty.
If you want to do this without some sort of custom-built hardware controller on the real tangram, you will need some quite intricate image processing techniques. The following are some vague steps and pointers towards achieving what you want. If there is a better option I cannot think of it, but this is very conceptual and by no means guaranteed to work; it is just how I would attempt the task if I really had to.
Use a Laplacian operator on the image to calculate the edges (a sketch of this step follows the list).
Use this, along with the average colour information of the pixels to the left/right and above/below each "edge" pixel (within a certain tolerance), to detect the individual shapes, corners, and relative positions, starting from the centre of the image.
Calculate the relative sizes of each shape and approximate the rotation using basic trigonometry.
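Purely to illustrate the first (Laplacian) step, here is a minimal self-contained sketch; the function name and the flat-array image representation are hypothetical choices, written in Swift for consistency with the other snippets here, though the idea is language-agnostic:

```swift
// Hypothetical sketch: a 3x3 Laplacian convolution over a grayscale
// image stored as a flat, row-major [Float] array. Edge pixels get
// large absolute responses; flat regions stay near zero.
func laplacian(_ pixels: [Float], width: Int, height: Int) -> [Float] {
    // Standard 3x3 Laplacian kernel.
    let kernel: [Float] = [0,  1, 0,
                           1, -4, 1,
                           0,  1, 0]
    var out = [Float](repeating: 0, count: pixels.count)
    // Skip the one-pixel border so the kernel never reads out of bounds.
    for y in 1..<(height - 1) {
        for x in 1..<(width - 1) {
            var sum: Float = 0
            for ky in -1...1 {
                for kx in -1...1 {
                    let pixel = pixels[(y + ky) * width + (x + kx)]
                    let weight = kernel[(ky + 1) * 3 + (kx + 1)]
                    sum += pixel * weight
                }
            }
            out[y * width + x] = sum
        }
    }
    return out
}
```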
However, I can't help but feel this is an incredibly large amount of work for such a concept, and computing this for every pixel could be so intensive that it is truly not worth your time. Furthermore, it depends a lot on the quality of the camera used, and parallax errors would probably be nightmarish to resolve. Unless you are truly committed to this idea, I would either search for a pre-existing asset that does this for you or not undertake the project.

accelerometer uses - smartphone

It is known that raw accelerometer data is a combination of both linear acceleration and gravity. In order to isolate them, we need to apply appropriate filters. I would like to know the real-time applications where we would need only "gravity" or only "linear acceleration".
Gravity is used when you are trying to figure out the orientation of the phone; in other words, how the user holds the phone. It is good for tilt games, for example using the phone to steer a car, etc.
Linear acceleration is used when you are trying to figure out how the phone is shaken. It is good for shaking games.
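As a minimal sketch of that split, Core Motion exposes both components on CMDeviceMotion directly; the `tiltManager` name and the update rate below are my own choices:

```swift
import CoreMotion
import Foundation

// Sketch: Core Motion already separates the raw accelerometer signal.
// CMDeviceMotion.gravity is the filtered gravity vector (orientation /
// tilt), and .userAcceleration is what the user imparts (shakes).
let tiltManager = CMMotionManager()

func startSplitUpdates() {
    tiltManager.deviceMotionUpdateInterval = 1.0 / 60.0
    tiltManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        // Tilt game: use gravity to steer (roughly -1...1 as the phone tilts).
        let steering = m.gravity.x
        // Shake game: use linear (user) acceleration.
        let shake = m.userAcceleration
        let shakeMagnitude = sqrt(shake.x * shake.x +
                                  shake.y * shake.y +
                                  shake.z * shake.z)
        print("steering: \(steering), shake: \(shakeMagnitude)")
    }
}
```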
I highly recommend this video. In particular, between 4:15-6:10 and starting from 33:30 you can see demos.

iOS: Get how fast user is moving

I want to figure out whether a user is not moving at all, walking, or running, using the iPhone. I'm not trying to implement a pedometer; I just want to know roughly whether someone is moving briskly, slowly, or not at all. I don't need mph or anything like that.
I think the accelerometer may be able to do this for me, but I was wondering if someone knows of any tutorials or example code that might point me in the right direction?
Thanks to all that reply
The accelerometer won't do you any good here - it will only capture changes in velocity.
Just track the current location periodically and calculate the speed.
There are no hard thresholds for walking vs. running motion, so you will have to experiment a bit. The AccelerometerGraph sample code should get you started on how to get and interpret accelerometer data.
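A minimal sketch of the location-based approach; the speed thresholds are illustrative only and would need experimental tuning:

```swift
import CoreLocation

// Sketch: CLLocation reports an instantaneous speed in m/s, so you can
// bin it into still / walking / running with rough thresholds.
final class SpeedClassifier: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        // Requires NSLocationWhenInUseUsageDescription in Info.plist.
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // Negative speed means the value is invalid.
        guard let speed = locations.last?.speed, speed >= 0 else { return }
        // Illustrative thresholds only; tune them experimentally.
        switch speed {
        case ..<0.5: print("not moving")
        case ..<2.0: print("walking")
        default:     print("running")
        }
    }
}
```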
The Accelerometer is good, but if the user has an iPhone 4 or iPad 2 you should use the gyroscope.
CMMotionManager and the Event Handling Guide - Motion Events.
Apple's documentation is the best example you can get!
People have a different bounce in their step between walking and running, which can be measured with the accelerometer, but this differs between individuals (what shoes they are wearing, what surface they are on, what part of the body the iPhone is attached to, etc.), and this motion can probably be imitated by shaking the iPhone just right while standing still.
Experiment by recording the two types of acceleration profiles, and then use some sort of pattern matching to pick the most likely profile candidate from the current recorded acceleration data.
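As a rough sketch of that profile-matching idea, one simple feature to record and compare is the variance of the user-acceleration magnitude over a short sliding window; the sample rate, window size, and all names below are assumptions:

```swift
import CoreMotion
import Foundation

// Sketch: running produces a much larger variance in acceleration
// magnitude than walking. Record windows for each activity, then
// compare live variance against those recorded profiles.
let profileManager = CMMotionManager()
var window: [Double] = []

func startProfiling() {
    profileManager.deviceMotionUpdateInterval = 1.0 / 50.0
    profileManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        window.append(sqrt(a.x * a.x + a.y * a.y + a.z * a.z))
        if window.count >= 100 {                // ~2 s at 50 Hz
            let mean = window.reduce(0, +) / Double(window.count)
            let variance = window.map { ($0 - mean) * ($0 - mean) }
                                 .reduce(0, +) / Double(window.count)
            // Compare against variances recorded for each activity.
            print("window variance: \(variance)")
            window.removeAll()
        }
    }
}
```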

Detect gesture accelerometer

I'm creating an iPhone application.
My application needs to detect movements via the accelerometer and report when the user performs a known movement.
In practice, I have to constantly check whether the coordinates received from the accelerometer match the coordinates of a movement I have saved.
My problem is that implementing this is not simple; I know there are many factors that create difficulties.
Does anyone know of a tutorial or guide to propose?
I also welcome suggestions...
Thank you very much
You can use this paper as a starting point.
It shows detection of a user walking, running, walking up or down stairs, or standing still. Even though it's based on Android, the principle will be the same.

How to detect height of iPhone (for use in augmented reality game)?

I'm working on locating an iPhone device in 3D space.
I can use lat/long to detect physical location, I can use the magnetometer to figure out the direction they're facing, and I might be able to use the accelerometer to figure out how their device is oriented, but I can't figure out a way to get height of the device off the floor.
Specifically, I need to know if the user is squatting down or raising their hand toward the ceiling (a difference of about 2 meters / 6 feet).
I posted a more detailed description of what I'm trying to do on my blog: http://pushplay.net/blog_detail.php?id=36
I would love any suggestions as to how to even fake this sort of info. I really want the sort of interactivity and movement that would require ducking and bobbing, versus just letting someone sit back and angle the phone -- kind of the way people can "cheat" playing with a Wii...
The closest I could see you getting to what you're looking for is using the accelerometer/magnetometer as an inertial tracker. You'd have to calibrate the user's initial position on startup to a "base" position, then continuously sample the sensors on a background thread to build a movement model. This post talks about boosting the default sample rate of the accelerometer functions so that you can get a pretty fine-grained picture of the user's movements.
I'm not sure this will resolve your concern about people simply angling the device to produce the desired action, but you will have to strike a balance between being too strict in interpreting movements and allowing for differences in movement.
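A minimal sketch of that sampling setup (the 100 Hz rate is my assumption, not a figure from the linked post), pushing updates onto a background OperationQueue:

```swift
import CoreMotion

// Sketch: sample raw accelerometer data at a high rate on a background
// queue instead of the main thread, and feed it into a movement model.
let inertialManager = CMMotionManager()
let sampleQueue = OperationQueue()

func startInertialSampling() {
    guard inertialManager.isAccelerometerAvailable else { return }
    inertialManager.accelerometerUpdateInterval = 1.0 / 100.0
    inertialManager.startAccelerometerUpdates(to: sampleQueue) { data, _ in
        guard let a = data?.acceleration else { return }
        // Feed samples into your movement model here, e.g. to detect the
        // sharp vertical transient of a squat or a raised arm.
        print("z: \(a.z)")
    }
}
```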
The CoreLocation stuff gives you elevation as well as lat/long, so you could potentially use that, although there are some significant problems with this:
Won't work well indoors (not a problem for Sat Nav, is a problem for games)
Your users would have to "calibrate" (probably by placing the phone on the floor) each location they use!
In fact, you'd need to start keeping a list of "previously calibrated locations"... which could vary hugely just in one house (eg multiple rooms and floors). Could get in the way of the game.
Can't be used on moving transport (trains, planes, automobiles... even walking) because the elevation changes so frequently.
Therefore I'd have thought that using the accelerometer as a proxy for height is a substantially more preferable route than determining absolute elevation.
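For completeness, a minimal sketch of reading CoreLocation's elevation; `verticalAccuracy` is commonly several meters, which is exactly why the calibration issues above bite for a ~2 m squat-vs-reach difference:

```swift
import CoreLocation

// Sketch: CLLocation carries altitude (meters above sea level) plus a
// verticalAccuracy estimate; a non-positive accuracy means the altitude
// reading is invalid.
func report(_ location: CLLocation) {
    guard location.verticalAccuracy > 0 else { return }
    print("altitude: \(location.altitude) m " +
          "(+/- \(location.verticalAccuracy) m)")
}
```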
I am not intimately familiar with the iPhone, but this might require a hardware add-on (which you probably don't want to do). After thinking on this, the only way I know of is through light, or more specifically a laser: you shoot a laser at the floor and record the time it takes to come back. It's actually not a lot of work to put this hardware together, and I am sure the iPhone has connections for peripherals. Unless someone can trump me, I say there is no way to do that with an image.