Calculate distance using UIAccelerometer in Xcode - iPhone

I am implementing an app which measures how far the device has moved. For example, if my device fell from my table to the ground, I would like to calculate that distance. Kindly help me do this. Let me know if my question is not clear.
Thanks in advance.

Your question is very clear: you want to compute the double integral of the acceleration, which is theoretically possible by supposing the speed is zero at some point in time, but I really doubt you could get something precise enough to make any sense (as in many integral computations).
This isn't done in practice because the error is too big. Done in hardware (with continuous integration of the acceleration) it could be a little more precise, but probably not enough to compute a distance in any acceptable sense of the word "accuracy".
If you want to try it yourself, here's a document describing the approach in more detail: http://perso-etis.ensea.fr/~pierandr/cours/M1_SIC/AN3397.pdf
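In discrete form, the double integration reduces to two running sums. A minimal one-axis sketch, assuming the speed is zero at t = 0, a fixed sample interval dt, and a buffer a[] of acceleration samples (all hypothetical names; in practice sensor noise makes the position drift within seconds):

// naive double integration for one axis; drift grows fast
double v = 0.0, x = 0.0;
for (int i = 0; i < sampleCount; i++) {
    v += a[i] * dt;   // first integral: velocity
    x += v * dt;      // second integral: position
}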

Related

Maximum Likelihood, Matlab

I'm writing code that performs maximum likelihood estimation (MLE). At each step, I evaluate the gradient at one point and then move along it to another point. But I have a problem determining the magnitude of the move. How do I determine the best magnitude for good convergence? Can you give me advice on how to avoid other pitfalls, such as the presence of several maxima?
Regarding the presence of several maxima: this issue occurs when the function is not convex. It can be partially addressed by multi-start optimization, which essentially means running the optimization multiple times in order to find as many maxima as possible and then selecting the 'highest' one. Note that this does not guarantee global optimality, as the global optimum might be hard to reach (i.e. the local optima have a larger domain of attraction).
Regarding the optimal step size for convergence: you might want to look at backtracking line search. A short explanation of it can be found in the answer to this question
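For concreteness, here is a rough one-parameter sketch of backtracking line search for gradient ascent (in C for illustration, since no code was posted; logLik, theta and the constants are placeholders to adapt):

// Armijo backtracking: shrink the step until the gain is at least
// a c-fraction of the first-order prediction t * grad^2
double chooseStep(double (*logLik)(double), double theta, double grad) {
    double t = 1.0;      // initial step size
    double beta = 0.5;   // shrink factor
    double c = 1e-4;     // sufficient-increase constant
    double f0 = logLik(theta);
    while (logLik(theta + t * grad) < f0 + c * t * grad * grad) {
        t *= beta;
    }
    return t;
}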
We might be able to give you more specific help if you could give us some code to look at, as jkalden already pointed out.

Getting displacement from accelerometer data with Core Motion

I am developing an augmented reality application that (at the moment) wants to display a simple cube on top of a surface, and be able to move in space (both rotating and displacing) to look at the cube in all the different angles. The problem of calibrating the camera doesn't apply here since I ask the user to place the iPhone on the surface he wants to place the cube on and then press a button to reset the attitude.
Finding the camera rotation is very simple with the gyroscope and Core Motion. I do it this way:
if (referenceAttitude != nil) {
    // express the current attitude relative to the stored reference
    [attitude multiplyByInverseOfAttitude:referenceAttitude];
}
CMRotationMatrix mat = attitude.rotationMatrix;
// load the 3x3 rotation into a 4x4 column-major OpenGL matrix
GLfloat rotMat[] = {
    mat.m11, mat.m21, mat.m31, 0,
    mat.m12, mat.m22, mat.m32, 0,
    mat.m13, mat.m23, mat.m33, 0,
    0,       0,       0,       1
};
glMultMatrixf(rotMat);
This works really well.
More problems arise, however, when I try to find the displacement in space during an acceleration.
The Apple teapot example with Core Motion just adds the x, y and z values of the acceleration vector to the position vector. This (apart from not making much sense) has the result of returning the object to its original position after an acceleration, since the acceleration goes from positive to negative or vice versa.
They did it like this:
translation.x += userAcceleration.x;
translation.y += userAcceleration.y;
translation.z += userAcceleration.z;
What should I do to find the displacement from the acceleration at some instant (with a known time difference)? Looking at some other answers, it seems like I have to integrate twice to get velocity from acceleration and then position from velocity. But there is no example in code whatsoever, and I don't think that is really necessary. Also, there is the problem that when the iPhone is lying still on a flat surface, the accelerometer values are not null (there is some noise, I think). How much should I filter those values? Am I supposed to filter them at all?
Cool, there are people out there struggling with the same problem, so it is worth spending some time on :-)
I agree with westsider's statement, as I spent a few weeks experimenting with different approaches and ended up with poor results. I am sure that there won't be an acceptable solution for either larger distances or slow motions lasting for more than 1 or 2 seconds. If you can live with some restrictions like small distances (< 10 cm) and a given minimum velocity for your motions, then I believe there might be a chance to find a solution - no guarantee at all. If so, it will take a pretty hard stretch of research and a lot of frustration, but if you get it, it will be very, very cool :-) Maybe you will find these hints useful:
First of all, to make things easy, just look at one axis, e.g. x, but consider both left (-x) and right (+x) so you have a representative situation.
Yes, you are right: you have to integrate twice to get the position as a function of time. And for further processing you should store the first integration's result (== velocity), because you will need it at a later stage for optimisation. Do it very carefully, because every tiny bug will lead to huge errors after a short period of time.
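A minimal per-tick sketch of the two stages for one axis (trapezoidal for brevity; the next point argues for a higher-order rule instead), where a is the current sample and aPrev, vPrev, xPrev persist between ticks, with dt the sampling interval:

double v = vPrev + 0.5 * (aPrev + a) * dt;  // first integration: velocity
double x = xPrev + 0.5 * (vPrev + v) * dt;  // second integration: position
aPrev = a; vPrev = v; xPrev = x;            // keep velocity for the later optimisation stage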
Always bear in mind that even a very small error (e.g. < 0.1%) will grow rapidly after integrating twice. The situation becomes even worse after one second if you configure the accelerometer at, say, 50 Hz: 50 ticks are processed and the tiny, seemingly negligible error will outrun the "true" value. I would strongly recommend not relying on the trapezoidal rule, but using at least Simpson's rule or a higher-degree Newton-Cotes formula.
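For reference, a composite Simpson's rule over a buffer of equally spaced samples could look like this (a sketch, not tuned for streaming use; n is the number of intervals and must be even):

// integral of samples[0..n] with spacing dt, composite Simpson's rule
static double simpson(const double *samples, int n, double dt) {
    double sum = samples[0] + samples[n];
    for (int i = 1; i < n; i++) {
        sum += (i % 2 == 1 ? 4.0 : 2.0) * samples[i];
    }
    return sum * dt / 3.0;
}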
If you manage this, you will have to keep an eye on setting up the right low-pass filtering. I cannot give a general value, but as a rule of thumb, experimenting with filter factors between 0.2 and 0.8 is a good starting point. The right value depends on your use case: for instance, what kind of game it is, how fast you need to react to events, and so on.
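The usual first attempt is a simple exponential low-pass filter, where alpha is the filter factor mentioned above (rawX and filteredX are hypothetical; filteredX persists between ticks):

// exponential low-pass; try alpha between 0.2 and 0.8
filteredX = alpha * rawX + (1.0 - alpha) * filteredX;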
Now you will have a solution which works pretty well under certain circumstances and within a short period of time. But then, after a few seconds, you will run into trouble because your object drifts away. Now you will enter the difficult part of the solution, which I eventually failed to handle within the given time scope :-(
One promising approach is to introduce something I call "synthetic forces" or "virtual forces". This is a strategy for reacting to several bad situations that trigger the object to drift away although the device remains motionless in your hands. The most troubling one is a velocity greater than 0 without any acceleration. This is an unavoidable result of error propagation and can be handled by slowing down artificially, that is, introducing a virtual deceleration even if there is no real counterpart. A very simplified example:
// damp a lingering velocity when no significant x acceleration
// has been seen for 0.3 seconds (now = current timestamp in seconds)
if (vX > 0 && (now - lastAccelerationXTimeStamp) > 0.3) {
    vX *= 0.9;
}
You will need a combination of such conditions to tame the beast. A lot of trial and error is required to get a feeling for the right approach, and this will be the hard part of the problem.
If you ever manage to crack it, pleeeease let me know, I am very curious to see whether it is possible in general or not :-)
Cheers Kay
When the iPhone 4 was very new, I spent many, many hours trying to get an accurate displacement using the accelerometers and gyroscope. There shouldn't have been much concern about incremental drift, as the device only needed to move a couple of meters at most and the data collection typically ran for a few minutes at most. We tried all sorts of approaches and even had help from several Apple engineers. Ultimately, it seemed that the gyroscope wasn't up to the task. It was good for 3D orientation but that was it ... again, according to very knowledgeable engineers.
I would love to hear someone contradict this - because the app never really turned out as we had hoped, etc.
I am also trying to get displacement on the iPhone. Instead of using integration, I used the basic physics formula d = 0.5 * a * t^2, assuming an initial velocity of 0 (it doesn't sound like you can assume an initial velocity of 0). So far it seems to work quite well.
My problem is that I'm using deviceMotion and the values are not correct. deviceMotion.gravity reads near 0. Any ideas? - OK, fixed: apparently deviceMotion.gravity has x, y, and z values. If you don't specify which one you want, you get back x (which should be near 0).
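To make the fix explicit: gravity and userAcceleration on CMDeviceMotion are CMAcceleration structs, each with x, y and z components in units of g, so you have to pick a component. A sketch, assuming motionManager is your running CMMotionManager:

#import <CoreMotion/CoreMotion.h>

CMDeviceMotion *motion = motionManager.deviceMotion;
CMAcceleration gravity = motion.gravity;
CMAcceleration user = motion.userAcceleration;
// with the phone lying flat and face up, gravity.z is near -1 while
// gravity.x and gravity.y are near 0, so reading only "x" looks like zero
NSLog(@"gravity: %f %f %f", gravity.x, gravity.y, gravity.z);
NSLog(@"user acceleration z: %f", user.z);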
Finding this question two years later: I just found an AR project in the iOS 6 docset named pARk. It provides approximate displacement capture and calculation using the gyroscope, i.e. CoreMotion.framework.
I'm just starting to learn the code.
To be continued...

Maximum packing of rectangles in a circle

I work at a nanotech lab where I do silicon wafer dicing. (The wafer saw cuts only parallel lines.) We are, of course, trying to maximize the yield of the die we cut. All of the die will be of equal size, either rectangular or square, and the die are all cut from a circular wafer. Essentially, I am trying to pack the maximum number of rectangles into a circle.
I have only a pretty basic understanding of MATLAB and an intermediate understanding of calculus. Is there any (relatively) simple way to do this, or am I in over my head?
Go from here, and good luck:
http://en.wikipedia.org/wiki/Knapsack_problem
and get here:
http://www-sop.inria.fr/mascotte/WorkshopScheduling/2Dpacking.pdf
At least you'll have some idea of what you are tackling here.
I was fascinated to read your question because I did a project on this for my training as a mathematics teacher. I'm also quite pleased to learn that it's thought to be an NP-hard problem, because my project was leading me to the same conclusion.
By using basic calculus, I calculated the first few 'generations' of rectangles of maximum size, but it gets complex quite quickly.
You can read my project here:
Beckett, R. Parcels of Pi: A curve-packing problem. Bath Spa MEC. 2009.
Pages 1 - 15
Pages 16 - 30
I hope that some of my findings are useful to you or at least interesting. I thought that the application of this idea would most likely be in computer nano technology.
Kind regards.
Packing arbitrary rectangles into a circle to meet a space efficiency objective is a non-convex (NP-Hard) optimization in general. This means there will be no elegant or simple solution that will solve this problem optimally. The solution methods are all going to depend on any specific domain knowledge you can use to prune the search tree or develop heuristics. If you have no experience in this type of problem you should probably consult with an expert.
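That said, the parallel-cut constraint in the question forces the dies onto a rigid grid, which suggests a cheap baseline before reaching for general packing machinery: brute-force the grid offset and count the dies that fit. A sketch with made-up dimensions (since the circle is convex, a die fits iff all four corners are inside, so checking the farthest corner suffices):

#include <math.h>
#include <stdio.h>

// count w x h dies fully inside a circle of radius r when the cut
// grid is shifted by (dx, dy)
static int countDies(double r, double w, double h, double dx, double dy) {
    int count = 0;
    int nx = (int)ceil(2.0 * r / w), ny = (int)ceil(2.0 * r / h);
    for (int i = -nx; i <= nx; i++) {
        for (int j = -ny; j <= ny; j++) {
            double x0 = dx + i * w, x1 = x0 + w;
            double y0 = dy + j * h, y1 = y0 + h;
            double cx = fmax(fabs(x0), fabs(x1));  // farthest corner, x
            double cy = fmax(fabs(y0), fabs(y1));  // farthest corner, y
            if (cx * cx + cy * cy <= r * r) count++;
        }
    }
    return count;
}

int main(void) {
    double r = 50.0, w = 7.0, h = 5.0;  // made-up dimensions, mm
    int best = 0;
    for (double dx = 0; dx < w; dx += 0.1) {  // scan grid offsets
        for (double dy = 0; dy < h; dy += 0.1) {
            int c = countDies(r, w, h, dx, dy);
            if (c > best) best = c;
        }
    }
    printf("best die count: %d\n", best);
    return 0;
}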
Doesn't this resemble Gauss's circle problem? See
http://mathworld.wolfram.com/GausssCircleProblem.html
Or, this can be seen as a "packing problem":
http://en.wikipedia.org/wiki/Packing_problem#Squares_in_circle

Calculating route length

I have a map with about 80 annotations. I would like to do 3 things.
1) From my current location, I would like to know the actual route distance to that position, not the linear distance.
2) I want to be able to show a list of all the annotations, but for every annotation (having lon/lat) I would like to know the actual route distance from my position to that position.
3) I would like to know the closest annotation to my position using route distance, not linear distance.
I think the answer to all these three points will be the same. But please keep in mind that I don't want to create a route, I just want to know the distance to the annotation.
I hope someone can help me.
Best regards,
Paul Peelen
From what I understand of your post, I believe you seek the Haversine formula. Luckily for you, there are a number of Objective-C implementations, though writing your own is trivial once the formula's in front of you.
I originally deleted this because I didn't notice that you didn't want linear distance at first, but I'm bringing it back in case you decide that an approximation is good enough at that particular point of the user interaction.
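For reference, a minimal Objective-C sketch of the haversine formula (CLLocation's distanceFromLocation: will give you essentially the same great-circle distance):

#import <CoreLocation/CoreLocation.h>

// great-circle distance in meters between two coordinates
static CLLocationDistance haversine(CLLocationCoordinate2D a,
                                    CLLocationCoordinate2D b) {
    double R = 6371000.0;  // mean Earth radius, meters
    double dLat = (b.latitude - a.latitude) * M_PI / 180.0;
    double dLon = (b.longitude - a.longitude) * M_PI / 180.0;
    double lat1 = a.latitude * M_PI / 180.0;
    double lat2 = b.latitude * M_PI / 180.0;
    double h = sin(dLat / 2) * sin(dLat / 2)
             + cos(lat1) * cos(lat2) * sin(dLon / 2) * sin(dLon / 2);
    return 2.0 * R * atan2(sqrt(h), sqrt(1.0 - h));
}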
I think, as pointed out before, your query would be extremely heavy for the Google Maps API if you perform exactly what you are describing. Do you need all that information at once? Maybe it would be good enough to first query just some of the distances, based on some heuristic or on the user's needs.
To obtain the distances, you could use a Google Maps GDirections object, as pointed out here (at the bottom of the page there's a "Routes and Steps" section with an advanced example):
"The GDirections object also supports multi-point directions, which can be constructed using the GDirections.loadFromWaypoints() method. This method takes an array of textual input addresses or textual lat/lon points. Each separate waypoint is computed as a separate route and returned in a separate GRoute object, each of which contains a series of GStep objects."
Using the Google Maps API on the iPhone shouldn't be too difficult, and I think your question doesn't cover that, but if you need a basic example, you could look at this question and scroll to the answer.
Good Luck!
Calculating route distance to about 80 locations is certain to be computationally intensive on Google's part and I can't imagine that you would be able to make those requests to the Google Maps API, were it possible to do so on a mobile device, without being severely limited by either the phone connection or rate limits on the server.
Unfortunately, calculating route distance rather than geometric distance is a very expensive computation involving a lot of data about the area - data you almost certainly don't have. This means, unfortunately, that this isn't something that Core Location or MapKit can help you with.
What problem are you trying to solve, exactly? There may be heuristics other than route distance you can use to approximate some sort of distance ranking.

iPhone CoreLocation: How to get the most accurate speed

I'm experimenting with adding the GPS functionality to my iPhone app. It's a workout app that will be used while walking or running. So what I want to use GPS for is to show the speed that the person is moving in Mph and minute/mile.
How should I configure the CLLocationManager so I get the best possible results? What should I set desiredAccuracy and distanceFilter?
I've tried with:
distanceFilter = 10 and desiredAccuracy = kCLLocationAccuracyNearestTenMeters
and reading
CLLocation.speed property
Testing while driving around in my car the accuracy seems good compared to the car speedometer although it takes a while to update. I realize that the update delay may very well be the time it takes to query the GPS location, but I'm not sure if changing the above two parameters would give better results.
Should I use kCLLocationAccuracyBest and some other value for distanceFilter?
I'm interested to hear from others using CoreLocation to get speed. What are you doing to get more accurate results?
For best results, you should use kCLLocationAccuracyBest. What you put into your distance filter depends on which faults you're willing to put up with. Basically, you're going to have to make decisions based on accuracy vs. availability. That is, during periods when a best-accuracy answer is not available, what will you display?
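A minimal setup sketch (the 10 m filter value is just a starting point to tune):

#import <CoreLocation/CoreLocation.h>

CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;  // an object adopting CLLocationManagerDelegate
manager.desiredAccuracy = kCLLocationAccuracyBest;
manager.distanceFilter = 10.0;  // meters of movement between callbacks
[manager startUpdatingLocation];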
One approach is to let the phone deliver less-accurate answers and, using a projection of what was happening the last time you had best-accuracy information, see if what you have makes sense.
That is, suppose I'm jogging at 6 mph to the north. You plot me along point-A, point-B, point-C... then you get a low-accuracy answer (maybe kCLLocationAccuracyHundredMeters). Look at the spot where it says I am and figure out: "could I have gotten to that spot from point-C if I'd continued along my current path, making reasonable adjustments for possible changes in speed?" If so, then the new point is within the realm of possibility. (If not, then toss it out.) Then project from point-C at my last-known speed and figure out where you think I probably am, ballistically. Save that as ballistic-point-D.
Of course, you're using the accelerometer to get some sort of inertial sense of which way I went, right? So, you can't know direction (you don't know what way the phone is pointing), but you can make a reasonable stab at distance.
Using all this information, plot the most likely spot where you think I probably am.
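A sketch of that plausibility gate (isPlausibleFix and maxSpeed are hypothetical names; the slack term just allows for the reported uncertainty of both fixes):

#import <CoreLocation/CoreLocation.h>

// accept a new fix only if the runner could plausibly have reached it
// from the last good fix at or below an assumed top speed, in m/s
BOOL isPlausibleFix(CLLocation *lastGood, CLLocation *candidate, double maxSpeed) {
    NSTimeInterval dt = [candidate.timestamp timeIntervalSinceDate:lastGood.timestamp];
    if (dt <= 0) return NO;
    CLLocationDistance moved = [candidate distanceFromLocation:lastGood];
    double slack = lastGood.horizontalAccuracy + candidate.horizontalAccuracy;
    return moved <= maxSpeed * dt + slack;
}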
NOTE: When testing, don't just drive in areas with good cell coverage. See how your app performs out in the hills, away from cell towers. A lot of people like to bike and jog in those areas!
Disclaimer: I've only played with CoreLocation a bit, I've not tested the accuracy very closely.
I'd expect that you'd get the most accurate results by using the defaults for distanceFilter and desiredAccuracy. Less-frequent updates are only going to give you less data to work with.
One issue you're likely to run into is when the location fix is lost for a while and then comes back. The naive connect-the-dots approach to figuring out distance traveled will tend to underestimate the runner's actual speed. Rather than using CLLocation.speed, you might get better results calculating speed based on some heuristic approximation of the line the runner is actually following.
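As a starting point, you can compute and smooth the speed yourself from consecutive fixes. A sketch using the newer delegate callback, where previous and smoothedSpeed are assumed properties on the delegate and the 0.7/0.3 weights are guesses to tune:

- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray<CLLocation *> *)locations {
    CLLocation *latest = locations.lastObject;
    if (self.previous != nil) {
        NSTimeInterval dt = [latest.timestamp timeIntervalSinceDate:self.previous.timestamp];
        if (dt > 0) {
            // raw speed over the last segment, meters per second
            double raw = [latest distanceFromLocation:self.previous] / dt;
            // exponential smoothing to damp GPS jitter
            self.smoothedSpeed = 0.7 * self.smoothedSpeed + 0.3 * raw;
        }
    }
    self.previous = latest;
}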