What are the basic concepts for implementing anti-shock and anti-shake algorithms? - iPhone

I have some animations that are triggered by fine acceleration detection. But when the user sits in a car or is walking, this may get annoying.
Basically, all that stuff has to be disabled automatically as soon as there is too much vibration or shaking. Conceptually, I think it's very hard to filter those vibrations out, since the "vibration phase" changes permanently. I would define "unwanted vibration or shocks" as acceleration values that change very fast over a large interval of values, or as a permanently changing accumulated value that does not exceed a specified threshold range within a specified minimum period of time.
I am looking for "proven" concepts before I start reinventing the wheel for a couple of days.

I don't have any concrete answers for you, but you might want to Google band-pass filters or anti-aliasing filters for some ideas on how to approach this. Basically, if you can identify the frequency range of accelerations that you want to consider real, you can filter out frequencies that fall outside this range.

Before you spend too much time on premature optimization, I think you should implement a low-pass filter and see if that does the job. Most iPhone apps effectively use a variation of an LPF to get rid of unwanted accelerometer noise.
You could also go the other way and use a high-pass filter. Once a certain power level passes through the HPF, stop processing data.
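As an illustration only, here is a minimal sketch of that kind of filtering using the old UIAccelerometer delegate API; the smoothing factor, the shake threshold and the pauseAnimations/resumeAnimations helpers are made-up placeholders, not values from either answer:

// Split the raw signal into a slow (low-pass) and a fast (high-pass) component.
#define kFilteringFactor 0.1   // LPF smoothing factor, needs tuning
#define kShakeThreshold  0.5   // high-frequency magnitude (in g) treated as "shaking"

static double lowPassX = 0.0;

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // Low-pass: keeps the slowly changing component (gravity, deliberate tilt).
    lowPassX = (acceleration.x * kFilteringFactor) + (lowPassX * (1.0 - kFilteringFactor));

    // High-pass: the remainder is the fast vibration / shake component.
    double highPassX = acceleration.x - lowPassX;

    if (fabs(highPassX) > kShakeThreshold) {
        [self pauseAnimations];    // too much vibration: disable the fine-acceleration animations
    } else {
        [self resumeAnimations];
    }
}

In practice you would look at all three axes (or the vector magnitude) and require the threshold to be exceeded for a minimum duration before toggling, so a single bump does not disable everything.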

Related

Does OptaPlanner have a "built-in" way to perform multi-unit score normalization?

At the moment, my problem has four metrics. Each of these measures something entirely different (each has different units, a different range, etc.) and each is weighted externally. I am using Drools for scoring.
I have only one score level (SimpleLongScore) and I have to find a way to appropriately combine the individual scores of these metrics into one long value.
The most significant problem at the moment is that the range of values for the metrics can be wildly different.
So if, for example, a move improves the score of a metric with a small possible range by, say, 10%, that could be completely dwarfed by an alternative move which improves a metric with a larger range by only 1%, because (to my knowledge) OptaPlanner only considers the actual score value rather than the possible range of values and how changes affect them proportionally.
So, is there a way to handle this cleanly which is already part of OptaPlanner that I cannot find?
Is the only feasible solution to implement Pareto scoring? Because that seems like a hack-y nightmare.
So far I have code/math to compute the best-possible and worst-possible scores for a metric, which I access from within Drools, and then I can compute where in that range a move puts us. But this also feels quite hack-y and will cause issues with incremental scoring if we want to scale non-linearly within that range.
I keep coming back to thinking I should just bite the bullet and implement Pareto scoring.
Thanks!
Take a look at @ConstraintConfiguration and @ConstraintWeight in the docs.
Also take a look at the chapter "explaining the score", which can tell you exactly which constraint had which score impact on the best solution found.
If, however, you need Pareto optimization, that is, multiple best solutions that don't dominate each other, then know that OptaPlanner doesn't support that yet, but I know of 2 cases that implemented it in OptaPlanner by hacking BestSolutionRecaller.
That being said, 99% of the cases that think of Pareto optimization are 100% happy with @ConstraintWeight instead, because users don't want multiple best solutions (except during simulations); they just want one in production.

AnyLogic: How to set pedestrian comfortable speed beyond the range

I have a problem with the Pedestrian Library. I would like to use the Pedestrian Library to simulate the behavior of a vehicle; therefore, I would like to set the comfortable speed of the agent to, say, 70 km/h. However, there is an error saying the speed has to be within [0 mps, 10 mps]. Is there any way to set whatever speed I want?
Thank you,
Jiannan
First, why don't you use the Road Traffic Library? But you must have your reasons.
You can't change the maximum speed value, but what you can do (not a very elegant solution, but a solution nonetheless) is change the time scale of your model. There is no direct option for that, so you basically have to create your own clock that runs your way.
So for instance, if you want your car to move at 20 mps, you choose 10 mps as the comfortable speed and you use the transformation customTime(time()) = 0.5*time(), where customTime is your own function.
So every time 1 second passes in the simulation, only 0.5 seconds will pass in your custom clock... You just have to take care of the statistics you are collecting so that they fit your time scale.
Also, you will have to change everything in your model according to your clock, because AnyLogic will continue showing you things in normal seconds.
Hope that helps :)

Processing accelerometer data

I would like to know if there are any libraries/algorithms/techniques that help extract the user context (walking/standing) from accelerometer data (from any smartphone).
For example, I would collect accelerometer data every 5 seconds for a definite period of time and then identify the user context (ex. for the first 5 minutes, the user was walking, then the user was standing for a minute, and then he continued walking for another 3 minutes).
Thank you very much in advance :)
Check out the new activity recognition APIs:
http://developer.android.com/google/play-services/location.html
It's still a research topic; please look at this paper, which discusses the algorithm:
http://www.enggjournals.com/ijcse/doc/IJCSE12-04-05-266.pdf
I don't know of any such library.
It is a very time-consuming task to write such a library. Basically, you would build a database of the "user contexts" that you wish to recognize.
Then you collect data and compare it to those in the database. As for how to compare, see Store orientation to an array - and compare; the same holds for the accelerometer.
Walking/running data is analogous to heart-rate data in a lot of ways. In terms of filtering the noise and getting smooth peaks, look into noise filtering and peak detection algorithms. The following is used to obtain heart-rate information for heart patients; it should be a good starting point: http://www.docstoc.com/docs/22491202/Pan-Tompkins-algorithm-algorithm-to-detect-QRS-complex-in-ECG
Think about how you want to filter out the noise and detect peaks; the filters will obviously depend on the raw data you gather, but it's good to have a general idea of what kind of filtering you'd want to do on your data. Then think about what needs to be done once you have filtered data. In your case, think about how you would design an algorithm to find out when the data indicates activity (like walking or running) and when it shows the user being stationary. This is a fairly challenging problem to solve once you consider the dynamics of the device itself (how it's positioned when the user is walking/running) and the fact that there are very few (if any) benchmarked algorithms that do this with raw smartphone data.
Start with determining the appropriate algorithms, and then tackle the complexities (mentioned above) one by one.
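To make the "filter, then decide" idea concrete, here is a rough sketch that classifies one window of accelerometer magnitudes as moving or stationary using the variance of the signal. The threshold is a made-up value that would have to be calibrated against labelled data, and a real classifier would use more features (dominant frequency, peak rate, etc.):

// Classify one window of accelerometer magnitudes (in g) as moving or stationary.
// kVarianceThreshold is an illustrative value only; calibrate it on real data.
BOOL isMoving(const double *magnitudes, int windowSize) {
    const double kVarianceThreshold = 0.02;

    double mean = 0.0;
    for (int i = 0; i < windowSize; i++) {
        mean += magnitudes[i];
    }
    mean /= windowSize;

    double variance = 0.0;
    for (int i = 0; i < windowSize; i++) {
        double d = magnitudes[i] - mean;
        variance += d * d;
    }
    variance /= windowSize;

    // A phone on a standing user hovers around 1 g with almost no variance;
    // walking produces periodic peaks and therefore a much higher variance.
    return variance > kVarianceThreshold;
}

Running this over successive 5-second windows gives you a coarse walking/standing timeline like the one described in the question.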

Getting displacement from accelerometer data with Core Motion

I am developing an augmented reality application that (at the moment) wants to display a simple cube on top of a surface, and to be able to move in space (both rotating and displacing) to look at the cube from all different angles. The problem of calibrating the camera doesn't apply here, since I ask the user to place the iPhone on the surface he wants to place the cube on and then press a button to reset the attitude.
Finding the camera rotation is very simple with the gyroscope and Core Motion. I do it this way:
if (referenceAttitude != nil) {
    [attitude multiplyByInverseOfAttitude:referenceAttitude];
}
CMRotationMatrix mat = attitude.rotationMatrix;
GLfloat rotMat[] = {
    mat.m11, mat.m21, mat.m31, 0,
    mat.m12, mat.m22, mat.m32, 0,
    mat.m13, mat.m23, mat.m33, 0,
    0,       0,       0,       1
};
glMultMatrixf(rotMat);
This works really well.
More problems arise, however, when I try to find the displacement in space during an acceleration.
The Apple Teapot example with Core Motion just adds the x, y and z values of the acceleration vector to the position vector. This (apart from not making much sense) has the result of returning the object to its original position after an acceleration, since the acceleration goes from positive to negative or vice versa.
They did it like this:
translation.x += userAcceleration.x;
translation.y += userAcceleration.y;
translation.z += userAcceleration.z;
What should I do to find the displacement from the acceleration at some instant (with a known time difference)? Looking at some other answers, it seems I have to integrate twice to get velocity from acceleration and then position from velocity. But there is no example in code whatsoever, and I don't think that is really necessary. Also, there is the problem that when the iPhone is lying still on a surface, the accelerometer values are not zero (there is some noise, I think). How much should I filter those values? Am I supposed to filter them at all?
Cool, there are people out there struggling with the same problem, so it is worth spending some time on it :-)
I agree with westsider's statement, as I spent a few weeks experimenting with different approaches and ended up with poor results. I am sure that there won't be an acceptable solution for either larger distances or slow motions lasting more than 1 or 2 seconds. If you can live with some restrictions like small distances (< 10 cm) and a given minimum velocity for your motions, then I believe there might be a chance to find a solution - no guarantee at all. If so, it will take you a pretty hard time of research and a lot of frustration, but if you get it, it will be very, very cool :-) Maybe you will find these hints useful:
First of all, to make things easy, just look at one axis, e.g. x, but consider both left (-x) and right (+x) to have a representative situation.
Yes, you are right, you have to integrate twice to get the position as a function of time. And for further processing you should store the result of the first integration (== velocity), because you will need it in a later stage for optimisation. Do it very carefully, because every tiny bug will lead to huge errors after a short period of time.
Always bear in mind that even a very small error (e.g. < 0.1%) will grow rapidly after integrating twice. The situation becomes even worse after one second if you configure the accelerometer at, say, 50 Hz: 50 ticks are processed and the tiny, seemingly negligible error will outrun the "true" value. I would strongly recommend not relying on the trapezoidal rule but using at least Simpson's rule or a higher-degree Newton-Cotes formula.
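As a minimal sketch of one such integration step (Simpson's 1/3 rule over three consecutive samples), assuming a fixed 50 Hz sample interval; the variable names are made up for illustration:

// One integration step: three consecutive x-axis acceleration samples a0, a1, a2
// (in m/s^2), taken dt seconds apart, update velocity and then displacement.
static double vX   = 0.0;   // keep the velocity, it is needed later for the "virtual forces"
static double posX = 0.0;

void integrateStep(double a0, double a1, double a2, double dt) {
    // Simpson's 1/3 rule over the span 2*dt (needs three equally spaced samples).
    double vPrev = vX;
    vX += (dt / 3.0) * (a0 + 4.0 * a1 + a2);

    // For the second integration you should ideally apply the same rule to three
    // stored velocity samples; averaging as below re-introduces a small trapezoidal error.
    posX += 0.5 * (vPrev + vX) * (2.0 * dt);
}

At 50 Hz, dt would be 1.0 / 50.0, and you would call this once for every pair of new samples.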
Once you have managed this, you will have to keep an eye on setting up the right low-pass filtering. I cannot give a general value, but as a rule of thumb, experimenting with filtering factors between 0.2 and 0.8 will be a good starting point. The right value depends on your use case: for instance, what kind of game it is, how fast you need to react to events, ...
Now you will have a solution which works pretty well under certain circumstances and within a short period of time. But then, after a few seconds, you will run into trouble because your object is drifting away. Now you will enter the difficult part of the solution, which I eventually failed to handle within the given time scope :-(
One promising approach is to introduce something I call "synthetic forces" or "virtual forces". This is a strategy to react to several bad situations that cause the object to drift away although the device remains fixed (I am not a native speaker; I mean without moving) in your hands. The most troubling one is a velocity greater than 0 without any acceleration. This is an unavoidable result of error propagation and can be handled by slowing down artificially, that is, by introducing a virtual deceleration even if there is no real counterpart. A very simplified example:
if (vX > 0 && (now - lastAccelerationXTimestamp) > 0.3) {
    // No significant x acceleration for 0.3 s, yet velocity is still positive:
    // apply an artificial deceleration to damp the drift.
    vX *= 0.9;
}
You will need a combination of such conditions to tame the beast. A lot of trial and error is required to get a feeling for the right way to go, and this will be the hard part of the problem.
If you ever manage to crack the code, pleeeease let me know, I am very curious to see whether it is possible in general or not :-)
Cheers Kay
When the iPhone 4 was very new, I spent many, many hours trying to get an accurate displacement using the accelerometers and gyroscope. There shouldn't have been much concern about incremental drift, as the device only needed to move a couple of meters at most and the data collection typically ran for a few minutes at most. We tried all sorts of approaches and even had help from several Apple engineers. Ultimately, it seemed that the gyroscope wasn't up to the task. It was good for 3D orientation but that was it ... again, according to very knowledgeable engineers.
I would love to hear someone contradict this - because the app never really turned out as we had hoped, etc.
I am also trying to get displacement on the iPhone. Instead of using integration, I used the basic physics formula d = 0.5*a*t^2, assuming an initial velocity of 0 (it doesn't sound like you can assume an initial velocity of 0, though). So far it seems to work quite well.
My problem is that I'm using deviceMotion and the values are not correct. deviceMotion.gravity reads near 0. Any ideas? - OK, fixed: apparently deviceMotion.gravity has x, y, and z values. If you don't specify which one you want, you get back x (which should be near 0).
Finding this question two years later, I just found an AR project in the iOS 6 docset named pARk. It provides an approximate displacement capture and calculation using the gyroscope, i.e. CoreMotion.framework.
I'm just starting to learn the code.
To be continued...

iPhone CoreLocation: How to get the most accurate speed

I'm experimenting with adding GPS functionality to my iPhone app. It's a workout app that will be used while walking or running, so what I want to use GPS for is to show the speed that the person is moving at, in mph and minutes/mile.
How should I configure the CLLocationManager so I get the best possible results? What should I set desiredAccuracy and distanceFilter to?
I've tried with distanceFilter = 10 and desiredAccuracy = kCLLocationAccuracyNearestTenMeters, and reading the CLLocation.speed property.
Testing while driving around in my car, the accuracy seems good compared to the car's speedometer, although it takes a while to update. I realize that the update delay may very well be the time it takes to query the GPS location, but I'm not sure if changing the above two parameters would give better results.
Should I use kCLLocationAccuracyBest and some other value for distanceFilter?
I'm interested to hear from others using CoreLocation to get speed. What are you doing to get more accurate results?
For best results, you should use kCLLocationAccuracyBest. What you put into your distance filter depends on which faults you're willing to put up with. Basically, you're going to have to make decisions based on accuracy vs. availability. That is, during periods when a best-accuracy answer is not available, what will you display?
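A minimal setup along those lines might look like the sketch below; the 10 m distance filter, the 20 m accuracy cut-off and the showSpeed:/showEstimatedSpeed helpers are assumptions for illustration, not part of this answer:

// Ask for the best accuracy the hardware will give us.
CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;
manager.desiredAccuracy = kCLLocationAccuracyBest;
manager.distanceFilter = 10.0;   // metres between delegate callbacks; tune to taste
[manager startUpdatingLocation];

// In the delegate, read the reported speed but decide what to show when the fix is poor.
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    // A negative speed means "not available"; a large horizontalAccuracy means a poor fix.
    if (newLocation.speed >= 0 && newLocation.horizontalAccuracy <= 20.0) {
        double mph = newLocation.speed * 2.23694;   // m/s -> mph
        [self showSpeed:mph];
    } else {
        // Low-accuracy period: keep showing a projection of the last good value.
        [self showEstimatedSpeed];
    }
}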
One approach is to let the phone deliver less-accurate answers and, using a projection of what was happening the last time you had best-accuracy information, see if what you have makes sense.
That is, suppose I'm jogging at 6 mph to the north. You plot me along point-A, point-B, point-C... then you get a low-accuracy answer (maybe kCLLocationAccuracyHundredMeters). Look at the spot where it says I am and figure out: "could I have gotten to that spot from point-C if I'd continued along my current path, making reasonable adjustments for possible changes in speed?" If so, then the new point is within the realm of possibility. (If not, then toss it out.) Then project from point-C at my last-known speed and figure out where you think I probably am, ballistically. Save that as ballistic-point-D.
Of course, you're using the accelerometer to get some sort of inertial sense of which way I went, right? So, you can't know direction (you don't know what way the phone is pointing), but you can make a reasonable stab at distance.
Using all this information, plot the most likely spot where you think I probably am.
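A rough sketch of that plausibility test might look like this; the maximum believable speed is something you would pick yourself (a fast run, for this app):

// Reject a new, low-accuracy fix if the runner could not plausibly have reached
// it from the last trusted fix at a believable speed.
BOOL isPlausible(CLLocation *lastGood, CLLocation *candidate, double maxSpeedMps) {
    NSTimeInterval elapsed = [candidate.timestamp timeIntervalSinceDate:lastGood.timestamp];
    if (elapsed <= 0) {
        return NO;
    }
    CLLocationDistance moved = [candidate distanceFromLocation:lastGood];
    // Allow for the reported error radius of both fixes.
    double slack = lastGood.horizontalAccuracy + candidate.horizontalAccuracy;
    return moved <= maxSpeedMps * elapsed + slack;
}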
NOTE: When testing, don't just drive in areas with good cell coverage. See how your app performs out in the hills, away from cell towers. A lot of people like to bike and jog in those areas!
Disclaimer: I've only played with CoreLocation a bit, I've not tested the accuracy very closely.
I'd expect that you'd get the most accurate results by using the defaults for distanceFilter and desiredAccuracy. Less-frequent updates are only going to give you less data to work with.
One issue you're likely to run into is when the location fix is lost for a while, then comes back. The naive, connect-the-dots approach to figuring out distance traveled is going to tend to under-estimate the actual speed of the runner. Rather than using CLLocation.speed, you might get better results calculating speed based on some heuristic approximation to the line the runner is actually following.
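As an illustration of that idea, the sketch below averages the speed over the last few accepted fixes instead of trusting the instantaneous CLLocation.speed value; the choice of window and the function name are assumptions, not something from this answer:

// Average speed over a short window of accepted fixes (CLLocation objects, oldest first).
double averageSpeedMps(NSArray *recentFixes) {
    if (recentFixes.count < 2) {
        return 0.0;
    }
    double distance = 0.0;
    for (NSUInteger i = 1; i < recentFixes.count; i++) {
        CLLocation *prev = recentFixes[i - 1];
        CLLocation *cur  = recentFixes[i];
        distance += [cur distanceFromLocation:prev];   // metres along the plotted path
    }
    CLLocation *first = recentFixes[0];
    CLLocation *last  = [recentFixes lastObject];
    NSTimeInterval elapsed = [last.timestamp timeIntervalSinceDate:first.timestamp];
    return (elapsed > 0) ? distance / elapsed : 0.0;   // metres per second
}

Smoothing or snapping the fixes to a plausible path before summing the distances would further reduce the under-estimation caused by a lost-and-regained fix.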