What algorithm can I use to count steps using Accelerometer and Gyroscope?

I need to count steps using data from the accelerometer and gyroscope of a cellphone.
What algorithm can I use given these data? My accelerometer reports the device's acceleration, excluding gravity, e.g. (x: 0.3, y: -1.2, z: 0.2), and my gyroscope reports the device's rotation rate, e.g. (x: 0.0, y: -0.1, z: -0.1).
I can sample these data every millisecond.
I tried this approach, but it didn't work well: https://programmerworld.co/android/how-to-create-walking-step-counter-app-using-accelerometer-sensor-and-shared-preference-in-android/
Pedometer plugins don't work for me, because I also need to measure the duration of each step.
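A common approach (not from the linked tutorial, just a minimal sketch in plain Python) is peak detection on the smoothed acceleration magnitude, recording a timestamp per detected step so the per-step duration falls out as the difference between consecutive timestamps. All thresholds here are illustrative and need tuning against real data:

```python
import math

def count_steps(samples, dt=0.001, threshold=1.0, min_step_interval=0.25):
    """samples: list of (x, y, z) linear-acceleration tuples, one per dt seconds.
    Returns a list of step timestamps (seconds); step durations are the
    differences between consecutive timestamps."""
    # 1. Magnitude removes dependence on phone orientation.
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    # 2. Simple moving-average smoothing (~50 ms window) suppresses jitter.
    w = max(1, int(0.05 / dt))
    smooth = [sum(mags[max(0, i - w + 1):i + 1]) / len(mags[max(0, i - w + 1):i + 1])
              for i in range(len(mags))]
    # 3. Peak detection with a refractory period so one stride isn't counted twice.
    steps = []
    for i in range(1, len(smooth) - 1):
        t = i * dt
        is_peak = smooth[i] > threshold and smooth[i - 1] < smooth[i] >= smooth[i + 1]
        if is_peak and (not steps or t - steps[-1] >= min_step_interval):
            steps.append(t)
    return steps
```

The gyroscope isn't used here; it mainly helps to reject false positives (e.g. the phone being rotated in the hand rather than carried while walking).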

Related

Continuously rotate a graphic at a rate that matches compass rotation data

I'm building an iOS application that needs to smoothly rotate an image based on incoming compass heading data (actually, GPS ground track data arriving at 10 Hz from an aviation GPS system). I can easily animate rotation to the new desired angle with the correct duration (0.1 seconds) but the effect is still rough - you can tell that it's actually a series of discrete animations rather than one fluid motion. Here's the current very simple animation code:
// rotate the compass ring
if let cv = self.compassView {
    UIView.animate(withDuration: 0.1, animations: {
        cv.transform = CGAffineTransform(rotationAngle: CGFloat(newAngle))
    })
}
Does anyone have a good working algorithm for fluidly spinning things like this? Some clever means of creating a single animation (perhaps one that assumes a full 360) the speed of which is then modified as the rate of rotation changes? Apple's own compass app appears to do something similar.
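One algorithmic core of fluid rotation, independent of any UIKit animation API, is to low-pass filter the displayed heading toward the incoming target every display frame, always taking the shortest angular path across the 0°/360° wrap-around. A minimal sketch in plain Python (the smoothing factor is illustrative; in the iOS app this would run per frame, e.g. from a CADisplayLink):

```python
def shortest_angle_diff(current, target):
    """Signed difference target - current in degrees, wrapped into (-180, 180]."""
    d = (target - current) % 360.0
    return d - 360.0 if d > 180.0 else d

def smooth_heading(current, target, alpha=0.2):
    """Move a fraction alpha of the way toward target along the shortest arc.
    Calling this once per frame yields an exponential ease-out, so discrete
    10 Hz heading updates blend into one continuous motion."""
    return (current + alpha * shortest_angle_diff(current, target)) % 360.0
```

Because the correction is applied continuously rather than restarting a 0.1 s animation on every sample, the discrete-step look disappears.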

iPhone accelerometer, how to filter "fallback" values?

I'm playing with the accelerometer now. The idea is to move a cursor using the phone: I simply accumulate the deltas the accelerometer gives me every tick into the x and y position.
The problem is that the accelerometer sends spurious values when I stop moving the phone. For example, when I move the phone right, I get a sequence like:
1, 2, 5, 3, 5
Then I stop moving. The accelerometer still produces values, but in the opposite direction, like:
-3, -2, -2, -1
And then it generates zeroes.
How can I discard the values that move the cursor in the reverse direction when the phone isn't actually moving?
That's how accelerometers work...
Don't use these values directly.
Integrate (i.e. sum) them and you get a velocity.
Integrate the velocities and you get positions.
The result needs heavy filtering, though, because of drift, and because of the constant 9.8 m/s² of gravity (the accelerometer reports both "manual" acceleration and "earth" acceleration, and separating them is not easy).
EDIT:
Drift can be corrected by detecting a rest position and manually stopping the motion (low velocity -> force zero velocity).
Gravity can be removed by detecting the gravity direction at rest and then subtracting this vector from subsequent readings. This only works if the phone isn't rotated too much. Rotation can be corrected for using a gyroscope, but you have to be sure that the gyroscope's values aren't computed from the accelerometer, as they are on some phones.
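The two corrections above (gravity subtraction at rest, plus a zero-velocity update to kill drift) can be sketched together. This is a minimal illustration in plain Python with made-up thresholds, not a production filter:

```python
def process(readings, dt=0.01, rest_threshold=0.05, calib_samples=50):
    """readings: list of (ax, ay, az) raw accelerometer tuples.
    Estimates gravity from the first calib_samples readings (device assumed at
    rest), subtracts it, integrates to velocity, and forces velocity to zero
    whenever the residual acceleration is small (a zero-velocity update).
    Returns the velocity after each reading."""
    n = min(calib_samples, len(readings))
    gravity = tuple(sum(r[i] for r in readings[:n]) / n for i in range(3))
    vel = [0.0, 0.0, 0.0]
    velocities = []
    for r in readings:
        lin = [r[i] - gravity[i] for i in range(3)]    # remove gravity estimate
        for i in range(3):
            vel[i] += lin[i] * dt                      # integrate acceleration
        if all(abs(a) < rest_threshold for a in lin):  # device at rest?
            vel = [0.0, 0.0, 0.0]                      # zero-velocity update
        velocities.append(tuple(vel))
    return velocities
```

Note that the "parasite" negative values from the question are physically real deceleration; integrating to velocity (as above) makes them cancel the earlier positive values instead of moving the cursor backwards.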

Mapping outside quaternions to Unity

I'm trying to map rotations from a sensor into Unity using quaternions, and I cannot seem to figure out why rotations do not map correctly.
I'm using the Adafruit BNO055 to pull absolute orientation in the form of quaternions. The source for their quaternion implementation can be found here. From what I understand about quaternions, which is almost nothing, I should be able to pull a quaternion out of the sensor and pump it into any GameObject inside Unity so that they share the same orientation. If I had a loop set up that read the quaternion data from the sensor and pumped it into Unity, the GameObject should rotate exactly like the sensor in the physical world. Unfortunately, this is not happening.
An example of the data sent from the sensor to Unity:
w: 0.903564
x: 0.012207
y: 0.009094
z: -0.428223
Is the quaternion sent from the sensor not equal to the quaternions used in Unity? If not, how would I go about getting these mapped correctly?
Thanks in advance for any help!
When I have created a Quaternion from an external sensor sent as a comma-separated list, this works for me:
parts = SensorReceivedText.Split(',');
float x = Convert.ToSingle(parts[0]);
float y = Convert.ToSingle(parts[1]);
float z = Convert.ToSingle(parts[2]);
float w = Convert.ToSingle(parts[3]);
Quaternion rotation = new Quaternion(x, y, z, w);
Just as an example of different conversions (including Unity and OpenGL):
https://developers.google.com/project-tango/overview/coordinate-systems
I don't know your device and its coordinate notation, but you can recover the mapping by making some experiments with orientation.
The main problem with the conversion is that the conversion matrix may contain MIRRORING (a -1 matrix component), which can't be fixed just by rearranging the rotation axes.
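For instance, when the sensor frame is right-handed and Unity's frame is left-handed with the y and z axes swapped (an assumption about the mounting, not a confirmed BNO055 mapping), the standard conversion swaps and negates the vector part of the quaternion. A sketch in plain Python:

```python
def sensor_to_unity(x, y, z, w):
    """Convert a rotation quaternion between two frames that differ by swapping
    the y and z axes (a reflection, so handedness flips). The vector part is
    swapped and negated; w is unchanged. Which axes to swap and negate depends
    on how the sensor is mounted, so treat this as a template to experiment
    with, not a fixed answer."""
    return (-x, -z, -y, w)
```

A useful sanity check when experimenting: a correct frame conversion of this kind is its own inverse (applying it twice returns the original quaternion) and preserves the quaternion's norm.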

Sprite kit impulse at point

I'm trying to use two points on a spaceship to create tank-like movement, where the spaceship moves forward when both points get an impulse on the physicsBody (using the applyImpulse(CGVector, atPoint: CGPoint) method) and turns left/right when only one point gets an impulse.
I can move the spaceship forward using the applyImpulse(CGVector) method without the atPoint parameter and it works, but I have issues with the atPoint parameter.
When using the atPoint parameter the spaceship moves really erratically, even if I apply an impulse to both jet engine 1 and jet engine 2. I'm not sure if I apply the forces at the right points (the jet engine spots marked on the image); the documentation is very vague about this, and I don't know how to get the positions of the impulses right.
Does anyone know how to get the correct positions to apply the impulses to? They should be like the ones on the image. I'm using Swift, but I can read Objective-C, so that doesn't really matter.
EDIT:
I have tried the following points:
CGPoint(x: 0, y: 0) and CGPoint(x: 1, y: 0)
CGPoint(x: 0, y: 0) and CGPoint(x: spaceship.size.width, y: 0)
Reviewing the documentation, the points you apply the impulses at must be in scene coordinates. That means you need to determine the points in the coordinates of your sprite (probably something like CGPoint(x: 0, y: 1) and CGPoint(x: 0, y: -1), but you may need to play with those), then convert them to scene coordinates using convertPoint(point, fromNode: ship).
A way to do this that will (possibly) give you a more "realistic" drag-and-thrust result is to hang two engine bodies off your fighter jet, give them both drag properties, and apply thrust from them when needed.
That way you get the kind of rotation common to one engine being on while the other is off, etc.
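The physics underneath is easy to sanity-check off-line: an impulse J applied at offset r from the center of mass contributes J to linear momentum and the 2-D cross product r × J to angular momentum. A small sketch in plain Python (engine offsets are illustrative, not from the question's image):

```python
def apply_impulses(impulses):
    """impulses: list of ((jx, jy), (rx, ry)) pairs - an impulse vector and its
    application point relative to the body's center of mass. Returns the total
    linear impulse and the total angular impulse (2-D cross product r x J,
    positive = counter-clockwise spin)."""
    lin = [0.0, 0.0]
    ang = 0.0
    for (jx, jy), (rx, ry) in impulses:
        lin[0] += jx
        lin[1] += jy
        ang += rx * jy - ry * jx
    return tuple(lin), ang
```

Two equal forward impulses at mirrored engine offsets cancel in torque, giving pure forward motion; firing only one engine produces the same linear impulse plus a spin, which is exactly the tank-turn behavior the question is after.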

Distance moved by Accelerometer

I want to move objects on the iPhone screen (a rectangle, a circle, and so on) by moving the iPhone itself.
For example, I move the iPhone along the X-axis and the object moves along the X-axis; the same for the Y- and Z-axes.
How can I do this? Is there an algorithm for it?
Thank you.
P.S:
I looked around for a while, and it seems to be possible using the accelerometer.
You get position by integrating the linear acceleration twice, but the error is horrible; it is useless in practice.
Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
However, the gyro mouse might work for your application; see 37:00-38:25 in the video.
Similar questions:
track small movements of iphone with no GPS
What is the real world accuracy of phone accelerometers when used for positioning?
how to calculate phone's movement in the vertical direction from rest?
iOS: Movement Precision in 3D Space
How to use Accelerometer to measure distance for Android Application Development
How can I find distance traveled with a gyroscope and accelerometer?
Not easy to do accurately. An accelerometer only reports acceleration, not movement. You can try (numerically) integrating the acceleration over time, but you're likely to end up with cumulative errors adding up and leading to unintended motion.
Pseudocode, obviously untested (note that t must be updated each step, or dt will grow without bound):
init:
    loc = {0, 0, 0}   ; location of object on screen
    vel = {0, 0, 0}   ; velocity of object on screen
    t = 0             ; time of last measurement
step:
    t0 = time         ; get amount of time since last measurement
    dt = t0 - t
    t = t0            ; remember this measurement's time for the next step
    accel = readAccelerometer()
    vel += accel * dt ; update velocity
    loc += vel * dt   ; update position
    displayObjectAt(loc)
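The same double integration as runnable Python, with the pseudocode's hypothetical readAccelerometer()/displayObjectAt() hooks replaced by a plain list of samples (1-D for brevity):

```python
def integrate_position(accels, dt):
    """Euler-integrate a stream of 1-D acceleration samples into positions.
    accels: one acceleration reading per time step; dt: seconds between samples.
    Returns the position after each step. In practice sensor noise makes the
    integrated error grow quickly, which is why the answers above call this
    approach useless for real positioning."""
    vel = 0.0
    loc = 0.0
    positions = []
    for a in accels:
        vel += a * dt    # update velocity
        loc += vel * dt  # update position
        positions.append(loc)
    return positions
```

With a constant acceleration the result slightly overestimates the exact ½·a·t² because this simple Euler scheme applies each step's full velocity change before updating position.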
Why don't you use the acceleration itself instead of the distance moved? For example:
If your acceleration is 1.4 G east, your object could move east at 14 pixels per second until the acceleration changes.
That way the object moves as if it were under a proportional force, which I think is more realistic: the faster you move the device, the bigger the force on the object.
Also look:
http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf
and
Using accelerometer, gyroscope and compass to calculate device's movement in 3D world