Finding relative position using a 9-axis sensor (accelerometer)

What is the best way to find the position relative to a starting point using a 9-Axis Adafruit BNO055 Absolute Orientation Sensor?
Ideally I would be able to map this location in a coordinate grid of my environment.
Sidenote
I am able to monitor the power I am using in my thrusters (I am doing this for an underwater ROV). Would it be simpler to use this data to find my position than to use the sensor, or would a combination of both be best?

Related

Mapbox Unity SDK: storing and displaying short relative distances

How can I go about storing and displaying small but geographically accurate distances in the Mapbox Unity SDK?
I'm storing radii around markers on a map. I get each value in meters (from ~0.5 m to 10 m), and then, adaptively with the zoom level, I want to accurately display those meters in Unity world space (draw an ellipse) using these stored values. The problem is that, from my understanding, the Mapbox API only lets you convert lat/long to Unity world coordinates, and I'm running into precision errors. I can get adequate precision when using the CheapRuler class and meters, but as soon as I use the _map.GeoToWorld(latlon) method the precision is lost.
How would I go about keeping adequate precision? Is there a way I can use the marker as the reference point and the radius as the offset, and get the relative Unity world coordinate distance (of the radius) that way? I know you can also store scale relative to the Mapbox tiles, but I'm not sure how I can convert that back to a Unity world distance. I'm operating on very small distances, so any warping due to lat/long being a Mercator projection can probably be ignored.
I figured out a roundabout solution.
First I convert the meters into unity world space using whatever IMapScalingStrategy Mapbox is currently using.
Then I convert from world to the view space of whatever camera I want to scale to the given bounds.
After that, I find the scale of the bounds by solving:
UnityRelativeScaleChange = 2^(MapZoomLevelChange), which (by my estimation) is the relationship between Unity scale and Mapbox zoom levels.
This solution works great as long as you don't have to zoom in/out by too much. Otherwise you'll run into precision problems, since the functions rely on the relative view-based size of the given bounds to do their calculations, which leads to unstable results if those bounds initially take up a tiny portion of the screen.
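For illustration, here is a minimal Python sketch of that zoom/scale relationship (the function is mine, not part of the Mapbox SDK):

```
def relative_scale_change(zoom_before, zoom_after):
    """Each Mapbox zoom level doubles (or halves) the ground size of a tile,
    so the Unity-space scale of something pinned to the map changes by a
    factor of 2 per zoom level."""
    return 2 ** (zoom_after - zoom_before)

# Example: zooming in by 3 levels should make the drawn ellipse 8x larger.
print(relative_scale_change(14, 17))  # 8
```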

Moving around the surface of an Earth shaped spheroid in Unity

I'm trying to make a Unity game that allows the user to explore the surface of an Earth shaped spheroid, based on WGS84.
The project so far is on Github, and there's a YouTube video of this behaviour.
A shape the size of Earth is way too big for Unity, so I just spawn tiles near the user, offset so that the first tile is at Unity's origin point. This bit works.
The issue is moving around. I've been using an approach where I get the user's position in ECEF coordinates, then normalise that to provide the global orientation for the player, then I translate the player forward based on that and their rotation.
The issue with this is that normalising the ECEF coordinate means that the player is moving along a sphere, but the WGS84 spheroid is not perfectly spherical. So the player sinks into the floor or flies up as you go south or north, respectively.
My question is, how can I allow the user to move around the surface of the spheroid by way of translation? I feel like there might be some way of taking the major/minor axis of the spheroid into account as the player moves, but I'm not sure how to do that.
I have no experience with Unity or computer graphics, I'm approaching it purely from the navigation point of view.
Let's look at the real world.
We want to travel either by walking/driving on the surface or flying at some altitude. When we do it, we move in the local coordinate system (North-South, East-West, Up-Down), we can't see any curvature. We assume the Earth is flat.
The problem arises when we try to do it on a computer, which is ruthlessly precise and knows the shape of the Earth. We can't assume the Earth is flat, we can't assume the Earth is a sphere. The Earth is a geoid. Fortunately for some purposes we can simplify things and assume the Earth is an ellipsoid. You chose WGS84. Good!
So how do we move around an ellipsoid? Solving the problem analytically is a nightmare. We have to cheat ;)
We should assume the Earth is flat for a moment, make a move in a chosen direction in the local coordinate system, write down the altitude of the new position, calculate the global geodetic coordinates (Lat, Long, Alt) of that new point, and then replace the altitude with the one obtained while using the local coordinate system. In other words: each time we move forward along a perfectly straight line and diverge from the ellipsoid (just a tiny bit), we force the altitude not to change in relation to the ellipsoid.
Implementation.
You need to be able to freely translate coordinates between geodetic (Lat, Long, Alt) and ECEF. Going from geodetic to ECEF is easy. Finding geodetic coordinates for a given ECEF position is much more complex; there are many different algorithms, but you should be able to find a ready-to-use implementation somewhere.
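For reference, the easy direction (geodetic to ECEF on WGS84) is just a few lines; here is a Python sketch of the standard closed-form formula:

```
import math

WGS84_A = 6378137.0                 # semi-major axis (metres)
WGS84_F = 1 / 298.257223563         # flattening
WGS84_E2 = WGS84_F * (2 - WGS84_F)  # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Closed-form geodetic (Lat, Long, Alt) -> ECEF (x, y, z) on WGS84."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1 - WGS84_E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z
```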
What you also need is a Local Tangent Plane coordinate system; to be precise, you are going to use NED (North-East-Down).
Let's assume your object is initially at some geodetic position. You write down the altitude (relative to the ellipsoid). Then you create a local NED coordinate system with its origin at that point. Then you move the object in that local coordinate system. You write down how much the altitude (or rather the Down coordinate) changed. Then you must calculate the ECEF coordinates of that new position and transform it to geodetic (Lat, Long, Alt). You have the old altitude, you have the altitude change in the NED coordinates, which means you know the new altitude. You then apply that altitude to your new geodetic coordinates (brutally replace the Alt in Lat/Long/Alt with a new value).
Then you make another move in the NED coordinates defined for that new position. And so on...
I'm not sure if this is clear; the process is quite complicated. If you can't follow it, shout :)
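For concreteness, here is a rough Python sketch of one movement step, assuming the pymap3d package for the coordinate conversions (the helper name and step sizes are mine, not from the original project):

```
import pymap3d as pm  # geodetic <-> ECEF <-> NED conversions, WGS84 by default

def step_on_ellipsoid(lat, lon, alt, north_m, east_m, climb_m=0.0):
    """Move a short distance in the local NED frame anchored at (lat, lon, alt),
    then force the altitude so the object follows the ellipsoid instead of
    drifting off along the local tangent plane."""
    # 1. Make the move in the flat local coordinate system (metres, Down is negative climb).
    new_lat, new_lon, _tangent_alt = pm.ned2geodetic(
        north_m, east_m, -climb_m, lat, lon, alt)
    # 2. Brutally replace the altitude: old altitude plus any deliberate climb.
    return new_lat, new_lon, alt + climb_m

# Walk 1000 steps of 1 m due north, starting near the equator, staying on the surface.
lat, lon, alt = 0.0, 0.0, 0.0
for _ in range(1000):
    lat, lon, alt = step_on_ellipsoid(lat, lon, alt, north_m=1.0, east_m=0.0)
```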

iOS is it possible to convert CLLocation into some sort of XYZ metric coordinate system?

I'm building an augmented reality game, and working with CLLocation is rather cumbersome.
Is there some way to locally approximate CLLocation as XYZ coordinate, expressed in meters with the origin starting at some arbitrary point (for example the initial position when the game was started)?
Let's say I'm working with a 1 mile radius and do not really care about the curvature of the Earth. Is it possible to approximate or somehow simplify the location-based calculations for local position tracking?
Alternatively, is there a coordinate system that can be used with CLLocation that also incorporates the roll, pitch, yaw of the CMAttitude as well as compass orientation?
Clarification: As far as I understand, the problem with latitude and longitude is that their units vary in size, depending on the position on the globe. I should've specified that X,Y,Z should be in standard units, like meters or feet.
Thank you!
The Haversine formula may be useful.
I found a good article on it at http://www.jaimerios.com/?p=39 with code examples.
You could get the initial point at the app's launch and calculate the relative points based on the user coordinates as he or she moves. Admittedly, this is not super elegant, but if you are just trying to do some simple comparisons based on the user's location relative to an arbitrary origin, this should work. For the Z, Alex Stone's suggestion of calculating it based on the altitude should be fine.
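As a rough sketch of that idea in Python (rather than iOS code; the function names are mine), you can turn a lat/long into local X/Y metres relative to the launch point like this:

```
import math

R = 6371000.0  # mean Earth radius in metres; fine for a ~1 mile play area

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/long points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def to_local_xy(origin_lat, origin_lon, lat, lon):
    """X (east) / Y (north) metres from the origin, computed by applying the
    haversine formula along each axis separately; adequate for small radii."""
    x = math.copysign(haversine_m(origin_lat, origin_lon, origin_lat, lon), lon - origin_lon)
    y = math.copysign(haversine_m(origin_lat, origin_lon, lat, origin_lon), lat - origin_lat)
    return x, y
```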

How to determine relative position using accelerometer and gyro data

I am designing a robot and need to track the distance and direction of its motion. Nothing in 3D; I only need x, y, and the angle in the x-y plane.
My question:
Is it possible to use a gyro and accelerometer with Kalman filtering or any other method to track this? (I do not have motor encoders.)
My constraints: I do not have space to include a GPS (due to power requirements) or motor encoders (due to motor support).
No, not really. If you integrate the accelerometer values twice you get position but the error is horrible. It is useless in practice.
Here is an explanation why (Google Tech Talk) at 23:20.
A related question is probably this.
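To see why, here is a toy Python simulation (my own illustration, not from the talk): even a small constant accelerometer bias, integrated twice, grows quadratically into a huge position error.

```
import numpy as np

dt = 0.01                      # 100 Hz samples
t = np.arange(0, 60, dt)       # one minute, robot actually standing still
bias = 0.05                    # 0.05 m/s^2 accelerometer bias (quite optimistic)
accel = np.full_like(t, bias) + np.random.normal(0, 0.02, t.size)

velocity = np.cumsum(accel) * dt     # first integration
position = np.cumsum(velocity) * dt  # second integration

# Error grows as roughly 0.5 * bias * t^2, i.e. about 90 m after one minute.
print(f"Position error after 60 s: {position[-1]:.1f} m")
```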

How to determine absolute orientation

I have an xyz accelerometer and magnetometer. Now I want to determine the orientation of the device using both. The problem I see is that, depending on the device orientation, I'd need to use the sensors in a different order.
Let me give an example. If I have the device facing me then changes in both the roll and pitch can be determined with the accelerometer. For yaw I use the magnetometer.
But if I put the device horizontally (i.e. turn it 90°, facing the ceiling), then any change in the up vector (now horizontal) isn't noticed, as the accelerometer doesn't detect any change. This can now be detected with the magnetometer.
So the question is, how to determine when to use one or the other. Is this enough with both sensors or do I need something else?
Thanks
The key is to use the cross product of the two vectors, gravity and the magnetometer reading. The cross product gives a new vector perpendicular to them both, which means it is horizontal (perpendicular to down) and 90 degrees away from north. The original two vectors are a little ugly to work with directly because they are not perpendicular to each other, but that is easy to fix: if you cross this new vector back with the gravity vector, you get a third vector perpendicular to both the gravity vector and the horizontal one. Now you have three mutually perpendicular vectors that define your 3D orientation coordinate system. The original accelerometer (gravity) vector defines Z (up/down), and the two cross-product vectors define the east/west and north/south components of the orientation.
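In code, that cross-product construction looks roughly like this (a minimal Python/NumPy sketch; the sign conventions are an assumption and vary between devices):

```
import numpy as np

def orientation_basis(accel, mag):
    """Build an orthonormal north/east/down basis from raw accelerometer and
    magnetometer vectors. Assumes the device is at rest, so the accelerometer
    reading approximates the gravity direction; signs vary by platform."""
    accel = np.asarray(accel, dtype=float)
    mag = np.asarray(mag, dtype=float)
    down = accel / np.linalg.norm(accel)  # gravity defines "down"
    east = np.cross(down, mag)            # horizontal and 90 degrees from north
    east /= np.linalg.norm(east)
    north = np.cross(east, down)          # completes the mutually perpendicular set
    # Stacking these as rows gives a rotation matrix from the body frame to N/E/D.
    return north, east, down
```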
Here is some documentation that walks through this project. As is clear from other answers, the math can be tricky.
http://www.freescale.com/files/sensors/doc/app_note/AN4248.pdf
I think the question "how to determine when to use one or the other" is misguided. You should always use both sensors for orientation. There are cases where one of them is useless. However, these are edge cases.
If I understand you correctly, you'll need something to detect pitch (tilting) and orientation according to the cardinal points (North, East, South and West).
The pitch can be read from the accelerometer.
The orientation according to the cardinal points can be read from a compass.
Combining the output from these two sensors correctly with the right math in your software will most likely give you the absolute orientation.
I think it's doable that way.
Good luck.
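As a rough Python sketch of "the right math" (my own illustration of the usual tilt-compensation approach; axis and sign conventions differ between devices, so treat it as a template to calibrate):

```
import math

def pitch_roll_from_accel(ax, ay, az):
    """Tilt angles (radians) from an accelerometer reading taken at rest.
    Assumed axis convention: x forward, y right, z down."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return pitch, roll

def tilt_compensated_heading(mx, my, mz, pitch, roll):
    """De-rotate the magnetometer reading into the horizontal plane, then
    take the compass heading from its horizontal components."""
    bfx = (mx * math.cos(pitch)
           + my * math.sin(pitch) * math.sin(roll)
           + mz * math.sin(pitch) * math.cos(roll))
    bfy = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-bfy, bfx)
```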
In the event you still need absolute orientation, you can check out this breakout board from Adafruit: https://www.adafruit.com/products/2472. The nice thing about this board is that it has an ARM Cortex-M0 processor to do all of the calculations for you.