I am trying to create an AR application that will display objects hovering right on top of the drone to improve the user's flying experience (e.g. show an arrow or gimbal on your AR goggles that follows the drone and shows where the drone is and what its attitude is when it is far away).
For this I need the X and Y coordinates of the drone so I can keep the arrow on top of it as it moves around. However, the GPS coordinates returned by the drone through the SDK are only accurate to 1.5 m or worse.
Is there any better way to determine the drone's position, or to retrieve some X and Y values through the DJI Mobile SDK?
I am using a DJI Mini 2 and Microsoft HoloLens 2 goggles.
Thank you in advance.
No, you can't get better position info.
You need an RTK drone to do that.
I don't understand why you can't work with 2 m accuracy. If the drone is 50 meters away, a 1 m error won't matter that much, will it?
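Even with the stock GPS accuracy, you still need to turn the drone's latitude/longitude into local X/Y meters around the pilot before you can place an AR marker. A minimal sketch of that projection (an equirectangular approximation, fine over short drone ranges; the function name and reference-point convention are my own, not part of the DJI SDK):

```python
import math

def gps_to_local_xy(lat, lon, ref_lat, ref_lon):
    """Project a GPS fix to local (east, north) meters around a
    reference point, using an equirectangular approximation that is
    accurate enough over a few hundred meters."""
    R = 6371000.0  # mean Earth radius in meters
    x = math.radians(lon - ref_lon) * R * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * R
    return x, y

# A fix 0.001 degrees north of the reference is roughly 111 m north
print(gps_to_local_xy(48.0010, 11.0, 48.0, 11.0))
```

The reference point would typically be the HoloLens wearer's own position, so the returned X/Y can be fed straight into the AR overlay's coordinate frame.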
Related
The new Google Maps app allows you to select a destination, and then go into a mode where the map is constantly rotated in the direction you're facing, and it never asks you to calibrate the compass. If you spin in a circle, the map will spin with you, accurately rotating the map so that the direction you're facing always correlates with "up" on the map. As far as I know, this wasn't possible before. How do they do this?
The compass app requires you to calibrate it when you open it, but for some reason Google Maps does not.
Modern phones have sensors, an accelerometer and a gyroscope, which are used to detect changes in orientation, direction, and acceleration (maybe other stuff too). The app probably fuses the information from these sensors and orients the map accordingly. I believe Compass Mode has been in Google Maps for a while, though.
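The usual way to fuse those sensors for heading is a complementary filter: the gyroscope is smooth but drifts, the magnetometer is absolute but noisy, so you blend them. A minimal sketch (the function name, the 0.98 blend factor, and the degrees-based units are illustrative assumptions, and it ignores 360-degree wrap-around for simplicity):

```python
def fuse_heading(prev_heading, gyro_rate, mag_heading, dt, alpha=0.98):
    """Complementary filter for heading (degrees): trust the
    gyro-integrated estimate short-term and the magnetometer long-term.
    gyro_rate is angular velocity in deg/s, dt the sample interval in s."""
    gyro_heading = prev_heading + gyro_rate * dt  # integrate the gyro
    return alpha * gyro_heading + (1.0 - alpha) * mag_heading

# Standing still with agreeing sensors: the estimate stays put
print(fuse_heading(90.0, 0.0, 90.0, 0.02))
```

With a high `alpha`, a momentary magnetometer glitch barely moves the heading, which is likely why the map rotation feels stable without an explicit calibration step.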
I have an idea for an iPhone game/app that needs to track the height position of the iPhone. I am new to iPhone development, so I don't know how the accelerometer works. The idea is that the user places the iPhone on a flat surface (with the iPhone's back against the surface). The user will then lower and raise the surface periodically, and the iPhone should track this movement. We can assume that the surface goes back to its original position, so we only care about how much it was lowered/raised from its original position during the movement.
The amount raised/lowered will be a few centimeters. Is it possible to track this, and how would you go about solving it?
Thank you very much for your help!
Best regards,
Lukas
This is not possible to track directly. However, the accelerometer data can be used to approximate it. Acceleration is the time-derivative of velocity, which is itself the time-derivative of position. By integrating the acceleration twice, you can estimate position.
Caveat though: this will probably not be very accurate, with significant drift errors.
You can also track orientation with the magnetometer, and you can use the camera to "watch" the environment. This suggests the possibility of fixing the position by triangulation.
I don't expect that to be easy though.
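The double integration described above can be sketched in a few lines. This is a minimal illustration, not an app-ready solution: it assumes you already have gravity-compensated acceleration samples (m/s²) along the vertical axis at a fixed sample interval, and it will drift in any real recording:

```python
def double_integrate(samples, dt):
    """Estimate displacement by integrating acceleration twice.
    samples: gravity-compensated accelerations in m/s^2,
    dt: fixed sample interval in seconds."""
    velocity = 0.0
    position = 0.0
    for a in samples:
        velocity += a * dt        # first integral: acceleration -> velocity
        position += velocity * dt  # second integral: velocity -> position
    return position

# Accelerate upward at 1 m/s^2 for 0.5 s, then decelerate at -1 m/s^2:
# the device ends up 0.25 m higher and momentarily at rest
dt = 0.01
samples = [1.0] * 50 + [-1.0] * 50
print(double_integrate(samples, dt))
```

Because any constant bias in the samples grows quadratically in the position estimate, a few-centimeter movement is right at the edge of what this can resolve; resetting the estimate whenever the surface is known to be back at rest helps a lot.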
I need to make an app in which the user can find the walking distance between two points.
The concept: the user starts the app and starts walking; after taking a few steps he clicks a button, which shows him the distance traveled from the point where he started the app to where he stopped.
I know we can find the distance between two locations by using the CLLocation class, but the challenge is to get measurements accurate to about 3 meters.
I am not even sure we can use the accelerometer, since at rest the accelerometer detects only the acceleration due to gravity. This obviously sets the walking distance to 0, so I don't know how I will detect the starting point.
Any hint/suggestion on the same would be a great help to me.
Check this: iphone accelerometer speed and distance
As far as I know it is almost impossible to get accurate results about distance, using the accelerometer.
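For reference, the distance CLLocation gives you between two fixes is essentially the great-circle distance between their coordinates. A sketch of that computation (the haversine formula; the function name and the test coordinates are my own):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    R = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Two points 0.001 degrees of latitude apart: roughly 111 m
print(round(haversine_m(48.0, 11.0, 48.0010, 11.0)))
```

The formula itself is exact to well under a meter at walking scales; the 3 m target fails not because of the math but because each GPS fix carries several meters of error of its own.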
I am using Xbox-Unity and am trying to make a Kinect game. I need to know when a player's foot is in the air and when it comes back down to the ground. I thought this would be as simple as tracking the joint positions, but the foot's Y changes based on its proximity to the Kinect camera (taking the foot joint position from the Kinect). If I lifted my left foot far away from the camera, its Y would be high (say 10). If it were to land close to the camera, the Y would be low (say -20). What I had hoped was that I could just say 0 is the floor and have an easy time knowing when a foot was in the air and when it was on the ground. Does anybody have any ideas on how I can correctly tell when a foot is grounded? (Everything I can think of so far has at least one exception that would break the gameplay.)
Edit: I used a point-to-plane equation, but no matter what I do, the distance to the floor is always different based on my proximity to the camera.
One possibility would be to compare it to the other foot: if one is higher than the other, chances are the player is standing on the lower one. If you're looking to detect jumps, you should be able to find a sudden change in the Y position of both feet.
There's also the Floor Clipping Plane, but that involves some more complicated math from what I've seen. Check out the Kinect programming guide, which is super old but I think should still be relevant here. The section "Floor Determination" is what you're after.
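The math for the floor-plane approach is just a signed point-to-plane distance, and the common pitfall (which could explain the proximity-dependent results in the edit) is forgetting to normalize by the plane normal's length. A sketch, assuming a plane given as (A, B, C, D) with the joint in the same camera coordinate space (the function name and example numbers are illustrative):

```python
import math

def distance_to_floor(joint, plane):
    """Signed distance from a joint position (x, y, z) to the floor
    plane Ax + By + Cz + D = 0; positive means above the floor.
    Dividing by the normal's length makes the result a true distance
    in the sensor's units, independent of how the plane is scaled."""
    A, B, C, D = plane
    x, y, z = joint
    return (A * x + B * y + C * z + D) / math.sqrt(A * A + B * B + C * C)

# Simple case: horizontal floor 1 m below the camera origin
print(distance_to_floor((0.0, 0.0, 2.0), (0.0, 1.0, 0.0, 1.0)))
```

Because the plane from the sensor already accounts for camera tilt and height, a foot resting on the floor should give a distance near zero anywhere in the play space, which makes "grounded" a simple threshold test.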
I have been doing a bit of research, but I cannot seem to find a way to determine small distances (centimeters and meters) using the sensors in Android or iOS devices.
Bluetooth appears too inaccurate and requires more than one device; GPS only works over larger variations in distance, and small variations in rotation seem to make using the accelerometer nearly impossible.
Is there a method I am unaware of that would allow me to do such a thing? I am familiar with calculus, so using integrals to determine distance based on changes in time and velocity/acceleration is not a problem for me; I just do not know how to determine those things.
Thank you.
There's no sensor in these devices which is able to give you the desired accuracy without exterior help.
If your use case allows for a bit of external setup, here are some ideas:
You could use the camera and computer vision to calculate device movement. You could, for example, use ARToolkit to measure the distance to a visual tag fixed to a wall. At close range you can get pretty high accuracy (millimeters) using this technique.
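The core of tag-based ranging is the pinhole camera model: a tag of known physical size appears smaller in the image the farther away it is. A minimal sketch of that relation (the function name and example numbers are illustrative; a real toolkit would also handle lens distortion and tag pose):

```python
def distance_from_tag(focal_px, tag_size_m, tag_size_px):
    """Pinhole-camera distance estimate to a tag of known size.
    focal_px: focal length in pixels (from camera calibration),
    tag_size_m: real-world tag width in meters,
    tag_size_px: measured tag width in the image in pixels."""
    return focal_px * tag_size_m / tag_size_px

# A 10 cm tag appearing 200 px wide through an 800 px focal length
print(distance_from_tag(800.0, 0.10, 200.0))  # -> 0.4 (meters)
```

Accuracy falls off with distance because one pixel of measurement error corresponds to more and more real-world size, which is why this works best up close.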
Another idea would be to measure the distance to a solid object, like a wall, by emitting a short audio signal using the speaker and measure the time until the echo arrives at the microphone. This would be more of a research project, though.
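The echo idea reduces to one formula: distance is the speed of sound times half the round-trip time. A sketch, assuming you have already measured the delay between emitting the chirp and hearing the echo (the temperature-dependent speed model is a standard approximation; the function name is my own):

```python
def echo_distance_m(round_trip_s, temp_c=20.0):
    """Distance in meters to a reflecting surface from an echo's
    round-trip time. Speed of sound in air is approximately
    331.3 + 0.606 * T m/s at temperature T in Celsius."""
    speed = 331.3 + 0.606 * temp_c
    return speed * round_trip_s / 2.0  # halve: sound travels there and back

# A 10 ms round trip at 20 degrees C is roughly 1.72 m
print(round(echo_distance_m(0.010), 3))
```

The hard part in practice is not this arithmetic but detecting the echo reliably against room noise and speaker/microphone latency, which is why it leans toward a research project.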
You CAN use the accelerometer to measure distance travelled (if only absolute displacement is involved).
Have the user hold the device flat and walk from point A to point B. The user presses a "Start" button in your app as he starts from A and presses an "End" button as he reaches B.
Calculate the double integral of AccelX and AccelY separately over time between the two button presses. These will be distX and distY respectively. Total displacement will be sqrt(distX^2 + distY^2).
Good luck!
Regards,
CVS#2600Hertz
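The procedure above can be sketched as follows. This is only an illustration of the arithmetic, not a working pedometer: it assumes gravity-compensated X/Y acceleration samples at a fixed interval, and in practice sensor bias makes the double integral drift quadratically with time:

```python
import math

def displacement(accel_x, accel_y, dt):
    """Double-integrate X and Y acceleration separately between the
    Start and End button presses, then combine into total displacement."""
    def integrate_twice(samples):
        v = d = 0.0
        for a in samples:
            v += a * dt   # acceleration -> velocity
            d += v * dt   # velocity -> position
        return d
    dx = integrate_twice(accel_x)
    dy = integrate_twice(accel_y)
    return math.sqrt(dx * dx + dy * dy)

# Symmetric accelerate/decelerate pulses: 1 m east, 0.5 m north
ax = [1.0] * 100 + [-1.0] * 100
ay = [0.5] * 100 + [-0.5] * 100
print(displacement(ax, ay, 0.01))
```

As the answers above note, on real hardware the drift swamps the signal within seconds, which is why step counting plus stride length tends to work better for walking distance.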
Just as a thought experiment, you should be able to do this using a combination of the accelerometer and the compass on each device.
However, whether the accuracy of these sensors is enough for what you want to do... well, I think you'd just have to try it.