OSRM: Calculate the distance between two points by setting the bearings

I need to calculate the distance between two cameras. I know the azimuths (bearings) of these cameras, so I set two values in the bearings parameter:
URL = http://router.project-osrm.org/route/v1/driving/37.803292,55.810219;37.624094,55.730463.json?steps=false&geometries=geojson&annotations=true&overview=full&bearings=86,0
When I open the URL, I get the following error: {"message":"Number of elements in bearings size 1 does not match coordinate size 2","code":"InvalidOptions"}

Each bearing is specified with two values: a degree and a bearing range.
For example, bearings=86,0 means a bearing of 86 degrees with no range,
and bearings=100,20 means the bearing lies in the range from 80 to 120 degrees.
Since your URL contains two coordinates, you have to add a second bearing as well,
something like bearings=86,10;0,10, where the range is +/-10 degrees around the given value.
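
For illustration, here is a minimal Python sketch of the corrected request (the +/-10 degree range is just an example value; the distance field of the route service's response is in meters):

import requests

# One "{value},{range}" bearing pair per coordinate, separated by ';'
url = ("http://router.project-osrm.org/route/v1/driving/"
       "37.803292,55.810219;37.624094,55.730463"
       "?steps=false&geometries=geojson&annotations=true&overview=full"
       "&bearings=86,10;0,10")
resp = requests.get(url).json()
if resp.get("code") == "Ok":
    # Route distance in meters, as returned by OSRM's route service
    print("distance (m):", resp["routes"][0]["distance"])
else:
    print("OSRM error:", resp)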

Related

How to get the distance between 2 points in the real world irrespective of difference in height?

I am using a pair of DW1000 UWB sensors and am able to get an accurate distance between them.
How can I get rid of the (z1-z2) term in the final distance? That is, if both sensors are fixed at (x1,y1) and (x2,y2) respectively, how do I ensure that the reported distance stays constant even if I move the tags up or down?
You need to give the (z1-z2) information to the anchor to calculate the horizontal distance, using Pythagoras' theorem. If z1-z2 is unknown, you need more sensors.
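As a sketch of that Pythagoras step (assuming the height difference z1-z2 is known and given in the same unit as the UWB range):

import math

def horizontal_distance(slant, z1, z2):
    # slant^2 = horizontal^2 + (z1 - z2)^2, so:
    dz = z1 - z2
    return math.sqrt(max(slant**2 - dz**2, 0.0))

# e.g. a 5.0 m UWB reading with a 1.2 m height difference
print(horizontal_distance(5.0, 1.5, 0.3))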

Finding the Lateral Distance between two GPS Points

I have two sets of GPS points recorded from high-precision GPS receivers travelling from point A to point B; please see the attached image.
The black points are the GPS points of vehicle 1 and its path; the blue points are the GPS points of vehicle 2 and its path. Vehicle 2 should follow the same path as vehicle 1, but in practical scenarios there is some deviation, so I need to calculate how much it deviates.
I am trying to find the lateral distance between each vehicle 1 GPS point and the nearest vehicle 2 GPS point.
What I have done so far:
*) Since vehicle 1 is ahead of vehicle 2, and vehicle 2 reaches vehicle 1's position (approximately) after some time (say a buffer of 5-10 seconds), I calculate the arc distance between a vehicle 1 GPS point and a set of vehicle 2 GPS points (the buffer) and find the minimum of those arc distances.
*) The minimum arc distance gives me the vehicle 2 GPS point nearest to the vehicle 1 GPS point. Now I am stuck at finding the lateral distance between these two GPS points in an efficient manner.
Please let me know if you have any questions or comments on my procedure.
I assume both routes start from the same position, so I would do the following:
Resample each route to get a new set of points with a known sampling step from one another (you can interpolate the new points from the old ones; see for example https://www.mathworks.com/matlabcentral/answers/278615-how-to-create-points-set-on-2d-polyline).
Once you have two sets of points that start from the same location and have a constant sampling step, just calculate the distance between each pair of corresponding points, as in the sketch below.
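A minimal Python sketch of this resample-then-compare idea, assuming the points have already been projected to a metric coordinate system (for raw lat/lon you would use an arc-distance formula instead):

import numpy as np

def resample_polyline(points, step):
    # Resample an (N, 2) polyline at a constant arc-length step.
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])  # arc length at each vertex
    targets = np.arange(0.0, s[-1], step)
    x = np.interp(targets, s, points[:, 0])
    y = np.interp(targets, s, points[:, 1])
    return np.column_stack([x, y])

def deviations(route_a, route_b, step=1.0):
    # Lateral deviation: distance between corresponding resampled points.
    a = resample_polyline(route_a, step)
    b = resample_polyline(route_b, step)
    n = min(len(a), len(b))
    return np.linalg.norm(a[:n] - b[:n], axis=1)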

Calculating the scores using MATLAB

I am working on calculating the scores for an air rifle paper target. I'm able to calculate the distance from the center of the image to the center of the bullet hole in pixels.
Here's my code:
I = imread('Sample.jpg');
RGB = imresize(I, 0.9);
imshow(RGB);
% Binarize with Otsu's threshold (graythresh(RGB) is equivalent to the
% original graythresh(getimage) but does not depend on the current axes)
bw = im2bw(RGB, graythresh(RGB));
figure, imshow(bw);
% Fill holes so each bullet hole becomes a solid blob
bw2 = imfill(bw, 'holes');
% Centroid of each blob, in [x y] (column, row) order
s = regionprops(bw2, 'centroid');
centroids = cat(1, s.Centroid);
% size() returns [rows cols], i.e. [y x], so flip it to match the
% [x y] order of the centroids before measuring distances
imageCenter = fliplr(size(bw)) / 2;
dist_from_center = sqrt(sum((centroids - imageCenter).^2, 2));
hold(imgca, 'on');
plot(imgca, centroids(:,1), centroids(:,2), 'r*');
hold(imgca, 'off');
numberOfPixels = numel(I);
Number_Of_Pixel = numel(RGB);
This is the raw image with one bullet hole.
This is the result I am getting.
This is the paper target I'm using to get the score.
Can anyone suggest how to calculate the score from this?
See my walkthrough of your problem in Python.
It's a very fun problem you have.
I assume you already have a way of getting the binary hole mask (since you gave us the image).
Some scores are wrong because of target-centering issues in the given image.
Given the hole mask, find the 2D shot center
I assume that the actual images would include several holes instead of one.
Shot locations are extracted by computing the local maxima of the distance transform of the binary hole image. Since the distance transform outputs, for each pixel, the distance from that pixel to the nearest border, the centermost pixels of each hole show up as local maxima.
The local-maximum technique I used computes a maximum filter of the image with a given size (10 in my case) and keeps the pixels where filtered == original, as sketched below.
You have to remove the 0-valued "maxima", but apart from that it's a nice trick to remember, since it works in N dimensions by using an N-dimensional maximum filter.
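A minimal SciPy sketch of that trick (the filter size of 10 matches the value above; hole_mask is assumed to be a binary image that is True inside the holes):

import numpy as np
from scipy import ndimage

def shot_centers(hole_mask, size=10):
    # Distance from each hole pixel to the nearest background pixel;
    # the centermost pixels of each hole get the largest values.
    dist = ndimage.distance_transform_edt(hole_mask)
    # A pixel is a local maximum if it survives a maximum filter unchanged.
    maxima = ndimage.maximum_filter(dist, size=size) == dist
    # Remove the 0-valued "maxima" outside the holes.
    maxima &= dist > 0
    return np.argwhere(maxima)  # (row, col) of each shot center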
Given a 2D position of shot center, compute the score
You need to transform your coordinate system from Cartesian (X,Y) to polar (distance, angle).
Image from MathWorks to illustrate the math.
To use the center of the image as the reference point, offset each position by the image-center vector.
Discarding the angle, your score is directly linked to the distance from center.
Your score is an integer that you need to compute based on the distance:
As I understand it, you score 10 if you are at distance 0, and the score decreases from there down to 0 points.
This means the scoring function is
border_space = 10  # px between consecutive rings; up to you to find it :)
score = 10 - (distance // border_space)  # // is integer division
with the added constraint that the score cannot be negative:
score = max(10 - (distance // border_space), 0)
Really do look through my IPython notebook, it's very visual.
Edit: regarding the distance conversion.
Your target-practice image is in pixels, but those pixel distances can be mapped to millimeters: you probably know what your target's size is in centimeters (it's regulation size, right?), so you can set up a conversion rate:
target_size_mm = 1000 # 1 meter = 1000 millimeters
target_size_px = 600 # to be measured for each image
px_to_mm_ratio = target_size_mm / target_size_px
object_size_px = 102 # any value you want to measure really
object_size_mm = object_size_px * px_to_mm_ratio
Every time you're thinking about a facet of your problem, ask yourself: "Is what I'm looking at in pixels or in millimeters?" Try to conceptually separate the code that works in pixels from the code that works in millimeters.
It is coding best practice to avoid such assumptions where you can, so that if you get a bunch of images from different cameras with different properties, you can convert everything to a common unit (millimeters) and treat the data uniformly afterwards.

QGIS: Creating regular equidistant points with distance in meters

I need a set of equidistant points (like a raster), each 1 km from the next, over a certain area, in my case Germany. I know the "regular points" feature in QGIS can do just that, but the distance you enter there is in map units rather than meters. I tried it in WGS 84 as well as in EPSG:31467 (Gauss-Krueger, common in Germany), which is supposed to be in meters, but the results seem to be the same.
I just can't find a way to set a certain distance in meters.
Please help :) Thanks!
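
One way to sidestep this, sketched below, is to generate the grid outside QGIS in a CRS whose units really are meters (EPSG:31467 here) and reproject the points to WGS 84 afterwards; the extent values are placeholders, not Germany's actual bounding box:

import numpy as np
from pyproj import Transformer

# Regular 1 km grid in Gauss-Krueger zone 3 (units: meters)
xmin, xmax = 3400000.0, 3600000.0  # placeholder extent
ymin, ymax = 5500000.0, 5700000.0
xs = np.arange(xmin, xmax, 1000.0)  # 1000 m step
ys = np.arange(ymin, ymax, 1000.0)
xx, yy = np.meshgrid(xs, ys)

# Reproject the grid points to WGS 84 for use in QGIS
to_wgs84 = Transformer.from_crs("EPSG:31467", "EPSG:4326", always_xy=True)
lon, lat = to_wgs84.transform(xx.ravel(), yy.ravel())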

OpenNI range of returned coordinates

I am using the HandsGenerator class of OpenNI, and I want to use it to track users' movements.
I've registered my own callback for getting the updated position of the hand, and everything works fine, except that I can't find information about the coordinate system of the returned XnPoint3D. Is there a spec somewhere that precisely specifies the X, Y, Z ranges, and perhaps scaling information (so that I would know that, say, a change of 100 in the XnPoint3D's X corresponds to a movement of 10 centimeters)?
The HandsGenerator returns real-world coordinates in millimeters from the sensor. This means that depth points right in the middle of the depthmap will have an X and Y of 0.
A change of 100 (in X, Y, or Z) is indeed a change of 10 centimeters (100mm = 10cm).
The range of the X and Y values depends on the Z value of the hand point. Assuming you have a hand point at the top left of the depthmap (or (0,0) in projective coordinates), the possible X and Y values depend on how far away the hand is: the closer the hand, the smaller X and Y. To get the maximum range your hand positions can cover, choose an arbitrary max Z value and then find the X & Y values of the corners of the depthmap at that distance. In other words, convert the projective coordinates (0,0,maxZ) and (DepthmapWidth,DepthmapHeight,maxZ) to real-world coordinates. All hand points with a Z value less than maxZ will fall between those two real-world coordinates.
Note that you can convert projective coordinates to real world using DepthGenerator::ConvertProjectiveToRealWorld.
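To illustrate the geometry only (not the actual OpenNI API), here is a rough Python sketch of the projective-to-real-world conversion, assuming a simple pinhole model with the Kinect's nominal field of view; in real code you would call DepthGenerator::ConvertProjectiveToRealWorld instead:

import math

H_FOV = math.radians(57.0)  # assumed horizontal field of view
V_FOV = math.radians(43.0)  # assumed vertical field of view
WIDTH, HEIGHT = 640, 480    # assumed depthmap size

def projective_to_real_world(u, v, z_mm):
    # (u, v) in pixels, z in millimeters; the center of the depthmap
    # maps to X = Y = 0, matching the convention described above.
    x = (u / WIDTH - 0.5) * z_mm * 2.0 * math.tan(H_FOV / 2.0)
    y = (0.5 - v / HEIGHT) * z_mm * 2.0 * math.tan(V_FOV / 2.0)
    return x, y, z_mm

# Extent of possible hand positions at a chosen max depth of 2 m:
max_z = 2000.0  # mm
top_left = projective_to_real_world(0, 0, max_z)
bottom_right = projective_to_real_world(WIDTH, HEIGHT, max_z)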