Getting angle between line and x-axis (horizontal top of screen) - javafx-8

I have created a basic program which draws shapes using a GraphicsContext. A line object holds its own start and end points. I would like to work with the standard first-quadrant values rather than the screen coordinates Java uses.
I have tried creating Point2D objects, which gave me incorrect values. I have also tried atan2 and atan, which return the same value no matter which line I pass them. I cannot find my mistake, whether it is in my code or my math. Any suggestions will be appreciated.
// First attempt: compute the slope, then take atan of it
double slope = (this.getEndY() - this.getyCoordinate()) / (this.getEndX() - this.getxCoordinate());
return MyLine.description + Math.toDegrees(Math.atan(slope));
This attempt uses Point2D with three points:
Point2D point  = new Point2D(1, 0); // x-axis
Point2D point1 = new Point2D(this.getxCoordinate(), this.getyCoordinate()); // p1 (0,0)
Point2D point2 = new Point2D(this.endX, this.getEndY()); // p2 (bottom right corner of window)
double angle = point1.angle(point, point2);
return MyLine.description + " " + angle;
The first attempt is a solution I already found on this site, which returns NaN.
The second is an attempt to use the Point2D API, which returns 0.0.
I am expecting it to read 45.0.
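For reference, a minimal sketch of the atan2 approach, assuming the same getters as in the snippets above and the screen convention that y grows downward (so dy is negated to get first-quadrant angles). Unlike atan(slope), Math.atan2 takes the two deltas separately, so it never produces NaN from a 0/0 division (the zero-length line case) and it preserves the quadrant:
// Sketch: angle of this line against the x-axis, in degrees
double dx = this.getEndX() - this.getxCoordinate();
double dy = this.getyCoordinate() - this.getEndY(); // flipped because screen y grows downward
double angle = Math.toDegrees(Math.atan2(dy, dx));  // in (-180, 180]
if (angle < 0) {
    angle += 360; // optional: map to [0, 360)
}
return MyLine.description + " " + angle;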

Related

How do I get mouse world position, X Y plane only, in Unity

How do I get the mouse world position, X Y plane only, in Unity? ScreenToWorldPoint isn't working. I think I need to cast a ray from the mouse, but I'm not sure.
This is what I am using. It doesn't seem to give the correct coordinates or the right plane; I need it for targeting and raycasting.
private void Get3dMousePoint()
{
    // Mouse position in screen (pixel) coordinates
    var screenPosition = Input.mousePosition;
    // z is the distance in front of the camera at which to place the point
    screenPosition.z = 1;
    worldPosition = mainCamera.ScreenToWorldPoint(screenPosition);
    // Flatten the result onto the XY plane
    worldPosition.z = 0;
}
Just need XY coords.
I tried ScreenToWorldPoint() and it works.
The key, I think, is in understanding the z coordinate of the position.
Geometrically, in 3D space we need three coordinates to define a point. With only two coordinates we have a straight line with a variable z parameter. To obtain a point from that line, we must choose at what distance (i.e. set z) the point we are looking for should be.
Obviously, since the camera is a perspective camera, the coordinates you get at z = 1 are different from those at z = 100, unlike in a 2D plane.
If you can figure out how far away the point is, that is, set z correctly, you can find the point you want.
Just remember that z must be greater than the camera's near clip plane distance. I set exactly that value in the script.
Also remember that the resulting vector will have its z equal to the camera's z position plus the z value of the vector passed to ScreenToWorldPoint.
void Get3dMousePoint()
{
    // Use the near clip plane distance as z so the point is always just in front of the camera
    Vector3 worldPosition = Camera.main.ScreenToWorldPoint(
        new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane));
    print(worldPosition);
}

Rotate an object in the LatLng coordinate system

Hey there, I am trying to rotate a line around its own center within the LatLng system.
I have the angle and the two points. So I tried to apply the rotation matrix, like this (the following method takes the latitude and longitude of a point and the angle):
import 'dart:math';

LatLng rotate(double lat, double long, double angle) {
  double rad = angle * pi / 180;
  double newLong = long * cos(rad) - lat * sin(rad);
  double newLat = long * sin(rad) + lat * cos(rad);
  return LatLng(newLat, newLong);
}
For example, I have point A (latitude: x, longitude: y) and point B (latitude: x, longitude: y). Connecting these two points gives a line. Now I want to rotate the line around its own center with the above method, by calling:
LatLng newA = rotate(A.latitude, A.longitude, angle);
LatLng newB = rotate(B.latitude, B.longitude, angle);
But when I connect the two points newA and newB, the result is not the desired effect.
As @Abion47 clarified in his answer, I need a rotation in three dimensions, but how do I do that? And is it possible in two dimensions if the line is very small?
So here's the rub. The problem I mentioned before is that a latitude-longitude pair is a pair of angles, not a 2D vector of a point on a graph, so trying to use them to rotate a point in 3D space on the surface of a sphere is going to run into its own problems. As it turns out, however, as long as you don't pick points that cross either the international date line or the poles, you can still use this trick by just pretending the angle pair is a 2D vector.
The real problem is that you want to rotate the points around the midpoint, but your math is merely performing a straight rotation, which rotates them around the origin (i.e. (0, 0)) instead. You need to offset your "points" by the point you are using as a reference.
import 'dart:math';

LatLng rotate(LatLng coord, LatLng midpoint, double angle) {
  // Make this constant so it doesn't have to be repeatedly recalculated
  const piDiv180 = pi / 180;

  // Convert the input angle to radians
  final r = angle * piDiv180;

  // Create local variables using appropriate nomenclature
  final x = coord.longitude;
  final y = coord.latitude;
  final mx = midpoint.longitude;
  final my = midpoint.latitude;

  // Offset input point by the midpoint so the midpoint becomes the origin
  final ox = x - mx;
  final oy = y - my;

  // Cache trig results because trig is expensive
  final cosr = cos(r);
  final sinr = sin(r);

  // Perform rotation
  final dx = ox * cosr - oy * sinr;
  final dy = ox * sinr + oy * cosr;

  // Undo the offset
  return LatLng(dy + my, dx + mx);
}
Using this approach, I ended up with the following results:
The blue points are the input, the green point is the calculated midpoint, and the red points are each of the blue points passed through a 90 degree rotation.
(Note that the distance between the blue points appears to be farther than the distance between the red points. This is because I visualized the results in Google Maps which uses the Mercator projection, and that had the result of screwing with where the points appear to be relative to each other. If you were to visualize this on a globe, the points should appear the correct distance from each other.)
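One detail the code above leaves implicit is where the midpoint comes from; for a short line, averaging the endpoints is a reasonable approximation. Below is a small self-contained sketch of the same offset-rotate-restore math, written in Java for illustration (the LatLng record here is a hypothetical stand-in, not the Dart class above):
// Hypothetical minimal coordinate type, for illustration only
record LatLng(double latitude, double longitude) {}

class RotateLineDemo {
    // Same math as the Dart rotate() above: offset by the midpoint,
    // rotate about the origin, then undo the offset
    static LatLng rotate(LatLng coord, LatLng midpoint, double angleDeg) {
        double r = Math.toRadians(angleDeg);
        double ox = coord.longitude() - midpoint.longitude();
        double oy = coord.latitude() - midpoint.latitude();
        double dx = ox * Math.cos(r) - oy * Math.sin(r);
        double dy = ox * Math.sin(r) + oy * Math.cos(r);
        return new LatLng(dy + midpoint.latitude(), dx + midpoint.longitude());
    }

    public static void main(String[] args) {
        LatLng a = new LatLng(52.00, 13.00);
        LatLng b = new LatLng(52.10, 13.20);
        // For a short line, the center is approximately the average of the endpoints
        LatLng mid = new LatLng((a.latitude() + b.latitude()) / 2,
                                (a.longitude() + b.longitude()) / 2);
        System.out.println(rotate(a, mid, 90));
        System.out.println(rotate(b, mid, 90));
    }
}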

Unity3D - Calculating t-value position on orbit

I'm using a script that I found online that uses a kdTree to calculate the nearest point to an object on the surface of a mesh.
I have the following code in the OnDrawGizmos method that allows me to draw a circle that will orbit the surface of the object.
x = target.transform.position.x + Mathf.Cos(tValue) * radius;
z = target.transform.position.z + Mathf.Sin(tValue) * radius;
Gizmos.color = Color.yellow;
Gizmos.DrawWireSphere(new Vector3(x, y, z), 0.06f);
On the object I am orbiting, the tValue ranges from 0 to 6.3 (roughly 2π) for a full orbit. My problem is that I am trying to calculate the tValue, in the range 0-6.3, of an object that is near the central object. I have used my kdTree system to calculate the Vector3 position on the surface of the object, and it lines up perfectly.
I calculate the radius used in both the above and below equations with:
Vector3 RadiusDirection = (Vector3.ProjectOnPlane(orbitingSurfaceMeshPos, planet.transform.up) - Vector3.ProjectOnPlane(planet.transform.position, planet.transform.up));
float radius = RadiusDirection.magnitude;
However, when I try to calculate the t-value, I get a completely different value. I figured I could just "reverse" the equation, so I've been doing:
float temp = orbiting.z - planet.transform.position.z;
temp = temp / radius;
calculatedTvalue = (Mathf.Asin(temp));
What could I be doing wrong? I have tested my "reversing equation" in an empty scene with a new script, and it worked fine if I just took the result of the orbit position calculation and directly reversed it. However, it doesn't work in my game.
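For what it's worth, a likely culprit is that asin only returns values in the range -π/2 to π/2, so it cannot distinguish positions on one half of the orbit from their mirrored counterparts. Inverting x = cx + cos(t)·r and z = cz + sin(t)·r needs both offsets, and atan2 recovers t over the full circle. A minimal sketch of that inversion (in Java for illustration, since the math is the same in any language; the names are illustrative):
// Recover the orbit parameter t in [0, 2*pi) from a point on the circle
// defined by x = cx + cos(t) * r, z = cz + sin(t) * r
static double recoverT(double x, double z, double cx, double cz) {
    double t = Math.atan2(z - cz, x - cx); // full-quadrant angle in (-pi, pi]
    if (t < 0) {
        t += 2 * Math.PI; // map to [0, 2*pi) to match the 0..6.3 range
    }
    return t;
}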

Calculating coordinates from reference points

I'm working on a game in Unity where you can walk around in a city that also exists in real life.
In the game you should be able to enter real-world coordinates, or use your phone's GPS, and you'll be transported to the in-game position of those coordinates.
For this, I'd need to somehow convert the game coordinates to latitude and longitude coordinates. I have the coordinates of some specific buildings, and I figured I might be able to write a script to determine the game coordinates from those reference points.
I've been searching on Google for a bit, and though I have probably come across the right solutions occasionally, I've been unable to understand them well enough to use them in my code.
If someone has experience with this, or knows how I could do it, I'd appreciate it if you could help me understand it :)
Edit: Forgot to mention that previous programmers have already placed the world at some position and rotation they felt like using, which unfortunately I can't simply change without breaking things.
Tim Falken
This is simple linear math. The main issue you'll come across is the fact that your game coordinate system will probably be reversed along one or more axes; you'll probably need to reverse the direction along the latitude (Y) axis of your app. Aside from that it is just a simple conversion of scales. Since you say this is the map of a real place, you should be able to easily figure out the min/max lon/lat which your map covers. Take the absolute value of the difference between these two values and divide it by the width/height of your map in each direction; this gives you the change in latitude (or longitude) per map unit. Store this value and it should be easy to convert both ways between the two units. Write functions that abstract away the details and you should have no problem calculating this either way.
I assume that you have been able to retrieve the GPS coordinates OK.
EDIT:
By simple linear math I mean something like this (this is C++-style pseudocode and completely untested; in a real-world example the constants would all be member variables instead):
const float MAP_WIDTH  = 1000;
const float MAP_HEIGHT = 1000;
const float MIN_LON = 25.333; // longitude at map x = 0
const float MIN_LAT = 20.333; // latitude at map y = 0
const float MAX_LON = 27.25;  // longitude at map x = MAP_WIDTH
const float MAX_LAT = 20.50;  // latitude at map y = MAP_HEIGHT

class CoordConversion {
    // Degrees per map unit along each axis
    float XScale = abs(MAX_LON - MIN_LON) / MAP_WIDTH;
    float YScale = abs(MAX_LAT - MIN_LAT) / MAP_HEIGHT;
    // +1 if the coordinate grows with the map axis, -1 if it is reversed
    int LonDir = MIN_LON < MAX_LON ? 1 : -1;
    int LatDir = MIN_LAT < MAX_LAT ? 1 : -1;

public:
    float GetXFromLon(float lon) { return LonDir * (lon - MIN_LON) / XScale; }
    float GetYFromLat(float lat) { return LatDir * (lat - MIN_LAT) / YScale; }
    float GetLonFromX(float x)   { return MIN_LON + LonDir * x * XScale; }
    float GetLatFromY(float y)   { return MIN_LAT + LatDir * y * YScale; }
};
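To make the scales concrete, here is a tiny worked round trip using the sample constants above (Java syntax, purely illustrative):
// Degrees per map unit, from the sample constants above
double minLon = 25.333, maxLon = 27.25, mapWidth = 1000;
double xScale = Math.abs(maxLon - minLon) / mapWidth;
double lon = 26.0;                  // some longitude on the map
double x = (lon - minLon) / xScale; // game x, roughly 347.9
double back = minLon + x * xScale;  // recovers 26.0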
EDIT2: In the case that the map is rotated, you'll want to use the minimum and maximum lon/lat actually shown on the map. You'll also need to rotate each point after the conversion. I'm not even going to attempt to get this right off the top of my head, but I can give you the code you'll need:
POINT rotate_point(float cx, float cy, float angle, POINT p)
{
    float s = sin(angle);
    float c = cos(angle);
    // translate point back to origin:
    p.x -= cx;
    p.y -= cy;
    // rotate point
    float xnew = p.x * c - p.y * s;
    float ynew = p.x * s + p.y * c;
    // translate point back:
    p.x = xnew + cx;
    p.y = ynew + cy;
    return p; // was missing: the function declares a POINT return type
}
This will need to be done when returning a game point, and it also needs to be done in reverse before using a game point to convert to a lat/lon point.
EDIT3: More help on getting the coordinates of your map. First find the city or whatever it is on Google Maps. Then you can right-click the northernmost point on your map to read off the highest latitude. Repeat this for all four cardinal directions and you should be set.

How can I correctly calculate the direction for a moving object?

I'm solving the following problem: I have an object and I know its position now and its position 300 ms ago. I assume the object is moving. I have a point which I want the object to reach.
What I need is to get the angle from my current object to the destination point in such a format that I know whether to turn left or right.
The idea is to estimate the current heading from the last known position and the current position.
I'm trying to solve this in MATLAB. I've tried several variations with atan2, but either I get the wrong angle in some situations (like when my object is going in circles) or I get the wrong angle in all situations.
An example of code that gets it wrong:
a = new - old;
b = dest - new;
alpha = atan2(a(2) - b(2), a(1) - b(1));
where new is the current position (e.g. x = 40; y = 60; new = [x y];), old is the position from 300 ms ago, and dest is the destination point.
Edit
Here's a picture to demonstrate the problem with a few examples:
In the above image there are a few points plotted and annotated. The black line indicates our estimated current facing of the object.
If the destination point is dest1 I would expect an angle of about 88°.
If the destination point is dest2 I would expect an angle of about 110°.
If the destination point is dest3 I would expect an angle of about -80°.
Firstly, you need to note the scale on the sample graph you show above. The x-axis ticks move in steps of 1, while the y-axis ticks move in steps of 20. With the two axes scaled equally (as with the command axis equal), the picture would be a lot narrower than what you show, so the angles you expect to get are not right. The expected angles will be close to right angles, just a few degrees off from 90 degrees.
The equation Nathan derives is valid for column vector inputs a and b:
theta = acos(a'*b/(sqrt(a'*a) * sqrt(b'*b)));
If you want to change this equation to work with row vectors, you would have to switch the transpose operator in both the calculation of the dot product as well as the norms, like so:
theta = acos(a*b'/(sqrt(a*a') * sqrt(b*b')));
As an alternative, you could just use the functions DOT and NORM:
theta = acos(dot(a,b)/(norm(a)*norm(b)));
Finally, you have to account for the direction, i.e. whether the angle should be positive (turn clockwise) or negative (turn counter-clockwise). You can do this by computing the sign of the z component for the cross product of b and a. If it's positive, the angle should be positive. If it's negative, the angle should be negative. Using the function SIGN, our new equation becomes:
theta = sign(b(1)*a(2)-b(2)*a(1)) * acos(dot(a,b)/(norm(a)*norm(b)));
For your examples, the above equation gives an angle of 88.85, 92.15, and -88.57 for your three points dest1, dest2, and dest3.
NOTE: One special case you will need to be aware of is if your object is moving directly away from the destination point, i.e. if the angle between a and b is 180 degrees. In such a case you will have to pick an arbitrary turn direction (left or right) and a number of degrees to turn (180 would be ideal ;) ). Here's one way you could account for this condition using the function EPS:
theta = acos(dot(a,b)/(norm(a)*norm(b))); %# Compute theta
if abs(theta-pi) < eps %# Check if theta is within some tolerance of pi
    %# Pick your own turn direction and amount here
else
    theta = sign(b(1)*a(2)-b(2)*a(1))*theta; %# Find turn direction
end
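As an aside (not part of the original answer): the acos-plus-sign construction can also be collapsed into a single atan2 call, which sidesteps the 180-degree tolerance check entirely, since atan2 is well defined there. A sketch of that variant, in Java for illustration:
// Signed angle (radians) from vector a to vector b, in (-pi, pi],
// using the same sign convention as sign(b(1)*a(2)-b(2)*a(1)) above
static double signedAngle(double ax, double ay, double bx, double by) {
    double cross = bx * ay - by * ax; // z component of the cross product of b and a
    double dot   = ax * bx + ay * by; // dot product of a and b
    return Math.atan2(cross, dot);    // well defined even when a and b point in opposite directions
}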
You can try using the dot-product of the vectors.
Define the vectors 'a' and 'b' as:
a = new - old;
b = dest - new;
and use the fact that the dot product is:
a dot b = norm2(a) * norm2(b) * cos(theta)
where theta is the angle between two vectors, and you get:
cos(theta) = (a dot b)/ (norm2(a) * norm2(b))
The best way to calculate a dot b, assuming they are column vectors, is like this:
a_dot_b = a'*b;
and:
norm2(a) = sqrt(a'*a);
so you get:
cos(theta) = a'*b/(sqrt((a'*a)) * sqrt((b'*b)))
Note that the cosine only gives you the size of the angle; its sign tells you whether the angle is more or less than 90 degrees, not the turn direction. To decide between left and right you still need a sign, for example from the cross product as shown in the answer above.
Essentially you have a line defined by the points old and new, and you wish to determine whether dest is to the right or the left of that line? In that case, have a look at this previous question.