Longitude and Latitude to location on sphere in Unity

Hi all,
I'm trying to transform locations given as longitude and latitude into a Vector3 position, which will be placed on a sphere in Unity. However, the resulting location is consistently off compared to the actual location.
I use the following code at the moment:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class testPosLatLong : MonoBehaviour {

    public float longi;
    public float lati;
    public float radius;
    public Transform marker;

    // Use this for initialization
    void Start () {
        // Transfer to Radians from Degrees
        float templongi = longi * Mathf.PI / 180;
        float templati = lati * Mathf.PI / 180;

        float Xpos = radius * Mathf.Cos(templati) * Mathf.Cos(templongi);
        float Ypos = radius * Mathf.Cos(templati) * Mathf.Sin(templongi);
        float Zpos = radius * Mathf.Sin(templati);
        Debug.Log ("X, Y, Z" + Xpos + " " + Ypos + " " + Zpos);

        // Set the X,Y,Z pos from the long and lat (Y and Z swapped for Unity's Y-up axis)
        Instantiate(marker);
        marker.position = new Vector3 (Xpos, Zpos, Ypos);
    }

    // Update is called once per frame
    void Update () {
    }
}
I've tried to set the longitude and latitude to zero, which looks like it got the right position on the sphere:
But when I try the longitude and latitude of Amsterdam for example it gives the following results (left side), while it should be the result on the right side.
Am I missing something, or what is going wrong here? I tried googling a lot but couldn't find anything that might explain the problem. The project itself can be found here: https://wetransfer.com/downloads/31303bd353fd9fde874e92338e68573120171205170107/1208ac
Hope somebody can help.

I think your globe is what's wrong. It's tough to see from your images, but to me the equator looks like it's in slightly the wrong spot, and the North Pole looks too crowded by Greenland. I suspect it has to do with the projection you're using to paste the globe image onto the sphere.
Projecting 2D maps/images onto 3D surfaces is typically very complex; there's a whole field of geospatial analysis on how to do this accurately. It's a tough problem.
Also, the actual Earth isn't exactly a sphere, so this could introduce errors as well, but I would guess these would be much smaller than what you describe. Accurate maps represent the 3D Earth as a "geoid". To get locally accurate maps, people typically project data differently in small areas to maximize accuracy at a local scale; otherwise you see large distortions. So as you zoom into a global map, the projections actually change significantly.
One possible way to test where the issue is coming from is to set up control points of known coordinates, both on your map image and on the globe, and then check whether they line up. For example, put a spike sticking out of the globe where Amsterdam SHOULD be, and another where it actually appears on the image. If the two don't line up, you can be pretty sure where your problem lies. This is called "georeferencing".
I think for 3D projections you'd likely need at least 4 such "control points". You might take a look at the geospatial community on stackexchange for more detailed info on projecting maps and swapping between coordinate systems.
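A quick way to run that georeferencing test in Unity is a small script that drops a marker where a known city should sit, using the question's own conversion. This is just a sketch; the Amsterdam coordinates and the markerPrefab field are assumptions for illustration:

using UnityEngine;

// Hypothetical control-point test: place a marker at a known city's
// coordinates with the same lat/long-to-Cartesian math as the question,
// then compare it with where the city appears on the globe texture.
public class ControlPointTest : MonoBehaviour {

    public float radius = 1f;
    public Transform markerPrefab;

    void Start () {
        // Amsterdam, roughly 52.37 N, 4.90 E (assumed control point).
        PlaceControlPoint(52.37f, 4.90f);
    }

    void PlaceControlPoint (float latDeg, float lonDeg) {
        float lat = latDeg * Mathf.Deg2Rad;
        float lon = lonDeg * Mathf.Deg2Rad;

        // Same math as the question, with Unity's Y axis used as "up".
        Vector3 offset = new Vector3(
            radius * Mathf.Cos(lat) * Mathf.Cos(lon),
            radius * Mathf.Sin(lat),
            radius * Mathf.Cos(lat) * Mathf.Sin(lon));

        // Position the instantiated copy, not the prefab reference.
        Transform marker = Instantiate(markerPrefab);
        marker.position = transform.position + offset;
    }
}

If the marker lands where the math says Amsterdam should be but the texture shows it elsewhere, the conversion is fine and the texture projection is the culprit.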

Related

Reconstruct near plane ray intersection with camera

I am trying to reconstruct the point where the ray of the camera rendering the current pixel intersects the near plane.
I need the coordinates of the intersection point in the local coordinates of the object being rendered.
This is my current implementation:
float4 nearClipLS = mul(inv_modelViewProjectionMatrix, float4(i.vertex.x / i.vertex.w, i.vertex.y / i.vertex.w, -1.0, 1.0));
nearClipLS /= nearClipLS.w;
There's got to be a more efficient way to do it, but the following should, in theory, work.
Find the offset vector from the camera to the pixel:
float3 cam2pos = v.worldPos - _WorldSpaceCameraPos;
Get the camera's forward vector:
float3 camFwd = UNITY_MATRIX_IT_MV[2].xyz;
Get the dot product of the two to determine how far the point projects in the direction of the camera's forward axis:
float projDist = dot(cam2pos, camFwd);
Then, you should be able to use that data to re-project the point onto the near clip plane:
float nearClipZ = _ProjectionParams.y;
float3 nearPos = _WorldSpaceCameraPos + (cam2pos * (nearClipZ / projDist));
This solution doesn't address edge cases (such as a point level with or behind the camera, which would cause problems), so you may want to handle those once you get it working.
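If it helps to sanity-check the math, here is a rough CPU-side C# equivalent of the same steps (a sketch only; the shader built-ins are replaced with Camera properties, and worldPos is whatever world-space point you want to project):

using UnityEngine;

// Sketch: project a world-space point onto the camera's near clip plane,
// mirroring the shader math above (cam2pos, camFwd, projDist, nearPos).
public static class NearPlaneProjection {

    public static Vector3 ProjectOntoNearPlane (Camera cam, Vector3 worldPos) {
        // Offset vector from the camera to the point.
        Vector3 cam2pos = worldPos - cam.transform.position;

        // Distance the point projects along the camera's forward axis.
        float projDist = Vector3.Dot(cam2pos, cam.transform.forward);

        // Scale the offset so it ends exactly on the near plane.
        // Note: projDist near zero (point level with the camera) will blow up.
        float nearClipZ = cam.nearClipPlane;
        return cam.transform.position + cam2pos * (nearClipZ / projDist);
    }
}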

Car Collision Return Force - 3D Car Game

As per my game requirements, I am applying a manual force when two cars collide with each other, so that they move back.
I want the correct code to achieve this. Here is an example of the collision response I want to get:
As per my understanding, I have written this code:
Vector3 reboundDirection = Vector3.Normalize(transform.position - other.transform.position);
reboundDirection.y = 0f;
int i = 0;
while (i < 3)
{
    myRigidbody.AddForce(reboundDirection * 100f, ForceMode.Force);
    appliedSpeed = speed * 0.5f;
    yield return new WaitForFixedUpdate();
    i++;
}
I am moving my cars using this code:
//Move the player forward
appliedSpeed += Time.deltaTime * 7f;
appliedSpeed = Mathf.Min(appliedSpeed, speed);
myRigidbody.velocity = transform.forward * appliedSpeed;
Still, as per my observation, I am not getting the collision response in the proper direction. What is the correct way to get the collision response shown in the reference image above?
Until you clarify why you have to use manual forces, or how you handle the forces generated by the Unity engine, I would like to stress one problem in your approach. You calculate the direction based on positions, but those positions are the centers of your cars. Therefore you are not getting the correct direction, as you can see from the image below:
You calculate the direction between the two pivot (center) points, which is why your force is slightly tilted in the left image. Instead, you can use ContactPoint and then calculate the direction.
To explain in more detail: in the above image you can see the region marked with a blue rectangle. You can get all the contact points for that region using Collision.contacts,
and then calculate the center point, or centroid, like this:
Vector3 centroid = new Vector3(0, 0, 0);
foreach (ContactPoint contact in col.contacts)
{
    centroid += contact.point;
}
centroid = centroid / col.contacts.Length;
This is the center of the rectangle. To find the direction, you then need its projection onto your car, like this:
Vector3 projection = gameObject.transform.position;
projection.x = centroid.x;
gameObject.GetComponent<Rigidbody>().AddForce((projection - centroid) * 100, ForceMode.Impulse);
Since I do not know your setup, I simply took the y and z values from the car's position and the x value from the centroid; therefore you get the straight blue line, not an arrow tilted to the left as in the first image, even in the case of the second image. I hope I am being clear.
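Putting the two pieces together, a minimal sketch of the whole idea might look like this (the reboundForce value and the x-axis assumption are mine, matching the set-up described above):

using UnityEngine;

// Sketch: push the car straight back from the contact region instead of
// away from the other car's center point.
public class CollisionRebound : MonoBehaviour {

    public float reboundForce = 100f;

    void OnCollisionEnter (Collision col) {
        // Average all contact points to get the centroid of the contact region.
        Vector3 centroid = Vector3.zero;
        foreach (ContactPoint contact in col.contacts)
        {
            centroid += contact.point;
        }
        centroid /= col.contacts.Length;

        // Project the centroid onto the car's position along the x axis
        // (assumed set-up, as in the explanation above).
        Vector3 projection = transform.position;
        projection.x = centroid.x;

        GetComponent<Rigidbody>().AddForce(
            (projection - centroid) * reboundForce, ForceMode.Impulse);
    }
}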

Unity - Sin & Cos circular motion with rotation

I am trying to calculate circular motion (an orbit) around an object. The code I have gives me a nice circular orbit around the object. The problem is that when I rotate the object, the orbit behaves as though the object were not rotated.
I've put a really simple diagram below to try to explain it better. The left is what I get when the cylinder is upright, the middle is what I currently get when the object is rotated, and the image on the right is what I would like to happen.
float Gx = target.transform.position.x - Mathf.Cos(currentTvalue) * radius;
float Gz = target.transform.position.z - Mathf.Sin(currentTvalue) * radius;
float Gy = target.transform.position.y;
Gizmos.color = Color.green;
Gizmos.DrawWireSphere(new Vector3(Gx, Gy, Gz), 0.03f);
How can I get the orbit to change with the object's rotation? I have tried multiplying the orbit position "new Vector3(Gx, Gy, Gz)" by the rotation of the object:
Gizmos.DrawWireSphere(target.transform.rotation*new Vector3(Gx, Gy, Gz), 0.03f);
but that didn't seem to do anything?
That is happening because you are calculating the vector (Gx, Gy, Gz) in world space coordinates, where the target object's rotation is not taken into consideration.
One way to solve this is to calculate the point in the target object's local space coordinates, and then convert it to world space coordinates. This correctly makes your calculations consider the rotation of the target object.
float Gx = target.transform.localPosition.x - Mathf.Cos(currentTvalue) * radius;
float Gz = target.transform.localPosition.z - Mathf.Sin(currentTvalue) * radius;
float Gy = target.transform.localPosition.y;
Vector3 worldSpacePoint = target.transform.TransformPoint(Gx, Gy, Gz);
Gizmos.color = Color.green;
Gizmos.DrawWireSphere(worldSpacePoint, 0.03f);
Notice that instead of target.transform.position, which retrieves the world space coordinates of the given transform, I am doing the calculations using the target.transform.localPosition, which retrieves the local space coordinates of the given transform.
Also, I am calling the TransformPoint() method, which converts the coordinates which I have calculated in local space to its corresponding values in world space.
Then you might safely call the Gizmos.DrawWireSphere() method, which requires world space coordinates to work correctly.
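For reference, a compact variant of the same idea is to build the orbit offset directly in the target's local space and let TransformPoint() do the conversion. A sketch, assuming the orbit lies in the target's local XZ plane:

using UnityEngine;

// Sketch: an orbit point that follows the target's rotation, because the
// offset is defined in the target's local space and converted to world
// space with TransformPoint().
public class OrbitGizmo : MonoBehaviour {

    public Transform target;
    public float radius = 1f;
    public float currentTvalue;

    void OnDrawGizmos () {
        if (target == null) return;

        // Offset in the target's local XZ plane (assumed orbit plane).
        Vector3 localOffset = new Vector3(
            -Mathf.Cos(currentTvalue) * radius,
            0f,
            -Mathf.Sin(currentTvalue) * radius);

        Gizmos.color = Color.green;
        Gizmos.DrawWireSphere(target.TransformPoint(localOffset), 0.03f);
    }
}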

Calculating coordinates from reference points

I'm working on a game in Unity where you can walk around in a city that also exists in real life.
In the game you should be able to enter real-world coordinates, or use your phone's GPS, and you'll be transported to the in-game position of those coordinates.
For this, I'd need to somehow convert the game coordinates to latitude and longitude coordinates. I have some coordinates of specific buildings, and I figured I might be able to write a script to determine the game coordinates from those reference points.
I've been searching for a bit on Google, and though I have probably come across the right solutions occasionally, I've been unable to understand them well enough to use them in my code.
If someone has experience with this, or knows how I could do it, I'd appreciate it if you could help me understand it :)
Edit: Forgot to mention that previous programmers have already placed the world at some position and rotation they felt like using, which unfortunately I can't simply change without breaking things.
Tim Falken
This is simple linear math. The main issue you'll come across is that your game coordinate system will probably be reversed along one or more axes; you'll likely need to reverse the direction along the latitude (Y) axis of your app. Aside from that, it is just a simple conversion of scales. Since you say this is the map of a real place, you should be able to easily figure out the min/max lon/lat which your map covers. Take the absolute value of the difference between these two values and divide it by the width/height of your map in each direction. This gives you the change in latitude (or longitude) per map unit. Store this value and it should be easy to convert both ways between the two units. Make functions that abstract the details and you should have no problems calculating this either way.
I assume that you have been able to retrieve the GPS coordinates OK.
EDIT:
By simple linear math I mean something like this (this is C++-style pseudo code and completely untested; in a real-world example the constants would all be member variables instead):
define('MAP_WIDTH', 1000);
define('MAP_HEIGHT', 1000);
define('MIN_LON', 25.333);
define('MIN_LAT', 20.333);
define('MAX_LON', 27.25);
define('MAX_LAT', 20.50);

class CoordConversion {
    // Degrees of longitude/latitude covered by one map unit.
    float XScale = abs(MAX_LON - MIN_LON) / MAP_WIDTH;
    float YScale = abs(MAX_LAT - MIN_LAT) / MAP_HEIGHT;
    // +1 if the map axis runs the same way as the lon/lat axis, -1 if reversed.
    int LonDir = MIN_LON < MAX_LON ? 1 : -1;
    int LatDir = MIN_LAT < MAX_LAT ? 1 : -1;

    float GetXFromLon(float lon) {
        return (LonDir > 0 ? (lon - MIN_LON) : (MAX_LON - lon)) / XScale;
    }
    float GetYFromLat(float lat) {
        return (LatDir > 0 ? (lat - MIN_LAT) : (MAX_LAT - lat)) / YScale;
    }
    float GetLonFromX(float x) {
        return LonDir > 0 ? MIN_LON + x * XScale : MAX_LON - x * XScale;
    }
    float GetLatFromY(float y) {
        return LatDir > 0 ? MIN_LAT + y * YScale : MAX_LAT - y * YScale;
    }
}
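Since the game itself is in Unity, a C# translation of the same mapping might look like this (a sketch; the bounds are the placeholder values from the pseudo code above, and it assumes both map axes run the same way as lon/lat; flip the offsets as in the pseudo code if one is reversed):

using UnityEngine;

// Sketch: two-way linear conversion between lon/lat degrees and map units,
// mirroring the pseudo code above. All constants are placeholders.
public static class MapCoords {

    const float MapWidth = 1000f, MapHeight = 1000f;
    const float MinLon = 25.333f, MaxLon = 27.25f;
    const float MinLat = 20.333f, MaxLat = 20.50f;

    // Degrees covered by one map unit along each axis.
    static readonly float XScale = Mathf.Abs(MaxLon - MinLon) / MapWidth;
    static readonly float YScale = Mathf.Abs(MaxLat - MinLat) / MapHeight;

    public static float XFromLon (float lon) { return (lon - MinLon) / XScale; }
    public static float YFromLat (float lat) { return (lat - MinLat) / YScale; }
    public static float LonFromX (float x) { return MinLon + x * XScale; }
    public static float LatFromY (float y) { return MinLat + y * YScale; }
}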
EDIT2: If the map is rotated, you'll want to use the minimum and maximum lon/lat actually shown on the map. You'll also need to rotate each point after the conversion. I'm not even going to attempt to get this right off the top of my head, but I can give you the code you'll need:
POINT rotate_point(float cx, float cy, float angle, POINT p)
{
    float s = sin(angle);
    float c = cos(angle);
    // translate point back to origin:
    p.x -= cx;
    p.y -= cy;
    // rotate point
    float xnew = p.x * c - p.y * s;
    float ynew = p.x * s + p.y * c;
    // translate point back:
    p.x = xnew + cx;
    p.y = ynew + cy;
    return p;
}
This will need to be done when returning a game point, and it also needs to be done in reverse before using a game point to convert it to a lat/lon point.
EDIT3: More help on getting the coordinates of your map. First find the city (or whatever it is) on Google Maps. Then you can right-click the highest point (furthest north) on your map to find the highest latitude. Repeat this for all four cardinal directions and you should be set.

iOS - Calculating distance, azimuth, elevation and relative position (Augmented Reality)

I am starting to build an augmented reality app where you can place an image in your augmented reality camera view, and it stays in that position on the Earth, so that someone else can come by and see it in their own camera view. For this I know I need to calculate some sort of distance factor, along with azimuth and elevation.
So, I have already figured out how to send the object's graphics up to a server and retrieve it back, but how can I place it back on its original position, relative to Earth. I know I need to calculate its:
Altitude
Coordinates
Azimuth
Elevation
Distance
But how would I calculate these and account for them / piece them together? I hope you understand what I mean.
To refine your understanding, let me give you a short demo of the app:
A man is in his house and decides to place an image of a painting on one of his walls. He opens up the app, which defaults to the augmented reality screen, presses the plus button, and adds an image from his photo library. Behind the scenes, the app saves the location and positional data to a server. Someone else with the app comes by, its augmented reality screen open; the app queries the server for images saved nearby, downloads the image, and places it on the wall so the other man can see it with his phone as he moves past.
What approach should I take to achieve this? Any outline, links, resources, tutorials, thoughts, or experience are welcome. Thanks! This was a generally hard question to write down; I hope you can understand it. If not, please tell me and I will reword it.
Rohan
I'm working on two AR iOS apps which do the following: convert azimuth (compass, horizontal angle) and elevation (gyroscope, vertical angle) to a position in 3D space (e.g. spherical to cartesian).
The frameworks you need are:
CoreLocation
CoreMotion
Getting the geolocation (coordinates) is pretty straightforward for latitude, longitude, and altitude. You can easily find this information in several online sources, but this is the main call you need from the CLLocationManagerDelegate after you call startUpdatingLocation:
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
    latitude = (float) manager.location.coordinate.latitude;
    longitude = (float) manager.location.coordinate.longitude;
    altitude = (float) manager.location.altitude;
}
Getting the azimuth angle is also pretty straightforward, using the same delegate as the location after calling startUpdatingHeading:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    azimuth = (float) manager.heading.magneticHeading;
}
Elevation is extracted from the gyroscope, which doesn't have a delegate but is also easy to set up. The call looks something like this (note: this works for my app running in landscape mode, check yours):
elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);
Finally, you can convert your orientation coordinates into a 3D point like so:
- (GLKVector3)sphericalToCartesian:(float)radius azimuth:(float)theta elevation:(float)phi
{
    // Convert coordinates: spherical to Cartesian
    // Spherical: radial distance (r), azimuth (θ), elevation (φ)
    // Cartesian: x, y, z
    float x = radius * sinf(phi) * sinf(theta);
    float y = radius * cosf(phi);
    float z = radius * sinf(phi) * cosf(theta);
    return GLKVector3Make(x, y, z);
}
For this last part be very wary of angle and axis naming conventions as they vary wildly from source to source. In my system, θ is the angle on the horizontal plane, φ is the angle on the vertical plane, x is left-right, y is down-up, and z is back-front.
As for distance, I'm not sure you really need to use it but if you do then just substitute it for "radius".
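For readers following along in Unity, the same conversion in C# would look something like this (a sketch using the answer's axis conventions: θ on the horizontal plane, φ on the vertical plane, y pointing up):

using UnityEngine;

// Sketch: spherical (radius, azimuth θ, elevation φ) to Cartesian (x, y, z),
// matching the axis conventions of the Objective-C version above.
public static class Spherical {

    public static Vector3 ToCartesian (float radius, float theta, float phi) {
        float x = radius * Mathf.Sin(phi) * Mathf.Sin(theta);
        float y = radius * Mathf.Cos(phi);
        float z = radius * Mathf.Sin(phi) * Mathf.Cos(theta);
        return new Vector3(x, y, z);
    }
}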
Hope that helps
Swift 3
Gyroscope code update:
import CoreMotion
...
motionManager.deviceMotionUpdateInterval = 0.1
motionManager.startDeviceMotionUpdates(to: OperationQueue.current!) { deviceManager, error in
    guard let dm = deviceManager else { return }
    let roll = dm.attitude.roll
    let pitch = dm.attitude.pitch
    let yaw = dm.attitude.yaw
    print("r: \(roll), p: \(pitch), y: \(yaw)")
}