I'm working on a game in Unity where you can walk around in a city that also exists in real life.
In the game you should be able to enter real-world coordinates, or use your phone's GPS, and you'll be transported to the in-game position of those coordinates.
For this, I'd need to somehow convert the game coordinates to latitude and longitude coordinates. I have some coordinates for specific buildings, and I figured I might be able to write a script to determine the game coordinates from those reference points.
I've been searching for a bit on Google, and though I have probably come across the right solutions occasionally, I've been unable to understand them well enough to use them in my code.
If someone has experience with this, or knows how I could do this, I'd appreciate it if you could help me understand it :)
Edit: Forgot to mention that previous programmers have already placed the world at some position and rotation they felt like using, which unfortunately I can't simply change without breaking things.
Tim Falken
This is simple linear math. The main issue you'll come across is that your game coordinate system will probably be reversed along one or more axes; you'll probably need to reverse the direction along the latitude (Y) axis of your app. Aside from that it is just a simple conversion of scales. Since you say this is the map of a real place, you should be able to easily figure out the min\max lon\lat which your map covers. Take the absolute value of the difference between these two values and divide it by the width\height of your map in each direction; this gives the change in longitude\latitude per map unit. Store these values and it should be easy to convert both ways between the two units. Make functions that abstract the details and you should have no problem calculating this either way.
I assume that you have been able to retrieve the GPS coordinates OK.
EDIT:
By simple linear math I mean something like this (this is C++-style pseudo code and completely untested; in a real-world example the constants would all be member variables instead):
define('MAP_WIDTH', 1000);
define('MAP_HEIGHT', 1000);
define('MIN_LON', 25.333);
define('MIN_LAT', 20.333);
define('MAX_LON', 27.25);
define('MAX_LAT', 20.50);

class CoordConversion {
    // Degrees of lon/lat covered by one map unit.
    float XScale = abs(MAX_LON - MIN_LON) / MAP_WIDTH;
    float YScale = abs(MAX_LAT - MIN_LAT) / MAP_HEIGHT;
    // Which way each map axis runs relative to lon/lat.
    int LonDir = MIN_LON < MAX_LON ? 1 : -1;
    int LatDir = MIN_LAT < MAX_LAT ? 1 : -1;

    public float GetXFromLon(float lon) {
        return (this.LonDir > 0 ? (lon - MIN_LON) : (lon - MAX_LON)) / this.XScale;
    }

    public float GetYFromLat(float lat) {
        return (this.LatDir > 0 ? (lat - MIN_LAT) : (lat - MAX_LAT)) / this.YScale;
    }

    public float GetLonFromX(float x) {
        return (this.LonDir > 0 ? MIN_LON : MAX_LON) + (x * this.XScale);
    }

    public float GetLatFromY(float y) {
        return (this.LatDir > 0 ? MIN_LAT : MAX_LAT) + (y * this.YScale);
    }
}
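Since the question is about Unity, here is a rough C# sketch of the same linear mapping. The class name, the reference values, and the axis handling are placeholders of mine, not taken from any existing library; plug in the real min/max lon/lat and map size for your city.

using UnityEngine;

// Rough sketch: maps between lon/lat and in-game X/Z for an axis-aligned, unrotated map.
// All reference values below are placeholders.
public class GeoMapper
{
    const float MinLon = 25.333f, MaxLon = 27.25f;   // lon at x = 0 and x = MapWidth
    const float MinLat = 20.333f, MaxLat = 20.50f;   // lat at y = 0 and y = MapHeight
    const float MapWidth = 1000f, MapHeight = 1000f; // map size in game units

    // Degrees covered by one game unit along each axis.
    static readonly float LonScale = Mathf.Abs(MaxLon - MinLon) / MapWidth;
    static readonly float LatScale = Mathf.Abs(MaxLat - MinLat) / MapHeight;

    // If one of your game axes runs the opposite way, negate that component afterwards.
    public static Vector2 LonLatToGame(float lon, float lat)
    {
        return new Vector2((lon - MinLon) / LonScale, (lat - MinLat) / LatScale);
    }

    public static Vector2 GameToLonLat(Vector2 gamePos)
    {
        return new Vector2(MinLon + gamePos.x * LonScale, MinLat + gamePos.y * LatScale);
    }
}

If your world is also offset and rotated, as mentioned in the question's edit, apply that offset and rotation after this conversion (and undo it before converting back), for example with the rotation code from EDIT2 below.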
EDIT2: In the case that the map is rotated you'll want to use the minimum and maximum lon\lat actually shown on the map. You'll also need to rotate each point after the conversion. I'm not even going to attempt to get this right off the top of my head, but I can give you the code you'll need:
POINT rotate_point(float cx, float cy, float angle, POINT p)
{
    float s = sin(angle);
    float c = cos(angle);

    // translate point back to origin:
    p.x -= cx;
    p.y -= cy;

    // rotate point
    float xnew = p.x * c - p.y * s;
    float ynew = p.x * s + p.y * c;

    // translate point back:
    p.x = xnew + cx;
    p.y = ynew + cy;

    return p;
}
This will need to be done when returning a game point, and it also needs to be done in reverse before using a game point to convert back to a lat\lon point.
EDIT3: More help on getting the coordinates of your map. First find the city (or whatever it is) on Google Maps. Then you can right-click the highest point (furthest north) shown on your map and read off the maximum latitude. Repeat this for all four cardinal directions and you should be set.
How do I get the mouse world position on the XY plane only, in Unity? ScreenToWorldPoint isn't working. I think I need to cast a ray to the mouse but I'm not sure.
This is what I am using. It doesn't seem to give the correct coordinates or the right plane. I need it for targeting and raycasting.
private void Get3dMousePoint()
{
    var screenPosition = Input.mousePosition;
    screenPosition.z = 1;
    worldPosition = mainCamera.ScreenToWorldPoint(screenPosition);
    worldPosition.z = 0;
}
Just need XY coords.
I tried with ScreenToWorldPoint() and it works.
The key, I think, is understanding the z coordinate of the position.
Geometrically, in 3D space we need 3 coordinates to define a point. With only 2 coordinates we have a straight line with a variable z parameter. To obtain a point from that line, we must choose at what distance (i.e. what z) we want the point to be.
Obviously, since the camera is perspective, the coordinates you get at z = 1 are different from those at z = 100, unlike in the 2D plane.
If you can figure out how far away the point should be, that is, set the z correctly, you can find the point you want.
Just remember that the z must not be less than the near clipping distance of the camera; I set exactly that value in the script.
Also remember that the resulting vector will have a z equal to the z position of the camera plus the z value of the vector passed to ScreenToWorldPoint.
void Get3dMousePoint()
{
    Vector3 worldPosition = Camera.main.ScreenToWorldPoint(
        new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane));
    print(worldPosition);
}
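If what you ultimately need is the point on the XY plane (z = 0), another option is to intersect the camera ray through the mouse with that plane, which is the "cast a ray to the mouse" idea from the question. This is just a sketch, not code from the answer above:

using UnityEngine;

public class MouseOnXYPlane : MonoBehaviour
{
    // Projects the mouse position onto the z = 0 plane by intersecting the
    // camera ray with that plane. Assumes the camera can actually see the plane.
    Vector3 GetMousePointOnXYPlane()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        Plane xyPlane = new Plane(Vector3.forward, Vector3.zero); // the z = 0 plane

        if (xyPlane.Raycast(ray, out float distance))
            return ray.GetPoint(distance);

        return Vector3.zero; // ray is parallel to the plane
    }

    void Update()
    {
        Debug.Log(GetMousePointOnXYPlane());
    }
}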
If you think my answer helped you, you can mark it as accepted and vote positively. I would very much appreciate it :)
Hi all,
I'm trying to transform locations based on longitude and latitude into a Vector3 position, which will be placed on a sphere in Unity. However, the location seems to be constantly off compared to the actual location.
I use the following code at the moment:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class testPosLatLong : MonoBehaviour {

    public float longi;
    public float lati;
    public float radius;
    public Transform marker;

    // Use this for initialization
    void Start () {
        // Transfer to Radians from Degrees
        float templongi = longi * Mathf.PI / 180;
        float templati = lati * Mathf.PI / 180;

        float Xpos = radius * Mathf.Cos(templati) * Mathf.Cos(templongi);
        float Ypos = radius * Mathf.Cos(templati) * Mathf.Sin(templongi);
        float Zpos = radius * Mathf.Sin(templati);

        Debug.Log ("X, Y, Z" + Xpos + " " + Ypos + " " + Zpos);

        // Set the X,Y,Z pos from the long and lat
        Instantiate(marker);
        marker.position = new Vector3 (Xpos, Zpos, Ypos);
    }

    // Update is called once per frame
    void Update () {
    }
}
I've tried to set the longitude and latitude to zero, which looks like it got the right position on the sphere:
But when I try the longitude and latitude of Amsterdam for example it gives the following results (left side), while it should be the result on the right side.
Am I missing something or what is going wrong here? I tried googling a lot, but couldn't find anything that might explain my current problem. The project itself can be found here: https://wetransfer.com/downloads/31303bd353fd9fde874e92338e68573120171205170107/1208ac%3E
Hope somebody can help.
I think your globe is what's wrong. It's tough to see from your images, but to me the equator looks like it's in slightly the wrong spot, and the North Pole looks to be too crowded with Greenland. I suspect it has to do with the projection you're using to paste the globe image onto the sphere.
Typically there are very complex ways of projecting 2D maps/images onto 3D surfaces. There's a whole field of geospatial analysis on how to do this accurately. It's a tough problem.
Also, the actual earth isn't exactly a sphere, so this could give you errors as well, but I would guess these would result in much smaller errors than what you describe. Accurate maps represent the 3D earth as a "geoid". To get locally accurate maps, people typically project data differently in small areas to maximize accuracy at a local scale; otherwise you see large distortions. So as you zoom into a global map, the projections actually change significantly.
One possible way to test where the issue is coming from could be to set up control points of known coordinates both on your map image and on the globe, and then test to see if they line up. For example, put a spike sticking out of the world where Amsterdam SHOULD be on the globe, and where it is on the actual image. If these two don't line up, then you can be pretty sure where your problem lies. This is called "georeferencing".
I think for 3D projections you'd likely need at least 4 such "control points". You might take a look at the geospatial community on Stack Exchange for more detailed info on projecting maps and swapping between coordinate systems.
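As a rough sketch of that control-point test in Unity (the script and its names are mine, purely for illustration), assuming a sphere centered on this transform and the same Y-up lat/lon mapping the question's script uses:

using UnityEngine;

public class ControlPointTest : MonoBehaviour
{
    public float radius = 1f;

    // Place small markers at known coordinates and compare them with where those
    // cities appear on the globe texture. Amsterdam is roughly 52.37 N, 4.90 E.
    void Start()
    {
        PlaceSpike(52.37f, 4.90f);   // Amsterdam
        PlaceSpike(0f, 0f);          // equator / prime meridian intersection
    }

    void PlaceSpike(float latDeg, float lonDeg)
    {
        float lat = latDeg * Mathf.Deg2Rad;
        float lon = lonDeg * Mathf.Deg2Rad;

        // Same mapping as the question's script: latitude ends up on the Unity Y axis.
        Vector3 pos = new Vector3(
            radius * Mathf.Cos(lat) * Mathf.Cos(lon),
            radius * Mathf.Sin(lat),
            radius * Mathf.Cos(lat) * Mathf.Sin(lon));

        GameObject spike = GameObject.CreatePrimitive(PrimitiveType.Cube);
        spike.transform.localScale = Vector3.one * 0.05f;
        spike.transform.position = transform.position + pos;
    }
}

If the spikes and the texture disagree, the texture projection (or the sphere's rotation) is the culprit rather than the math.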
Check the following gif: https://i.gyazo.com/72998b8e2e3174193a6a2956de2ed008.gif
I want the cylinder to instantly change location to the nearest empty space on the plane as soon as I put a cube on the cylinder. The cubes and the cylinder have box colliders attached.
At the moment the cylinder just gets stuck when I put a cube on it, and I have to click in some direction to make it start "swimming" through the cubes.
Is there any easy solution or do I have to create some sort of grid with empty gameobjects that have a tag which tells me if there's an object on them or not?
This is a common problem in RTS-like video games, and one I am solving myself. It calls for a breadth-first search, which means that you check the closest neighbors first. You're fortunate to only have to solve this problem in a gridded environment.
Usually what programmers will do is create a queue and keep adding spaces to it until an empty space is found. It starts with, e.g., the spaces above, below, and adjacent to the starting space, and then moves outward, using the queue to keep track of which spaces still need to be checked. It also needs a way to know whether a space has already been checked, so those spaces can be skipped.
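A minimal sketch of that queue-based (breadth-first) search in Unity C#; the integer grid cells and the isOccupied callback are assumptions for illustration, not code from this answer:

using System.Collections.Generic;
using UnityEngine;

public static class NearestFreeCell
{
    // Breadth-first search outward from 'start' over integer grid cells.
    // 'isOccupied' is whatever occupancy test your game uses (an assumption here).
    public static Vector2Int? Find(Vector2Int start, System.Func<Vector2Int, bool> isOccupied, int maxRadius = 50)
    {
        var queue = new Queue<Vector2Int>();
        var visited = new HashSet<Vector2Int>();
        queue.Enqueue(start);
        visited.Add(start);

        Vector2Int[] neighbours = { Vector2Int.up, Vector2Int.down, Vector2Int.left, Vector2Int.right };

        while (queue.Count > 0)
        {
            Vector2Int cell = queue.Dequeue();
            if (!isOccupied(cell))
                return cell;                 // closest free cell found

            foreach (var dir in neighbours)
            {
                Vector2Int next = cell + dir;
                if (visited.Contains(next)) continue;
                if (Mathf.Abs(next.x - start.x) > maxRadius || Mathf.Abs(next.y - start.y) > maxRadius) continue;
                visited.Add(next);
                queue.Enqueue(next);
            }
        }
        return null;                         // nothing free within maxRadius
    }
}

You would call it with the grid cell under the cylinder and a check against your own occupancy data.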
Another solution I'm conceiving of would be to generate a (conceptual) Archimedean spiral from the starting point and somehow check each space along that spiral. The tricky part would be generating the right spiral and checking it at just the right points in order to hit each space once.
Here's my quick-and-dirty solution for the Archimedean spiral approach in C++:
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>
using namespace std;

//Generate the spiral vector (run this code once and store the spiral).
vector<pair<float, float>> generateSpiral()
{
    float x, z, max = 150.0f;
    vector<pair<float, float>> spiral;

    for (float n = 0.0f; n < max; n += (max + 1.0f - n) * 0.0001f)
    {
        x = cos(n) * n * 0.05f;
        z = sin(n) * n * 0.05f;

        //Change 1.0f to 0.5f for half-sized spaces.
        //fmod is float modulus (remainder).
        x = x - fmod(x, 1.0f);
        z = z - fmod(z, 1.0f);

        pair<float, float> currentPoint = make_pair(x, z);

        //Make sure this pair isn't at (0.0f, 0.0f) and that it's not already in the spiral.
        if ((x != 0.0f || z != 0.0f) && find(spiral.begin(), spiral.end(), currentPoint) == spiral.end())
        {
            spiral.push_back(currentPoint);
        }
    }
    return spiral;
}

//Loop through the results (run this code per usage of the spiral).
void useSpiral(const vector<pair<float, float>>& spiral)
{
    for (unsigned int n = 0U; n < spiral.size(); ++n)
    {
        //Draw or test the spiral.
    }
}
It generates a vector of unique points (float pairs) that can be iterated through in order, which will allow you to draw or test every space around the starting space in a nice, outward (breadth-first), gridded spiral. With 1.0f-sized spaces, it generates a circle of 174 test points, and with 0.5f-sized spaces, it generates a circle of 676 test points. You only have to generate this spiral once and then store it for usage numerous times throughout the rest of the program.
Note:
This spiral samples differently as it grows further and further out from the center (in the for loop: n += (max + 1.0f - n) * 0.0001f).
If you use the wrong numbers, you could very easily break this code or cause an infinite loop! Use at your own risk.
Though more memory intensive, it is probably much more time-efficient than the traditional queue-based solutions due to iterating through each space exactly once.
It is not a 100% accurate solution to the problem, however, because it is a gridded spiral; in some cases it may favor the diagonal over the lateral. This is probably negligible in most cases though.
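In Unity terms, using such a precomputed spiral to find the nearest free spot might look roughly like the sketch below; the 1-unit cell size and the Physics.CheckBox occupancy test are my assumptions, not part of the original solution:

using System.Collections.Generic;
using UnityEngine;

public class SpiralSearch : MonoBehaviour
{
    // Precomputed spiral offsets (e.g. generated once by the logic above and stored,
    // or rebuilt the same way in C#).
    public List<Vector2> spiralOffsets = new List<Vector2>();

    // Returns the first spiral position around 'start' that no collider occupies.
    public Vector3 FindNearestFreeSpot(Vector3 start)
    {
        foreach (Vector2 offset in spiralOffsets)
        {
            Vector3 candidate = start + new Vector3(offset.x, 0f, offset.y);

            // Assumed occupancy test: a box roughly the size of one grid cell.
            if (!Physics.CheckBox(candidate, new Vector3(0.45f, 0.45f, 0.45f)))
                return candidate;
        }
        return start; // nothing free found along the stored spiral
    }
}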
I used this solution for a game I'm working on. More on that here. Here are some pictures (the orange lines in the first are drawn by me in Paint for illustration, and the second picture is just to demonstrate what the spiral looks like if expanded):
I am starting to build an augmented reality app where you can place an image on your augmented reality camera view and it stays at that position on the Earth, so someone else can come by and see it in their own augmented reality camera view. For this I know I need to calculate some sort of distance factor along with azimuth and elevation.
So, I have already figured out how to send the object's graphics up to a server and retrieve it back, but how can I place it back at its original position relative to the Earth? I know I need to calculate its:
Altitude
Coordinates
Azimuth
Elevation
Distance
But how would I calculate these and account for them/piece them together? I hope you understand what I mean.
To refine your understanding, let me give you a short demo of the app:
A man is in his house and decides to place an image of a painting on one of his walls. He opens up the app, which defaults to the augmented reality screen, presses the plus button, and adds an image from his photo library. Behind the scenes, the app saves the location and positional data to a server. Later, someone else with the app comes by with the augmented reality screen open; the app queries the server, finds images saved nearby, downloads the image, and places it on the wall so the other man can see it through his phone as he moves it past.
What approach should I take to achieve this? Any outline, links, resources, tutorials, thoughts, or experience would be appreciated. Thanks! This was a hard question to write down; I hope you can understand it. If not, please tell me and I will reword.
Rohan
I'm working on two AR iOS apps which do the following: convert azimuth (compass, horizontal angle) and elevation (gyroscope, vertical angle) to a position in 3D space (i.e. spherical to Cartesian coordinates).
The frameworks you need are:
CoreLocation
CoreMotion
Getting the geolocation (coordinates) is pretty straightforward for latitude, longitude, and altitude. You can easily find this information in several online sources, but this is the main call you need from the CLLocationManagerDelegate after you call startUpdatingLocation:
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
    latitude = (float) manager.location.coordinate.latitude;
    longitude = (float) manager.location.coordinate.longitude;
    altitude = (float) manager.location.altitude;
}
Getting the azimuth angle is also pretty straightforward, using the same delegate as the location after calling startUpdatingHeading:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    azimuth = (float) manager.heading.magneticHeading;
}
Elevation is extracted from the gyroscope, which doesn't have a delegate but is also easy to set up. The call looks something like this (note: this works for my app running in landscape mode, check yours):
elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);
Finally, you can convert your orientation coordinates into a 3D point like so:
- (GLKVector3)sphericalToCartesian:(float)radius azimuth:(float)theta elevation:(float)phi
{
    // Convert Coordinates: Spherical to Cartesian
    // Spherical: Radial Distance (r), Azimuth (θ), Elevation (φ)
    // Cartesian: x, y, z

    float x = radius * sinf(phi) * sinf(theta);
    float y = radius * cosf(phi);
    float z = radius * sinf(phi) * cosf(theta);

    return GLKVector3Make(x, y, z);
}
For this last part, be very wary of angle and axis naming conventions, as they vary wildly from source to source. In my system, θ is the angle on the horizontal plane, φ is the angle on the vertical plane, x is left-right, y is down-up, and z is back-front.
As for distance, I'm not sure you really need to use it, but if you do, just substitute it for the radius.
Hope that helps
Swift 3
Gyroscope code update:
import CoreMotion
...
motionManager.deviceMotionUpdateInterval = 0.1
motionManager.startDeviceMotionUpdates(to: OperationQueue.current!) { deviceMotion, error in
    guard let dm = deviceMotion else { return }

    let roll = dm.attitude.roll
    let pitch = dm.attitude.pitch
    let yaw = dm.attitude.yaw

    print("r: \(roll), p: \(pitch), y: \(yaw)")
}
I have 2 coordinates and would like to do something seemingly straightforward. I want to figure out, given:
1) Coordinate A
2) Course provided by Core Location
3) Coordinate B
the following:
1) Distance between A and B (can currently be done using distanceFromLocation), so I'm OK on that one.
2) The course that should be taken to get from A to B (different from the course currently being traveled).
Is there a simple way to accomplish this, any third party or built in API?
Apple doesn't seem to provide this but I could be wrong.
Thanks,
~Arash
EDIT:
Thanks for the fast responses. I believe there may have been some confusion: I am looking to get the course (the bearing from point A to point B in degrees, so that 0 degrees = north and 90 degrees = east, similar to the course value returned by CLLocation). I am not trying to compute actual turn-by-turn directions.
I have some code on github that does that. Take a look at headingInRadians here. It is based on the Spherical Law of Cosines. I derived the code from the algorithm on this page.
/*-------------------------------------------------------------------------
* Given two lat/lon points on earth, calculates the heading
* from lat1/lon1 to lat2/lon2.
*
* lat/lon params in radians
* result in radians
*-------------------------------------------------------------------------*/
double headingInRadians(double lat1, double lon1, double lat2, double lon2)
{
    //-------------------------------------------------------------------------
    // Algorithm found at http://www.movable-type.co.uk/scripts/latlong.html
    //
    // Spherical Law of Cosines
    //
    // Formula: θ = atan2( sin(Δlong) * cos(lat2),
    //                     cos(lat1) * sin(lat2) − sin(lat1) * cos(lat2) * cos(Δlong) )
    // JavaScript:
    //
    // var y = Math.sin(dLon) * Math.cos(lat2);
    // var x = Math.cos(lat1) * Math.sin(lat2) - Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
    // var brng = Math.atan2(y, x).toDeg();
    //-------------------------------------------------------------------------
    double dLon = lon2 - lon1;
    double y = sin(dLon) * cos(lat2);
    double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);

    return atan2(y, x);
}
See How to get angle between two POI?
Depending on how much work you want to put into this one, I would suggest looking at tree traversal algorithms (check the column on the right), things like A* (A-star), which you can use to find your way from one point to another, even if obstacles are in between.
If I understand you correctly, you have the current location and you have some other location. You want to find the distance (as the crow flies) between the two points, and to find a walking path between the points.
To answer your first question: distanceFromLocation will find the distance across the earth's surface between two points, that is, it follows the curvature of the earth and gives you the distance as the crow flies. So I think you're right about that.
The second question is much harder. What you want to do is called path-finding. Path-finding requires not only a search algorithm that will decide on the path, but also data about the possible paths. That is to say, if you want to find a path through the streets, the computer has to know how the streets are connected to each other. Furthermore, if you're trying to make a pathfinder that takes traffic and the time differences between possible paths into account, you will need a whole lot more data. It is for this reason that we usually leave these kinds of tasks to big companies with lots of resources, like Google and Yahoo.
However, if you're still interested in doing it, check this out:
http://www.youtube.com/watch?v=DoamZwkEDK0