I am creating a 3D sphere of the Earth for Unity and I need one polygon per country. Each country has a set of 2D points that are projected into 3D points based on a projection. But when I create the polygon, some vertices join incorrectly.
I used Maya to create the polygons with this code:
with open("C:\\Users\\patilanz\\Downloads\\geo_countries_medium.json") as f:
d = json.load(f)
def to_3d(x,y):
#convert from degrees to radians
longitude = math.radians(x)
latitude = math.radians(y)
# select a radius:
radius = 10
# project to 3d
return (
radius * math.cos(latitude) * math.cos(longitude),
radius * math.cos(latitude) * math.sin(longitude),
radius * math.sin(latitude)
)
for feat in d.get("features"):
r = []
coords = feat.get("geometry").get("coordinates")
type = feat.get("geometry").get("type")
for coord in coords:
for c in coord:
if type == "MultiPolygon":
r = []
for a in c:
r.append(to_3d(a[0],a[1]))
poly = cmds.polyCreateFacet(p=r)
poly = cmds.rename(feat.get("properties").get("name"))
else:
r.append(to_3d(c[0],c[1]))
if not type == "MultiPoligon":
poly = cmds.polyCreateFacet(p=r)
poly = cmds.rename(feat.get("properties").get("name"))
cmds.polySphere(r = 10 * 0.98)
The problem is that some vertices join incorrectly.
I found someone trying to do the same thing here: https://forum.processing.org/one/topic/drawing-countries-on-top-of-a-3d-sphere-from-set-of-boundaries.html
But I don't know how to implement that in Maya or in Unity.
Related
I have a sphere and a texture for it.
The texture consists of 16 OSM tiles at zoom level 2. The tile size is 256x256.
At the top and bottom I added extra space to cover the latitude ranges [90, 85.0511] and [-85.0511, -90], proportionally, so the texture size was 1024x1083.
I also tried a texture without these two strips; its size was 1024x1024 (map tiles only).
The problem is that after UV mapping, objects are distorted along the Y axis: smaller at the equator and bigger towards the poles.
There are two types of formulas:
u = (lon + 180) / 360; // lon = [-180, 180]
v = (lat + 90) / 180; // lat = [-85.0511, 85.0511]
----
u = Math.atan2(z, x) / (2 * Math.PI) + 0.5; // x, y, z are vertex coordinates
v = Math.asin(y) / Math.PI + 0.5;
I tried all 8 variations: two textures, two u-formulas and two v-formulas.
The result looks like the image above, or worse.
What am I doing wrong? Is it about texture, or UV-formulas, or something else?
P.S.: for the poles (vertices in the latitude ranges [-90, -85.0511] and [85.0511, 90]) I don't use the texture color in the fragment shader, just a solid color.
OSM uses the Web Mercator projection; see also the OSM wiki.
The conversion from world (x,y,z) to texture (u,v) coordinates would be:
lon = atan2(y, x)
lat = atan2(z, sqrt(x*x+y*y))
u = (lon + pi)/(2*pi)
v = (log(tan(lat/2 + pi/4)) + pi)/(2*pi)
(I assume that z points north like in WGS-84 and all coordinates are right-handed.)
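For illustration, here is the same mapping as a small Python function (a sketch; the function name is mine, and it follows the z-points-north, right-handed convention stated above):

import math

def world_to_uv(x, y, z):
    # world (x, y, z) with z pointing north -> Web Mercator texture (u, v) in [0, 1]
    lon = math.atan2(y, x)
    lat = math.atan2(z, math.sqrt(x * x + y * y))
    u = (lon + math.pi) / (2 * math.pi)
    v = (math.log(math.tan(lat / 2 + math.pi / 4)) + math.pi) / (2 * math.pi)
    return u, v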
This projection doesn't cover the entire sphere: as the latitude approaches the poles, the v coordinate blows up to infinity. Therefore extending the map to the north or south direction is not going to be helpful.
Instead, keep the original square 1024x1024 texture and render a texture-mapped sphere capped at ±85.051129° latitude (that's where v = 0 and v = 1) using the above coordinate mapping.
Alternatively (and this is more in-line with Web Mercator spirit), render each tile regular in the UV coordinates, and calculate the XYZ coordinates by reversing the above transformation.
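A sketch of that reversal (again with names of my own choosing), mapping a global Web Mercator (u, v) back to a point on the unit sphere; for an individual tile you would first convert the tile-local UV to the global one:

import math

def uv_to_world(u, v):
    # global Web Mercator (u, v) in [0, 1] -> point on the unit sphere, z pointing north
    lon = u * 2 * math.pi - math.pi
    lat = 2 * math.atan(math.exp(v * 2 * math.pi - math.pi)) - math.pi / 2
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))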
Given two coordinates in WGS84 (points 1 and 2 in red), I need to find the coordinates of the point (point 3) perpendicular to the line, at a given distance.
I managed to do the math to compute this perpendicular point, but when displayed on the map, the point seems to be in the wrong place, probably because of the projection.
What I want on a map:
And what I have instead on the map:
How can I take the projection into account so that the point appears perpendicular to the line on the map? The algorithm below to compute the point comes from here: https://math.stackexchange.com/questions/93424/calculate-rectangle-coordinates-from-line-and-height
public static Coords ComputePerpendicularPoint(Coords first, Coords last, double distance)
{
    double slope = -(last.Lon.Value - first.Lon.Value) / (last.Lat.Value - first.Lat.Value);

    // number of km per degree = ~111 km (111.32 in Google Maps, but the range varies
    // between 110.567 km at the equator and 111.699 km at the poles)
    // 1 km in degrees = 1 / 111.32 = 0.0089
    // 1 m in degrees  = 0.0089 / 1000 = 0.0000089
    distance = distance * 0.0000089 / 100; // 0.0000089 => around 1 m in WGS84; /100 because distance is in cm

    double t = distance / Math.Sqrt(1 + (slope * slope));

    Coords perp_coord = new Coords();
    perp_coord.Lon = first.Lon + t;
    perp_coord.Lat = first.Lat + (t * slope);
    return perp_coord;
}
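For illustration, a minimal Python sketch of the same computation done in a local metric frame, where the longitude step is scaled by cos(latitude) (an assumption that meridian convergence is what skews the result; the function and argument names are mine):

import math

def compute_perpendicular_point(first, last, distance_cm):
    # first, last: (lon, lat) in degrees; distance_cm: offset in centimetres
    # assumes first != last
    lat0 = math.radians(first[1])
    metres_per_deg_lat = 111320.0                   # ~111.32 km per degree of latitude
    metres_per_deg_lon = 111320.0 * math.cos(lat0)  # a degree of longitude shrinks towards the poles

    # segment direction in a local metric frame (east, north), in metres
    dx = (last[0] - first[0]) * metres_per_deg_lon
    dy = (last[1] - first[1]) * metres_per_deg_lat

    # unit perpendicular, scaled to the requested distance
    length = math.hypot(dx, dy)
    d = distance_cm / 100.0                         # centimetres -> metres
    px = -dy / length * d
    py = dx / length * d

    # back to degrees, relative to the first point
    return (first[0] + px / metres_per_deg_lon,
            first[1] + py / metres_per_deg_lat)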
Thank you in advance!
I'm trying to use Mapbox Terrain-RGB to get the elevation of specific points in space. I used mercantile.tile to get the coordinates of the tile containing my point at zoom level 15, which for -43º, -22º (for simplicity's sake) is 12454, 18527, then mercantile.xy to get the corresponding world coordinates: -4806237.7150042495, -2621281.2257876047.
Shouldn't the integer part of -4806237.7150042495 / 256 (the tile size) equal the x coordinate of the tile containing the point, that is, 12454? If this calculation checked out, I'd figure that I'm looking for the pixel column (x axis) corresponding to the decimal part of the result, like column 127 (256 * 0.5) for 12454.5. However, the division results in -18774.366 (which is curiously close to the tile y coordinate, but it looks like a coincidence). What am I missing here?
As an alternative, I thought of using mercantile.bounds, assigning the first and last pixel columns to the westmost and eastmost longitudes, and finding my position with interpolation, but I wanted to check if I'm doing this the right/recommended way. I'm interested in point elevations, so everything said here goes for the Y axis as well.
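In case it helps frame the arithmetic, here is a small sketch of the relationship I believe applies (an assumption based on the standard Web Mercator tiling scheme): at zoom z the world is 2^z tiles wide, so a world coordinate has to be normalised by the full Web Mercator extent and multiplied by 2^z * 256, rather than divided by the 256-pixel tile size directly.

import math

WEB_MERCATOR_HALF = 20037508.342789244  # half the world extent in EPSG:3857 metres

def world_to_pixel(merc_x, merc_y, zoom, tile_size=256):
    # fraction of the world covered, measured from the top-left corner
    fx = (merc_x + WEB_MERCATOR_HALF) / (2 * WEB_MERCATOR_HALF)
    fy = (WEB_MERCATOR_HALF - merc_y) / (2 * WEB_MERCATOR_HALF)
    n = 2 ** zoom                      # number of tiles along each axis at this zoom
    tile_x, col = divmod(fx * n * tile_size, tile_size)
    tile_y, row = divmod(fy * n * tile_size, tile_size)
    return int(tile_x), int(tile_y), int(col), int(row)

With the x coordinate quoted above, the integer part of the scaled value should recover the tile index (12454 here), and the remainder is the pixel column inside that tile.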
Here's what I got so far:
import re
from io import BytesIO

import mercantile
import numpy as np
import requests
import matplotlib.pyplot as plt

# `find_with_re` (a small regex helper that extracts a tag's value from the KML text)
# and `png` (a PNG reader returning an integer RGB(A) array) are defined/imported elsewhere.

def correct_altitude_mode(kml):
    with open(kml, "r+") as f:
        txt = f.read()
        if re.search(r"(?<=<altitudeMode>)relative(?=</altitudeMode>)", txt):
            lat = round(float(find_with_re("latitude", txt)), 5)
            lng = round(float(find_with_re("longitude", txt)), 5)
            alt = round(float(find_with_re("altitude", txt)), 5)
            z = 15
            tile = mercantile.tile(lng, lat, z)
            westmost, southmost, eastmost, northmost = mercantile.bounds(tile)
            pixel_column = np.interp(lng, [westmost, eastmost], [0, 256])
            pixel_row = np.interp(lat, [southmost, northmost], [256, 0])
            response = requests.get(f"https://api.mapbox.com/v4/mapbox.terrain-rgb/{z}/{tile.x}/{tile.y}.pngraw?access_token=pk.eyJ1IjoibWFydGltcGFzc29zIiwiYSI6ImNra3pmN2QxajBiYWUycW55N3E1dG1tcTEifQ.JFKSI85oP7M2gbeUTaUfQQ")
            buffer = BytesIO(response.content)
            tile_img = png.read_png_int(buffer)
            _, R, G, B = tile_img[int(pixel_row), int(pixel_column)]
            print(tile_img[int(pixel_row), int(pixel_column)])
            # decode Mapbox Terrain-RGB into a height in metres
            height = -10000 + ((R * 256 * 256 + G * 256 + B) * 0.1)
            print(f"R:{R},G:{G},B:{B}\n{height}")
            plt.hlines(pixel_row, 0.0, 256.0, colors="r")
            plt.vlines(pixel_column, 0.0, 256.0, colors="r")
            plt.imshow(tile_img)
I have a bunch of 360° equirectangular images, and on each image I want to place a point of interest. To make this easy, I just want to have to determine the point's 2D location on the image. See the image below for clarification:
Let's say that the blue point has a pixel location of X: 3000 and Y: 1300,
and that the total dimensions of the image are 4096x2048.
Now I want to convert this point to a spherical location and then to a 3D location. I try to do this in the following way:
Vector3 PlaceMenu(Vector2 loc2d)
{
    var phi = 2 * Mathf.PI * (loc2d.x / imageDimensions.x);
    var theta = (loc2d.y / imageDimensions.y) * Mathf.PI;
    var pos = new Vector3(Mathf.Cos(phi) * Mathf.Sin(theta), Mathf.Sin(phi) * Mathf.Sin(theta), Mathf.Cos(theta));
    pos *= offsetRadius;
    return pos;
}
In this case offsetRadius is the radius of the sphere.
But the results I am getting with this code are weird: the blue points appear at locations other than the one specified by the 2D coordinates.
What am I doing wrong here?
If any more explanation is needed, I am happy to provide it!
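For comparison, here is a minimal Python sketch of one common convention for this mapping (an assumption on my part, not the original code): the vertical pixel is treated as a polar angle measured from the up axis, which matches a Y-up frame like Unity's; the axes may still need mirroring or swapping to match how the image is wrapped onto the sphere.

import math

def equirect_pixel_to_3d(px, py, width, height, radius=1.0):
    phi = 2 * math.pi * (px / width)        # azimuth, 0..2*pi across the image width
    theta = math.pi * (py / height)         # polar angle, 0 at the top row, pi at the bottom
    # Y-up convention: theta is measured from the +Y (up) axis
    x = radius * math.sin(theta) * math.cos(phi)
    y = radius * math.cos(theta)
    z = radius * math.sin(theta) * math.sin(phi)
    return x, y, z

# e.g. the blue point: equirect_pixel_to_3d(3000, 1300, 4096, 2048)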
In Unity3D, I created a sphere and textured it with an Earth texture.
I converted latitude and longitude into 3D coordinates and spawned a cube at that position.
The cube is on the sphere, but it is not accurate.
(For example, when I input "51.504815, -0.128506" for London, the cube is spawned somewhere in France (44.854302, -0.595576).)
I know the Earth is not a sphere but an ellipsoid; does this affect the result?
What should I do to get an accurate conversion so the cube spawns in the right place?
Below is the conversion function that I used. I also tried other functions from the web, but they returned the same result.
private Vector3 GetSphericalCoordinates(double latitude, double longitude)
{
    latitude = latitude * Math.PI / 180D;
    longitude = longitude * Math.PI / 180D;

    double x = radius * Math.Cos(latitude) * Math.Sin(longitude);
    double y = radius * Math.Sin(latitude);
    double z = -radius * Math.Cos(latitude) * Math.Cos(longitude);

    return new Vector3((float)x, (float)y, (float)z);
}
Latitude/longitude values of (0, 0), (90, 0), (0, 90)... land exactly on the quadrant points, so I wonder why other lat/lng values are incorrect.
Thanks in advance.
It was not accurate because of the mesh resolution and the texture's UV set.
If you increase the mesh resolution and correct the texture's UVs, the spawned object will be on the correct spot.
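As a quick sanity check that the formula itself is exact (so any remaining offset has to come from the mesh or the UVs, as noted above), here is a small Python sketch of the same conversion together with its inverse (names are mine):

import math

def lat_lon_to_xyz(lat_deg, lon_deg, radius=1.0):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = radius * math.cos(lat) * math.sin(lon)
    y = radius * math.sin(lat)
    z = -radius * math.cos(lat) * math.cos(lon)
    return x, y, z

def xyz_to_lat_lon(x, y, z):
    radius = math.sqrt(x * x + y * y + z * z)
    lat = math.degrees(math.asin(y / radius))
    lon = math.degrees(math.atan2(x, -z))
    return lat, lon

# round-trips to the original coordinates (up to floating-point error)
print(xyz_to_lat_lon(*lat_lon_to_xyz(51.504815, -0.128506)))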