How do I plot markers with Lambert coordinates on a map instead of lat/long in react-leaflet?

I have some Lambert coordinates, like:
[149031,114032]
[164787,156787]
Do I need to convert these to lat/long values to assign the position attribute,
or can I plot with the above coordinates directly?
Is there any conversion available from Lambert coordinates to lat/long values?
I tried L.point(149031, 114032), but that didn't get me anywhere.
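For what it's worth, L.point() represents pixel coordinates, not geographic ones, so it won't help here. Leaflet positions markers by lat/long, so the usual approach is to reproject first, for example with the proj4js library. A minimal sketch, assuming the values are Belgian Lambert 72 (EPSG:31370); if your data uses a different Lambert variant, swap in its definition from epsg.io:

import proj4 from "proj4";

// ASSUMPTION: Belgian Lambert 72 (EPSG:31370). Verify the definition
// for your actual CRS on epsg.io and substitute it if it differs.
proj4.defs(
  "EPSG:31370",
  "+proj=lcc +lat_1=51.16666723333333 +lat_2=49.8333339 +lat_0=90 " +
    "+lon_0=4.367486666666666 +x_0=150000.013 +y_0=5400088.438 " +
    "+ellps=intl +towgs84=-106.869,52.2978,-103.724,0.3366,-0.457,1.8422,-1.2747 " +
    "+units=m +no_defs"
);

// proj4 maps [easting, northing] -> [lon, lat].
const [lon, lat] = proj4("EPSG:31370", "EPSG:4326", [149031, 114032]);

// Then, in the component, react-leaflet expects [lat, lng], in that order:
// <Marker position={[lat, lon]} />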

Related

Is it possible to pass a custom position to calculate lighting in Unity surface shaders?

I want to create a shader that uses different coordinates for light calculations than for what's being displayed. This probably sounds strange, but I would like to do this for lighting a top-down 2D game.
I want to write a vertex shader that offsets the Z coordinate by the value of the Y coordinate for display, but uses unmodified coordinates for lighting calculation.
Is this possible to do, and if so, where would I start?
So far I have a surface shader that offsets the Z coordinate by the value of the Y coordinate, but Unity uses the modified coordinates to calculate lighting; I would like it to use the unmodified coordinates instead.

Display Earth map in Azimuthal Equidistant Projection

I have to display the Earth's map in azimuthal equidistant projection, giving the latitude and longitude as input in MATLAB. I am using the eqdazim projection, but I am still getting the map with the point (0,0) in the center. I want to be able to change the point that is the center of the circular map.
landareas = shaperead('landareas.shp','UseGeoCoords',true);
axesm ('eqdazim', 'Frame', 'on');
geoshow(landareas);
Also, I don't know how to change the radius of the image. I don't need the circle with the whole Earth, but rather something with a radius of about 2000 km around the center point.
I need it in order to put its image on a dome. Below is a simple example with such a dome and a random fragment of the Earth's surface. Keep in mind that it's just an image I cropped manually from a large Mercator map.
I was hoping that I could use the Mapping Toolbox to get such a map automatically, by giving the latitude, longitude and radius. I have all the necessary data, which is:
Latitude & longitude
The radius of the circle
I just don't know how I can get this part of the Earth's map. I think I have to use the azimuthal equidistant projection; I just don't know how to do this in MATLAB/Mapping Toolbox.
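A hedged starting point: in the Mapping Toolbox, the 'Origin' and 'FLatLimit' properties of axesm are the usual knobs for this, something like axesm('eqdazim', 'Frame', 'on', 'Origin', [lat lon 0], 'FLatLimit', [-Inf radiusInDegrees]). To show how the center point and radius enter the projection itself, here is the forward azimuthal equidistant mapping written out in TypeScript; this is only a sketch of the math, and all names are illustrative, not toolbox functions:

// Forward azimuthal equidistant projection (sketch of the math only).
const R = 6371; // mean Earth radius in km

function toRad(deg: number): number {
  return (deg * Math.PI) / 180;
}

// Project (lat, lon) onto a plane centered at (lat0, lon0). Returns
// planar [x, y] in km, or null when the point lies farther than
// maxRadiusKm from the center (the ~2000 km cut-off asked for above).
function azimuthalEquidistant(
  lat: number, lon: number,
  lat0: number, lon0: number,
  maxRadiusKm: number
): [number, number] | null {
  const phi = toRad(lat), lam = toRad(lon);
  const phi0 = toRad(lat0), lam0 = toRad(lon0);
  // Angular distance c from the center (spherical law of cosines).
  const cosC = Math.sin(phi0) * Math.sin(phi) +
    Math.cos(phi0) * Math.cos(phi) * Math.cos(lam - lam0);
  const c = Math.acos(Math.min(1, Math.max(-1, cosC)));
  if (R * c > maxRadiusKm) return null; // outside the requested circle
  const k = c === 0 ? 1 : c / Math.sin(c); // radial scale factor
  const x = R * k * Math.cos(phi) * Math.sin(lam - lam0);
  const y = R * k * (Math.cos(phi0) * Math.sin(phi) -
    Math.sin(phi0) * Math.cos(phi) * Math.cos(lam - lam0));
  return [x, y];
}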

Coordinate of a pixel of an image to coordinate on the texture

A Texture2D is mapped on a sphere. Thanks to GetPixel(raycastHit.textureCoord) I can get the pixel value of this texture when, for instance, a ray hits the sphere. So if I convert an image I to a texture T and map T on a GameObject G, I can get the pixel value from G to I (G --textureCoord--> T --GetPixel--> I). If I convert those (x,y) coordinates into world coordinates, I can trigger events on certain colors, so it works. I use this solution to go from pixel to world position.
But I can't do the opposite. Imagine that I have a list of bounding boxes for different objects on the image I (so with coordinates between I.width and I.height). If I convert those coordinates (bx1, bx2) into world coordinates, it simply doesn't work.
I noticed that when I compare the GetPixel value when I target a given color on G with my controller, I don't get the same pixel coordinates as the ones on the original image.
In the end I want to get the texture coordinate from an image (G <--textureCoord_to_world_coordinates-- T <--?????-- I).
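One thing worth checking, offered as an assumption about what's going wrong: image pixel coordinates usually have their origin at the top-left, while Unity's texture coordinates (and GetPixel) use a bottom-left origin, which would explain pixel coordinates not matching between G and I. A sketch of the two conversions, written in TypeScript just to pin down the math (all names are mine):

// Illustrative only: converting between image-space pixel coordinates
// (top-left origin, as in most image formats) and UV texture
// coordinates (bottom-left origin, as in Unity textures).
function pixelToUV(x: number, y: number, width: number, height: number) {
  return { u: x / width, v: 1 - y / height }; // flip the vertical axis
}

function uvToPixel(u: number, v: number, width: number, height: number) {
  return { x: Math.floor(u * width), y: Math.floor((1 - v) * height) };
}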

Unity3D fragment shader - distance from pixel to vertex?

I'm trying to create a star field fragment shader in which the vertices of a mesh are treated as stars. I want the color of the pixel to go from (0,0,0,1) to (1,1,1,1) as the distance from the pixel (interpolated from the position output of the vertex shader) to the vertex (output from the vertex shader in some way that will prevent it from getting interpolated) goes from some value to 0. I want that value to be calculated based on the z value of the vertex.
Is this possible? How can it be done?
I'm having trouble coming up with the distance from the interpolated vertex position to the non-interpolated one in the fragment shader. Maybe it could be done by somehow getting the screen coordinate of the vertex in the vertex shader, and passing that as a color value or something? Because I seem to be able to access the screen coordinates for a fragment, but not the world coordinates. (If I try to use anything that has position semantics in the fragment shader, I get an error, as I guess you'd expect.)
Additionally, I'd really like to also pass a (star) color value and size value along with a vertex, from the vertex shader, and have those affect the output of the fragment shader.
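The falloff itself is simple once both positions reach the fragment stage; the shader plumbing (passing a non-interpolated copy of the vertex position) is the Unity-specific part. Purely to pin down the intended ramp, here is the brightness curve in TypeScript; the rule for deriving the cutoff from z is a guess at what's described above, not something stated in the question:

// Brightness ramps from 1 at the star's vertex down to 0 at maxDist,
// mirroring the (1,1,1,1) -> (0,0,0,1) transition described above.
function starBrightness(distToVertex: number, maxDist: number): number {
  return Math.max(0, Math.min(1, 1 - distToVertex / maxDist));
}

// Hypothetical cutoff derived from the vertex's z value: farther
// stars (larger z) get a smaller radius. Illustrative only.
function maxDistFromZ(z: number, baseRadius: number): number {
  return baseRadius / Math.max(1, z);
}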

Cylindrical projection to sphere

I have a 2D array (lat*long) containing height information. I want to map this cylindrical projection to an actual sphere with radius r and plot it.
How would I do that? Sorry it's so little info, but I'm completely lost right now ...
Longitude and latitude are not cylindrical coordinates; rather, they are equivalent to azimuth and elevation in spherical coordinates. At each latitude and longitude, you have a height (which may need to have the mean radius of the sphere added to it if it isn't the true height from the center already).
Check out the sph2cart function, which converts from spherical to cartesian coordinates. You'll have to convert from degrees to radians first.
Steps to take:
Create a matrix (same size as the original) with just longitudes.
Do the same for just latitudes (after this you should have three matrices of the same size as your original: latitude, longitude, height).
Make sure those latitude and longitude matrices are in radians, not degrees.
Make sure your height info is measured from the center of the sphere.
Use sph2cart to get the x, y, z matrices.
Use surf(x,y,z) to plot the results.
Notes on sph2cart from the documentation:
[x,y,z] = sph2cart(azimuth,elevation,r) transforms the corresponding elements of spherical coordinate arrays to Cartesian, or xyz, coordinates. azimuth, elevation, and r must all be the same size (or any of them can be scalar). azimuth and elevation are angular displacements in radians from the positive x-axis and from the x-y plane, respectively.
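For reference, the per-element conversion sph2cart performs is just the spherical-to-Cartesian identity; here it is written out in TypeScript for illustration (one point at a time rather than whole matrices, with names of my choosing):

// What sph2cart computes per element: azimuth and elevation in
// radians, r measured from the center of the sphere.
function sph2cart(azimuth: number, elevation: number, r: number) {
  return {
    x: r * Math.cos(elevation) * Math.cos(azimuth),
    y: r * Math.cos(elevation) * Math.sin(azimuth),
    z: r * Math.sin(elevation),
  };
}

// For the height map above: azimuth = longitude, elevation = latitude
// (both in radians), and r = mean sphere radius + height.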