How to get the distance from the Earth's center?

e = ephem.readtle(...)
e.compute('2012/02/04 07:55:00')
As far as I can see, there's only e.elevation as a measure of distance, and it is relative to sea level. At the moment I'm using e.elevation/1000 + 6371 to estimate the distance from the center of the Earth.
I'm pretty sure that the exact Earth-center distance at the requested point in time is needed for the ephemeris calculations. Is this distance exposed somewhere, and if not, why not, and can that be changed?

I had thought that the answer would involve having to expose an ellipsoidal model of the Earth from deep inside the C code to Python to get you the information you need. But, having just gone through the Earth-satellite code, it turns out that the way it converts the satellite's distance from the Earth's center into a height is simply (from earthsat.c):
#if SSPELLIPSE
#else
*Height = r - EarthRadius;
#endif
Apparently the programmer planned to someday implement an ellipsoidal earth, and had an #if statement ready to guard the new code, but never wrote any.
So you can convert the height (“elevation”) back to a distance from the Earth's center by adding the value EarthRadius, which is defined as:
#define EarthRadius 6378.16 /* Kilometers */
Since the elevation is, I believe, in meters, you will want to multiply EarthRadius by 1000.0 or else divide the elevation by 1000.0 to get the right result.
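Putting the answer together, here is a minimal sketch (assuming the elevation really is in meters and reusing libastro's spherical EarthRadius constant; the TLE lines are left for you to supply):

import ephem

EARTH_RADIUS_KM = 6378.16  # libastro's EarthRadius #define, in kilometers

def geocentric_distance_km(sat, when):
    # Undo earthsat.c's "Height = r - EarthRadius"; elevation is in meters.
    sat.compute(when)
    return sat.elevation / 1000.0 + EARTH_RADIUS_KM

# Usage, with TLE lines you supply yourself:
# sat = ephem.readtle(name, line1, line2)
# print(geocentric_distance_km(sat, '2012/02/04 07:55:00'))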

Related

AR: place an object with the same size

I am building an AR application. I have some points which are real-world coordinates, and I can geolocate these points through Mapbox. My problem is that when I get far away from the points, they appear smaller. I want to see them at the same size independently of the distance.
So, if I am near the points I see them at their normal size. But even when I am 400 km away from a point, I want to see it at the same size. Is that possible?
You can try to scale the labels by some value * distance to the object. If your camera is at device and your marker is at target (this looks like Unity C#), it would be:
float experimentalScale = 0.5f;
This is the amplifier of the distance: if you increase the value, the label will get bigger at greater distances. Try out what works best for you.
float scaleFactor = Vector3.Distance(device.transform.position, target.transform.position) * experimentalScale;
target.transform.localScale = new Vector3(scaleFactor, scaleFactor, scaleFactor);
This only works if your object's scale is 1. If it is something else, just multiply that scale by scaleFactor.

Computing sub-solar point

I am just getting started with PyEphem. My immediate task is, given a date and time, to compute the sub-solar point on Earth in latitude-longitude values. I'll dig into PyEphem to work this out, but if someone has already done this, I'd appreciate sample code.
I went looking for the same answer as the OP. Many posts mention how PyEphem is the way to go, but without providing an actual example.
Here is my working example to calculate the subsolar point, mapping everything to a longitude between -180 and +180 degrees.
import math
from datetime import datetime
import ephem

greenwich = ephem.Observer()
greenwich.lat = "0"
greenwich.lon = "0"
greenwich.date = datetime.utcnow()
sun = ephem.Sun(greenwich)
sun.compute(greenwich.date)
# Subsolar longitude: the sun's RA minus the sidereal time at Greenwich
sun_lon = math.degrees(sun.ra - greenwich.sidereal_time())
if sun_lon < -180.0:
    sun_lon = 360.0 + sun_lon
elif sun_lon > 180.0:
    sun_lon = sun_lon - 360.0
sun_lat = math.degrees(sun.dec)
print("Subsolar Point Sun Lon:", sun_lon, "Lat:", sun_lat)
I am no expert in PyEphem and there may be a better approach - but my testing so far has this work for my purposes.
P.S. Yes, "greenwich" above is not actually set to Greenwich's real lat/lon; it's really only the longitude of 0.0 that's needed to get the appropriate sidereal time.
I cannot test actual code from where I am this morning, but: an object at declination ϕ should always ride right above the series of locations on earth that have latitude ϕ, so the latitude number is given to you directly by a body's .dec attribute (or .a_dec or .g_dec depending on your application).
Now, what about longitude?
Imagine the situation, which I suppose must occur roughly once a day, when Greenwich at 0° longitude looks up and sees the line in the sky of 0° right ascension right overhead. At that moment, a body in the sky at right ascension θ would be looking down at longitude θ assuming that longitude is positive going east, as is the case with PyEphem.
Now, what if Greenwich is looking up at a non-zero line of right ascension instead? Then it seems to me that we just need to subtract that from a body's right ascension in order to make longitude, because as the day proceeds and the Earth turns and lines of right ascension pass over Greenwich with bigger and bigger right ascensions assigned to them, any given body is going to pass west across the Earth and its longitude will dwindle and then go negative as it passes over the Western Hemisphere.
The line of right ascension overhead at Greenwich at any given moment can be determined by creating an Observer at 0° longitude and asking for its .sidereal_time() if I recall the Quick Reference correctly. So I think that the longitude beneath of a body might be:
lon = body.ra - greenwich.sidereal_time()
I will do a quick test with this later on today to see if reasonable numbers come out.
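For what it's worth, here is a compact, hedged sketch of that idea (my own transcription; it folds the wrap-around into modular arithmetic rather than if/elif):

import math
from datetime import datetime
import ephem

def subsolar_point(when=None):
    # Observer at longitude 0 -- only the sidereal time at Greenwich matters.
    obs = ephem.Observer()
    obs.lat, obs.lon = "0", "0"
    obs.date = when or datetime.utcnow()
    sun = ephem.Sun(obs)
    lat = math.degrees(sun.dec)
    # RA minus Greenwich sidereal time, wrapped into [-180, 180)
    lon = (math.degrees(sun.ra - obs.sidereal_time()) + 180.0) % 360.0 - 180.0
    return lat, lon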

Angle to Mecca from current location with iPhone compass

I have a question about the Qibla direction. I am building an iPhone application which will show both the north direction and the Qibla direction. I am showing the north direction with the help of CLLocationManager, updating it with CLHeading's magneticHeading, and I am showing the Qibla direction with the following code:
double A = MECCA_LONGITUDE - lon;
double b = 90.0 - lat;
double c = 90.0 - MECCA_LATITUDE;
NSLog(#"tan -1( sin(%f) / ( sin(%f) * cot(%f) - cos(%f) * cos(%f)))", A, b, c, b, A);
double qibAngle = atan(sin(A) /( sin(b) * (1 / tan(c)) - cos(b) * cos(A) ));
NSLog(#"qib Angle- %f",qibAngle);
qibla.transform = CGAffineTransformMakeRotation(qibAngle * M_PI /180);
So, here I am getting the angle, but it does not update when I rotate the device. Can anyone help me out? I know that I need to do something with the heading, but I don't know what.
I assume the code you posted computes the angle between geographical north and the direction towards Mecca for the current location. All you need to do now is take into account the user's heading.
For example, suppose the user is located so that Mecca is directly due west, and the user is facing directly due east. Since atan's range is ±90 degrees, the qibla angle comes out as -90 degrees. Now the adjustment should be obvious: you need to subtract the user's heading (90 degrees) from the qibla angle relative to geographic north (-90) to arrive at -180 degrees, which is how much the user needs to turn in order to face Mecca.
Simply put, you need to "undo" the user's deviation, and you do this by subtracting the user's heading, which is relative to geographic north, from the qibla angle.
With the maths out of the way, now you need to observe heading changes and recompute the qibla angle when the heading changes. Lastly, make sure to use the trueHeading property.
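To make the pipeline concrete, here is a hedged Python sketch of the same maths: the spherical bearing to Mecca measured from true north, followed by the heading subtraction. The Kaaba coordinates and the switch to atan2 (which, unlike plain atan, keeps the correct quadrant) are my own choices, not from the question's code:

import math

MECCA_LAT, MECCA_LON = 21.4225, 39.8262  # approximate Kaaba coordinates (assumed)

def qibla_angle(lat, lon):
    # Bearing from (lat, lon) to Mecca, degrees clockwise from true north.
    a = math.radians(MECCA_LON - lon)
    b = math.radians(90.0 - lat)
    c = math.radians(90.0 - MECCA_LAT)
    return math.degrees(math.atan2(
        math.sin(a),
        math.sin(b) / math.tan(c) - math.cos(b) * math.cos(a)))

def rotation_needed(lat, lon, true_heading):
    # How far the user must still turn, wrapped into [-180, 180).
    return (qibla_angle(lat, lon) - true_heading + 180.0) % 360.0 - 180.0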
I'm probably going to lose points on this answer because I know absolutely nothing about iOS, but I believe atan returns a value in radians, and CGAffineTransformMakeRotation takes its argument in radians as well, so the conversion qibAngle * M_PI / 180 is not needed.
You might also want to re-title your post, since most people have no idea what Qibla is and wouldn't realize that it's about math and iOS. I only looked because I've heard calculating the right direction to Mecca is kind of a neat math problem.

Maths behind iPhone AR ToolKit

I'm using the iPhone ARToolkit and I'm wondering how it works.
I want to know how, with a destination location, a user location and a compass, this toolkit can know whether the user is looking toward that destination.
What are the maths behind these calculations?
The maths that AR ToolKit uses is basic trigonometry. It doesn't use the technique that Thomas describes, which I think would be a better approach (apart from step 5; see below).
Overview of the steps involved.
The iPhone's GPS supplies the device's location and you already have the coordinates of the location you want to look at.
First it calculates the difference between the latitude and the longitude values of the two points. These two differences let you construct a right-angled triangle and calculate at what angle from your current position another given position lies. This is the relevant code:
- (float)angleFromCoordinate:(CLLocationCoordinate2D)first toCoordinate:(CLLocationCoordinate2D)second {
    float longitudinalDifference = second.longitude - first.longitude;
    float latitudinalDifference = second.latitude - first.latitude;
    float possibleAzimuth = (M_PI * .5f) - atan(latitudinalDifference / longitudinalDifference);
    if (longitudinalDifference > 0) return possibleAzimuth;
    else if (longitudinalDifference < 0) return possibleAzimuth + M_PI;
    else if (latitudinalDifference < 0) return M_PI;
    return 0.0f;
}
At this point you can read the compass value from the phone and determine the specific compass angle (azimuth) your device is pointing at. The reading from the compass is the angle directly in the center of the camera's view. The AR ToolKit then calculates the full range of angles currently displayed on screen, since the iPhone's field of view is known.
In particular it does this by calculating what the angle of the leftmost part of the view is showing:
double leftAzimuth = centerAzimuth - VIEWPORT_WIDTH_RADIANS / 2.0;
if (leftAzimuth < 0.0) {
    leftAzimuth = 2 * M_PI + leftAzimuth;
}
And then calculates the rightmost:
double rightAzimuth = centerAzimuth + VIEWPORT_WIDTH_RADIANS / 2.0;
if (rightAzimuth > 2 * M_PI) {
    rightAzimuth = rightAzimuth - 2 * M_PI;
}
We now have:
- the angle relative to our current position of something we want to display
- a range of angles which are currently visible on the screen
This is enough to plot a marker on the screen in the correct position (kind of... see the problems section below).
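As an aside, the containment test that those two values feed into has to respect the wrap-around at 2π. A sketch of that check in Python (my own illustration, not the toolkit's code):

import math

def is_on_screen(center_azimuth, viewport_width, target_azimuth):
    # All angles in radians; handles a viewport straddling 0/2*pi,
    # e.g. one spanning 350 degrees to 10 degrees.
    two_pi = 2.0 * math.pi
    left = (center_azimuth - viewport_width / 2.0) % two_pi
    right = (center_azimuth + viewport_width / 2.0) % two_pi
    target = target_azimuth % two_pi
    if left <= right:
        return left <= target <= right
    return target >= left or target <= right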
It also does similar calculations for the device's inclination, so if you look at the sky you hopefully won't see a city marker up there, and if you point it at your feet you should in theory see cities on the opposite side of the planet. There are problems with these calculations in this toolkit, however.
The problems...
Device orientation is not perfect
The values whose calculation I've just explained assume you're holding the device in an exact position relative to the earth, i.e. perfectly landscape or portrait. Your user probably won't always be doing that. If you tilt the device slightly, your horizon line will no longer be horizontal on screen.
The earth is actually 3D!
The earth is 3-dimensional. Few of the calculations in the toolkit account for that. The calculations it performs are only really accurate when you're pointing the device towards the horizon.
For example, if you try to plot a point on the opposite side of the globe (directly under your feet), this toolkit behaves very strangely. The approach used to calculate the azimuth range on screen is only valid when looking at the horizon. If you point your camera at the floor you can actually see every single compass point, but the toolkit thinks you're still only looking at compass reading ± (width of view / 2). If you rotate on the spot you'll see your marker move to the edge of the screen, disappear, and then reappear on the other side, when what you would expect is for the marker to stay on screen as you rotate.
The solution
I've recently implemented an app with AR for which I initially hoped AR Toolkit would do the heavy lifting. I came across the problems just described, which weren't acceptable for my app, so I had to roll my own.
Thomas' approach is a good method up to point 5, which, as I explained above, only works when pointing towards the horizon. If you need to plot anything outside of that it breaks down. In my case I have to plot objects that are overhead, so it's completely unsuitable.
I addressed this by using OpenGL ES to plot my markers where they actually are in 3D space and move the OpenGL viewport around according to readings from the gyroscope while continuously re-calibrating against the compass. The 3D engine handles all the hard work of determining what's on screen.
Hope that's enough to get you started. I wish I could provide more detail, but short of posting a lot of hacky code I can't. This approach did, however, address both problems described above. I hope to open-source that part of my code at some point, but it's very rough and coupled to my problem domain at the moment.
That is all the information needed. With the iPhone's location and the destination's location you can calculate the destination angle (with respect to true north).
The only missing piece is knowing where the iPhone is currently looking, which is returned by the compass (magnetic north + current location -> true north).
Edit: Calculations (this is just an idea; there may be a better solution with fewer coordinate transformations):
1. Convert the current and destination locations to ECEF coordinates.
2. Transform the destination ECEF coordinate to ENU (east, north, up), a local coordinate system with the current location as the reference location.
3. Ignore the height value and use the ENU coordinates to get the direction: atan2(d_east, d_north).
4. The compass already returns the angle the iPhone is looking at.
5. Display the destination on the screen if dest_angle - 10° <= compass_angle <= dest_angle + 10°, with respect to the cyclic angle space. The constant of 10° is just a guessed value; you should either try some values to find a useful one, or analyse some properties of the iPhone camera.
The coordinate-transformation equations become much simpler if you assume that the earth is a sphere and not an ellipsoid. Most links I have posted assume a WGS-84 ellipsoid, because GPS does too, as far as I know.
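A rough Python sketch of steps 1-3 under that spherical-earth simplification (the radius value and the ENU rotation are my own assumptions, not from the post):

import math

R = 6371.0  # mean Earth radius in km; spherical model, not WGS-84

def to_ecef(lat, lon):
    # ECEF position on a spherical earth; lat/lon in degrees (step 1).
    lat, lon = math.radians(lat), math.radians(lon)
    return (R * math.cos(lat) * math.cos(lon),
            R * math.cos(lat) * math.sin(lon),
            R * math.sin(lat))

def destination_angle(cur_lat, cur_lon, dst_lat, dst_lon):
    # Bearing to the destination in degrees from true north.
    cur, dst = to_ecef(cur_lat, cur_lon), to_ecef(dst_lat, dst_lon)
    dx, dy, dz = (d - c for c, d in zip(cur, dst))
    lat, lon = math.radians(cur_lat), math.radians(cur_lon)
    # Rotate the ECEF delta into east/north/up at the current location (step 2).
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    # Ignore "up" and take atan2(d_east, d_north) (step 3).
    return math.degrees(math.atan2(east, north)) % 360.0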

iOS - Center position of two coordinates

I would like to create an MKCoordinateRegion (to zoom to the right region on the map) from the northeast and southwest points given by Google. For that I need to compute the coordinate of the center between these two coordinates. Any clue? I could do simple math, but I will have problems with the equator...
Thanks!!!
Assuming you mean the anti-meridian and not the equator, then here goes. (While all this works on a flattened map and should be good enough for your purpose, it's completely bung on a sphere; see the note at the bottom.)
What I've done in other cases is start at either point, and if the other point is more than 180 degrees to the east, convert it so that it is less than 180 to the west, like so:
if (pointa.lon - pointb.lon > 180)
    pointb.lon += 360;
else if (pointa.lon - pointb.lon < -180)
    pointb.lon -= 360;
At this point pointb.lon might be an invalid longitude like 190, but you can at least work out the midpoint between pointa and pointb because they are now on a continuous scale. So you might have points at 175 and 190; the midpoint between them is 182.5, and converting that back within the usual limits gives -177.5 as the longitude between the two points. Working out the latitude is easy.
Of course on a sphere this is wrong, because the midpoint between (-90, 89) and (90, 89) is (0*, 90), not (0, 89).
* = could be anything
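A small Python sketch of that flat-map recipe (my own transcription of the steps above):

def midpoint_lon(lon_a, lon_b):
    # Shift lon_b onto a continuous scale with lon_a...
    if lon_a - lon_b > 180.0:
        lon_b += 360.0
    elif lon_a - lon_b < -180.0:
        lon_b -= 360.0
    mid = (lon_a + lon_b) / 2.0
    # ...then bring the midpoint back into [-180, 180].
    if mid > 180.0:
        mid -= 360.0
    elif mid < -180.0:
        mid += 360.0
    return mid

# midpoint_lon(175.0, -170.0) returns -177.5, matching the example above.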
Also, couldn't you just use zoomToRect with the defined corners? It'd save you doing this calculation and then the next one, which would be to work out what zoom level you need when centered on that point to include the two corners you know about. Since the Maps app doesn't appear to scroll over the anti-meridian, I assume MKMapView can't either, so your rectangle is going to have the northeast coord as the top right and the southwest as the bottom left.
This SO post has the code to zoom a map view to fit all its annotations.