I have the ephemeris for an Earth-observing satellite from a TLE, so I can get the location of the satellite (lat/lon). The documentation has a nice example for the ISS showing how that is done.
However, what I am actually after is the location of the point on the Earth's surface that the satellite's sensor is pointing at. Assume the sensor points perpendicular to the flight direction, so I should be able to calculate the azimuth direction from the satellite's inclination. I also have an idea of the angle at which the sensor points down toward the ground, so together with the satellite's altitude some simple geometry should give me the offset I need.
The question now is: how do I string these two things together? The compute(observer) example returns the altitude and azimuth of the satellite as seen from a given location (lat/lon), but I could not find anything that works the other way around. Any thoughts on how to go about this? Any clues are much appreciated.
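To make the geometry concrete, here is a sketch in Python with PyEphem of what I think the chain looks like: sub-satellite point from the TLE, a cross-track bearing approximated from the inclination, the Earth-central angle from the off-nadir look angle, and a great-circle destination point. The TLE lines and the 20° look angle are placeholders, a spherical Earth is assumed, and the ground-track heading approximation only really holds near the equator on an ascending pass:

```python
import math
import ephem

R = 6371.0  # mean Earth radius, km (spherical approximation)

# Placeholder TLE (ISS); substitute your satellite's elements.
sat = ephem.readtle(
    "MYSAT",
    "1 25544U 98067A   20045.18587073  .00000950  00000-0  25302-4 0  9990",
    "2 25544  51.6443 242.0161 0004885 264.6060 207.3845 15.49165514212791",
)
sat.compute(ephem.now())

sub_lat = float(sat.sublat)        # sub-satellite latitude (radians)
sub_lon = float(sat.sublong)       # sub-satellite longitude (radians)
h = sat.elevation / 1000.0         # satellite altitude (km)

# Sensor looks perpendicular to the flight direction; near the equator on
# an ascending pass the ground-track heading is roughly 90 deg minus the
# inclination (measured clockwise from north).
incl = math.radians(51.6443)       # from line 2 of the TLE
heading = math.pi / 2 - incl       # approx. ground-track heading
bearing = heading + math.pi / 2    # look to the right; use - for the left

eta = math.radians(20.0)           # off-nadir look angle (assumed known)

# Triangle Earth-center / satellite / target, law of sines:
# sin(angle_at_target) = (R + h) / R * sin(eta); take the obtuse solution.
angle_t = math.pi - math.asin((R + h) / R * math.sin(eta))
lam = math.pi - eta - angle_t      # Earth-central angle to the target

# Great-circle destination point from the sub-satellite point.
lat2 = math.asin(math.sin(sub_lat) * math.cos(lam) +
                 math.cos(sub_lat) * math.sin(lam) * math.cos(bearing))
lon2 = sub_lon + math.atan2(
    math.sin(bearing) * math.sin(lam) * math.cos(sub_lat),
    math.cos(lam) - math.sin(sub_lat) * math.sin(lat2))

print(math.degrees(lat2), math.degrees(lon2))
```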
I am trying to build a smile detector over real-time video (front camera) by drawing a UIBezierPath in screen coordinates. I detect face landmarks with VNDetectFaceLandmarksRequest, take landmarks.outerLips, and calculate the Y offset between the upper lip points, ideally without using Core ML. The problem is that I only seem to get the normalized points for the landmark, and these points have their own coordinate system; I'm not sure how to convert each point to the screen coordinate system.
This answer from @Rickster seems to point in the right direction, but I'm not able to fully grasp the next steps:
How to convert normalized points retrieved from VNFaceLandmarkRegion2D
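From that answer, my understanding of the conversion chain, written out as a sketch in Python (the function name and parameters are mine, and it assumes the view shows the whole, un-mirrored image; Vision also has helpers like VNImagePointForNormalizedPoint and pointsInImage(imageSize:) that do part of this, and a front-camera preview is usually mirrored and aspect-filled, which needs an extra transform):

```python
def landmark_to_view(px, py, bbox, image_size, view_size):
    """Convert one normalized landmark point to view coordinates.

    px, py      -- landmark point, normalized to the face bounding box
                   (origin bottom-left, as Vision reports it)
    bbox        -- face boundingBox as (x, y, w, h), normalized to the
                   image, origin bottom-left
    image_size  -- (width, height) of the captured image in pixels
    view_size   -- (width, height) of the view in points
    """
    bx, by, bw, bh = bbox
    img_w, img_h = image_size
    view_w, view_h = view_size

    # 1. landmark -> normalized image coordinates
    nx = bx + px * bw
    ny = by + py * bh

    # 2. normalized image -> pixel coordinates (still bottom-left origin)
    ix, iy = nx * img_w, ny * img_h

    # 3. scale to the view and flip Y for UIKit's top-left origin
    vx = ix * view_w / img_w
    vy = view_h - iy * view_h / img_h
    return vx, vy
```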
I am working on an iOS AR project. I have the camera, GPS with altitude, and compass heading working, but I cannot get the right gravity vector to draw a correct horizontal plane. Please help me with my problem. What I want to do:
- Calculate the horizontal plane and draw it on the camera view (using the altitude as well would be great).
- A sample project would help me a lot.
Thank you very much.
Record some filtered accelerometer samples while not moving the device.
Compute the average of all those samples to get the down vector.
Vectors perpendicular to it make up the horizontal planes. Taking the dot product of the (normalized) down vector with a unit vector along the Z axis (0, 0, 1) gives the cosine of the angle between the screen normal and down, which tells you the angle of the screen relative to the horizon (see the accelerometer axes).
I haven't tried this, but that would be my approach... hope it helps somehow
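A minimal sketch of those steps in Python (the sample values are made up; on iOS the samples would come from CMMotionManager accelerometer updates):

```python
import math

def down_vector(samples):
    """Average stationary (x, y, z) accelerometer samples and normalize
    to get a unit vector pointing 'down' in device coordinates."""
    n = len(samples)
    ax = sum(s[0] for s in samples) / n
    ay = sum(s[1] for s in samples) / n
    az = sum(s[2] for s in samples) / n
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return (ax / norm, ay / norm, az / norm)

def screen_angle(down):
    """Angle between the screen normal (device +Z) and the down vector.

    90 deg: screen is vertical (edge-on to the horizon);
    0 / 180 deg: device is lying flat, face down / face up."""
    # dot((0, 0, 1), down) is simply the z component of the unit vector
    return math.degrees(math.acos(down[2]))

# made-up samples from a device held roughly upright in portrait
samples = [(0.01, -0.99, -0.08), (0.00, -1.01, -0.07), (0.02, -0.98, -0.09)]
d = down_vector(samples)
print(screen_angle(d))   # close to 90 deg
```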
I have data describing a rotated ellipse: the center of the ellipse in latitude/longitude coordinates, the lengths of the major and minor axes in kilometers, and the angle at which the ellipse is oriented. I do not know the location of the foci, but assume there is a way to figure them out. I would like to determine whether a specific latitude/longitude point is within this ellipse. I have found a good way to determine if a point is within an ellipse on a Cartesian grid, but don't know how to deal with latitude/longitude points.
Any help would be appreciated.
-Cody O.
The standard way of doing this on a Cartesian plane would be with a ray-casting algorithm. Since you're on a sphere, you will need to use great circle distances to accurately represent the ellipse.
EDIT: The standard ray-casting algorithm will work on your ellipse, but its accuracy depends on a) how small your ellipse is, and b) how close to the equator it is. Keep in mind, you'd have to be aware of special cases like the date line, where it goes from 179 -> 180/-180 -> -179.
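For reference, the standard even-odd ray cast sketched in Python; it assumes you have sampled the ellipse boundary into a polygon in locally flat coordinates (the great-circle caveat above is about how you generate those boundary points):

```python
def point_in_polygon(pt, poly):
    """Standard ray-casting (even-odd) point-in-polygon test.

    pt   -- (x, y)
    poly -- list of (x, y) vertices, e.g. points sampled along the
            ellipse boundary in a locally flat projection
    """
    x, y = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # does a horizontal ray from pt cross the edge (i, j)?
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside
```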
Since you already have a way to solve the problem on a Cartesian grid, I would just convert your points to UTM coordinates. The points and lengths will then all be in meters, and the check should be easy. Lots of MATLAB code is available to do this conversion from lat/lon to UTM. Like this.
You don't mention how long the axes of the ellipse are in the description. If they are very long (say hundreds of km), this approach may not work for you and you will have to resort to thinking about great circles and so on. You will have to make sure to specify the UTM zone to which you are converting. You want all your points to end up in the same UTM zone or you won't be able to relate the points.
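The linked code is MATLAB, but the same idea in Python with pyproj looks roughly like this (EPSG:32633 is a stand-in — pick the UTM zone that actually contains your ellipse, and note the assumption that your orientation angle is measured counterclockwise from east):

```python
import math
from pyproj import Transformer

# WGS84 lat/lon -> UTM zone 33N (stand-in; use the zone covering your data)
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32633", always_xy=True)

def in_ellipse_utm(lat, lon, center_lat, center_lon, a_m, b_m, angle_deg):
    """Point-in-ellipse test in a flat UTM frame.

    a_m, b_m   -- semi-major / semi-minor axes in meters
    angle_deg  -- orientation of the major axis, counterclockwise from east
                  (adjust if your angle is a compass bearing)
    """
    ex, ey = to_utm.transform(center_lon, center_lat)
    px, py = to_utm.transform(lon, lat)
    # shift to the ellipse center, then rotate into the ellipse frame
    dx, dy = px - ex, py - ey
    t = math.radians(-angle_deg)
    rx = dx * math.cos(t) - dy * math.sin(t)
    ry = dx * math.sin(t) + dy * math.cos(t)
    return (rx / a_m) ** 2 + (ry / b_m) ** 2 <= 1.0
```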
After some more research into my problem and posting in another forum, I was able to figure out a solution. My ellipse is relatively small, so I assumed it was a true (flat) ellipse. I was able to locate the lat/lon of the foci of the ellipse; then, if the sum of the distances from the point of interest to each focus is less than 2a (twice the semi-major axis), the point is within the ellipse. Thanks for the suggestions though.
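In case it helps anyone else, the flat-ellipse test I ended up with looks roughly like this in Python (function names are mine; the bearing is assumed to be the compass direction of the major axis, and the foci are placed with a flat-Earth offset, consistent with the small-ellipse assumption):

```python
import math

R = 6371.0  # mean Earth radius, km

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km; inputs in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    h = (math.sin(dlat / 2) ** 2 +
         math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

def in_ellipse(lat, lon, clat, clon, a_km, b_km, bearing_deg):
    """Sum-of-focal-distances test for a small ('flat') ellipse.

    a_km, b_km   -- semi-major / semi-minor axes
    bearing_deg  -- compass bearing of the major axis
    """
    c = math.sqrt(a_km ** 2 - b_km ** 2)   # center-to-focus distance
    t = math.radians(bearing_deg)
    # flat-Earth offsets (radians) from the center to each focus
    dlat = (c * math.cos(t)) / R
    dlon = (c * math.sin(t)) / (R * math.cos(math.radians(clat)))
    f1 = (clat + math.degrees(dlat), clon + math.degrees(dlon))
    f2 = (clat - math.degrees(dlat), clon - math.degrees(dlon))
    return (haversine_km(lat, lon, *f1) +
            haversine_km(lat, lon, *f2)) <= 2 * a_km
```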
-Cody
In my iPhone app, suppose I have the coordinates of the current location. I want to know the coordinates of a point 10 km from here at a bearing of 30° (towards the north-east), for example. How do I calculate it? Thanks.
After typing out a bunch of formulae, I realized there's a site that already has it all down, so I'm just going to link that instead: Calculate distance and bearing between two Latitude/Longitude points. The section titled "Destination point given distance and bearing from start point" is what you want. Just convert your bearing from degrees to radians and you'll be all set.
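For convenience, here is that "destination point given distance and bearing" formula transcribed into Python (a spherical Earth with R = 6371 km is assumed):

```python
import math

R = 6371.0  # mean Earth radius, km

def destination(lat_deg, lon_deg, bearing_deg, distance_km):
    """Destination point given start, bearing, and great-circle distance."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    theta = math.radians(bearing_deg)   # bearing, clockwise from north
    delta = distance_km / R             # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(delta) +
                     math.cos(lat1) * math.sin(delta) * math.cos(theta))
    lon2 = lon1 + math.atan2(
        math.sin(theta) * math.sin(delta) * math.cos(lat1),
        math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    # normalize longitude to (-180, 180]
    return math.degrees(lat2), (math.degrees(lon2) + 540) % 360 - 180

# 10 km from the start point at a bearing of 30 deg
print(destination(48.8566, 2.3522, 30.0, 10.0))
```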
I am stuck on a problem. I want to convert the CMAttitude information of an iPhone to altitude (0 to 90 deg) and azimuth (0 to 360 deg). I have googled around and hit some threads that discuss this, but none of them ends with a positive answer, and most of the articles discussing quaternions and Euler angles are too much mathematics to stuff into my brain!
Is there some open-source material that makes this task easy? Or has someone written code to perform this conversion?
Edit:
First off, sorry for being so abstract!
Azimuth is the direction on the surface of the earth towards which the device is pointing, e.g. north = 0 deg, north-east = 45 deg, east = 90 deg, south = 180 deg, and so on. It ranges from 0 deg to 360 deg.
Altitude is the angle from the plane of the earth up to an object in the sky.
Thanks,
Raj
Using CMDeviceMotion, you can get a CMAttitude object with roll, pitch and yaw, where, for example, given a phone held in portrait mode, "yaw" is your azimuth, "pitch" is the tilt of the phone with respect to the ground (your altitude, or zenith angle), and "roll" is rotation about the vector pointing through the screen, which is not what you're interested in.
Things get a bit tricky because "azimuth" is a projection of the 3D magnetic vector (pointing towards the magnetic north pole) on to the flat "ground" plane, which changes depending on device orientation, but given this understanding of the terms, threads like this one should be much more understandable. If you only need your application to work in one orientation things get much simpler.
P.S. "altitude" is almost exclusively used to refer to elevation or height about a given reference (sea level, geodetic height etc). "Zenith" or "pitch" are preferable, and since you're on iOS, you should stick to their coordinate scheme: (lat, lon, alt), (pitch, yaw, roll).