What format are ATOC master stations file "Eastings" and "Northings" in?

The Problem
I have downloaded the Master Stations Names File from the UK's Rail Delivery group.
I'm trying to map the dataset and would like to extract the locations. I tried using pyproj in my Python script, but the results are wrong.
The "Eastings" and "Northings" they provide don't seem to fit the National Grid. For example:
York station is given as "14596 64517". Interpreted as EPSG:27700 metres, these translate to (-7.4203, 50.3540), which is in the sea off Cornwall.
The documentation for the file says the following:
"Easting in units of 100m. Stations too far south (Channel Islands) or too far north (Orkneys) or too far west (west of Carrick on Shannon) have both their Easting and Northing set to 00000. The most westerly station in range, Carrick on Shannon, has value 10000. The most easterly station, Amsterdam, has value 18690."
and
"Northing in units of 100m. Stations too far south (Channel Islands) or too far north (Orkneys) or too far west (west of Carrick on Shannon) have both their Easting and Northing set to 00000. The most southerly station in range, Lizard (Bus), has value 60126. The most northerly station in range, Scrabster, has value 69703."
but that still doesn't tell me the actual format they are in.
Google didn't help - I'm not even sure what to look for.
Question
What format are these coordinates in and how can I transform them into epsg:4326?

One would think that such a seemingly custom format would be better documented but there you go.
After much experimenting and playing with the coordinates given, I came up with the formula:
realEastings = (eastingsInData - 10000) * 100
realNorthings = (northingsInData - 60000) * 100
Why anyone would use this is beyond me, but hopefully this post can be useful for somebody else.
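Putting the pieces together: a minimal sketch of the full conversion (the offset formula above, then a reprojection to EPSG:4326 with pyproj, which the question was already using; the York figures are taken from the question):

```python
from pyproj import Transformer

def atoc_to_osgb36(easting, northing):
    """Convert ATOC MSN 100 m grid units to OSGB36 (EPSG:27700) metres."""
    return (easting - 10000) * 100, (northing - 60000) * 100

# York station from the question: "14596 64517"
e, n = atoc_to_osgb36(14596, 64517)
print(e, n)  # 459600 451700

transformer = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)
lon, lat = transformer.transform(e, n)
print(lat, lon)  # roughly 53.96, -1.09 -- York
```

The `always_xy=True` flag keeps the axis order as (easting, northing) in and (lon, lat) out, which is an easy thing to get wrong with EPSG:4326.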

Related

Looking for free or paid FSA polygons of Canada for a project. Statscan data is free but the number of FSAs is too low

I downloaded polygon data from Statscan with 1643 FSA polygons (an FSA is the first 3 characters of a postal code).
TWO datasets are meant to be combined; I hope this is clear.

DATA1 - provided to me as descriptors for specific FSAs:

    FSA       DATA PROVIDED
    FSA1      text1
    ...       ...
    FSA667    text 1667

DATA2 - downloaded from Statscan as a .shp file:

    FSA       Polygon coords
    FSA1      Cell 2
    FSA1643   Cell 4
I am combining the polygons with another dataset to show layover data for each FSA. The problem is that I was provided with data covering 1667 FSAs, and I'm asked to produce a map that reflects their dataset (1667 items of layover information) combined with an equal and matching number of polygons.
Effectively there are 1667 - 1643 = 24 FSAs missing as polygons.
Does anyone know a good source for FSAs? Other than Statistics Canada I can't seem to find what I need. Paid or free... I need to see what's out there and available.
Link to statscan https://www12.statcan.gc.ca/census-recensement/2021/geo/sip-pis/boundary-limites/index2021-eng.cfm?year=21
I am using leaflet.js to show the data, but this is really a question about the datasets themselves. In summary, I seek 1667 polygons representing Canadian FSAs (forward sortation areas) and can only find 1643.
Thanks
I can successfully view and import the data using QGIS; the issue is the data itself. I seek 1667 polygon coordinates, not just the 1643 I can find online. Hopefully free... maybe paid.

pyephem - does the right ascension calculation for the sun account for the Equation of Time?

I am looking to calculate, for a particular datetime moment, the highest-precision lat/lon of the subsolar point that is reasonably possible using pyephem, with the help of other libraries if needed.
Relevant context:
Anyone who has used pyephem already knows that for certain calculations it requires certain setup values before computing body positions: the datetime (epoch of the observation), the location of the observer, and of course the body being investigated. Solutions for the subsolar point using pyephem that I have found online show UTC as the time needed for the pyephem setup.
Remembering way back to my first exposure to astronomy and to celestial navigation: UTC is a variant of a mean day, as opposed to an actual solar day, whose duration varies through the year due to several features of the Earth's orbit. Because the length of an actual solar day varies, certain astronomical calculations require the Equation of Time to map actual solar time onto a mean, fixed 24-hour day system such as UTC. Before the advent of sufficiently accurate pendulum (and now crystal-controlled) clock mechanisms, back when sundials were the accurate timepiece, the more sophisticated sundials included markings to apply a yearly approximation of this important Equation of Time, soon after it had been observed and definitively documented.
Therefore, since UTC is a variant of mean day normalized to exactly 24 hours, and not the true solar day, the question is how, or whether, pyephem incorporates the Equation of Time in its right ascension solutions for the Sun. At present I imagine the EoT is required for accuracy, as I try to visualize the Sun's position against the background of stars, as seen from the Earth as it revolves around the Sun, with historically observed variations that the Equation of Time makes available, useful, and essential.
Summary then of my question:
If it is not necessary to explicitly enter an EoT value in pyephem because it is not relevant to computing the most accurate subsolar point, please explain why. If it is relevant, as I presently think it is, please tell me whether pyephem, in its right ascension calculation of the Sun (and other bodies), does in fact apply the Equation of Time as appropriate. Does it do so transparently? Is there a way to input an explicit value for it, if one is known that might be more accurate or more up to date than what pyephem uses internally?
Some initial research results that formed the question:
Upon searching various search engines, I found several posts in topical forums that give what seems a very simple answer for finding the subsolar point. Finding the latitude appears to be the less complicated part of the solution, being simply the computed declination. Finding the longitude is where the question arose in my thinking, and now I wonder if it applies to the declination as well, since using the properly precise time is essential for the most precise result for both the declination (latitude) and longitude of the subsolar point. I always applied the EoT from the Nautical Almanac back when I was involved with celestial navigation.
Two links, specific to pyephem, present the same approach to the subsolar-point solution. When the question was first asked, Brandon Rhodes quickly presented a single-line formula using pyephem's computation of the Sun's right ascension; his was specifically the code for the longitude calculation, in a more theoretical tone, without all the pyephem contextual details. Liam Kennedy presented a more complete context of Python code showing those additional pyephem details, so that one could copy and paste the entire block of code (needing only to add the import ephem and import datetime) and modify it as appropriate, which I also found to be a useful review. The code is from these links...
Computing sub-solar point
Confusion with using dec/ra to compute sub-lunar location
subsolar point:
Brandon's code
lon = body.ra - greenwich.sidereal_time()
Liam's code
sunLon = math.degrees(sun.ra - greenwich.sidereal_time() )
Nowhere in these two posts is there any mention of the Equation of Time, and yet a variant of mean day is used as an input value here:
greenwich.date = datetime.utcnow()
UTC, as a variant of mean day, is EoT-unaware by construction, which normally makes it necessary to adjust it with the EoT for certain astronomical uses.
To further clarify this requirement: many navigation and astronomical references discuss it in considerable detail, but I will stick to referring to some forum posts, such as the following:
https://forum.cosmoquest.org/showthread.php?55871-Finding-the-subsolar-point
specifically the post by grant hutchison 2007-mar-20, 04:33 pm
You can use the NOAA Solar Position Calculator, but it's kind of convoluted.
http://www.srrb.noaa.gov/highlights/sunrise/azel.html
note: the NOAA calculator, as of this writing, 2019-12-19, does have an input box where one is to enter the Equation of Time in minutes. That page has a link to a more updated calculator.
https://www.esrl.noaa.gov/gmd/grad/solcalc/
The more up to date page also calculates and displays the Equation of Time, clarifying its relevance. Now, continuing to quote Grant's post...
First, use the calculator to derive the Equation of Time and Solar Declination for the date and time you're interested in, at the location zero latitude and zero longitude, with no UTC offset.
The 2007 March equinox is at 21 March 00:08:30 UTC. Type that time and date into the calculator and, sure enough, you find the solar declination is zero: the sun is over the equator at that moment. For any other date and time, the solar declination will convert directly to the latitude of the subsolar point.
Now we need the longitude. First, work out the true solar time using the Equation of Time figure: it's -7.42 minutes in this case. That's the offset between the position of the mean sun and the real sun. Adding that figure to our UTC time tells us that the real sun is just 1.03 minutes past midnight (8.5-7.42) at the time of interest. Divide that figure by 60*24 (to get the fraction of a day) and multiply by 360 (to get degrees): that gives us 0.2575 degrees past midnight. So the sun will be on the noon meridian at 180-0.2575 degrees east = 179.7425 E. That's our longitude.
Combine the two, and the subsolar point is 0.0000N 179.7425E.
We can check that I haven't mixed my pluses and minuses by typing the derived coordinates of the subsolar point into the solar calculator (Lat 00:00:00, Lon -179:44:33), keeping the UTC offset at zero and the date and time at your time of interest, 21 March 00:08:30. That comes up with an Azimuth of zero and an Altitude of 89.98 degrees, confirming that we have the sun crossing the meridian within a couple of hundredths of a degree of directly overhead. Phew. It works, but it's a bit of a pain. Maybe someone can offer a calculator that will do more of the work for you.
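Grant's longitude recipe is simple arithmetic, and can be sketched in a few lines of Python using his example figures (EoT of -7.42 minutes at 00:08:30 UTC); the result agrees with his 179.7425 E to within a few hundredths of a degree, the residue being rounding in the quoted intermediate values:

```python
def subsolar_lon(utc_minutes_past_midnight, eot_minutes):
    """Longitude of the subsolar point, in degrees east, from the UTC time
    and the Equation of Time (both in minutes). Follows the recipe above."""
    true_solar = utc_minutes_past_midnight + eot_minutes   # minutes past true midnight
    deg_past_midnight = true_solar / (24 * 60) * 360       # fraction of a day -> degrees
    return 180.0 - deg_past_midnight                       # sun is on the noon meridian

lon = subsolar_lon(8.5, -7.42)  # 21 March 00:08:30 UTC is 8.5 min past midnight
print(lon)  # 179.73
```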
And a followup post of his dated about an hour and a half later...
Some notes to the above, FWIW:
The difference between Dynamical Time and UTC this year is 65 seconds, so working from the Dynamical Time of the solstice we get the UTC time (to the nearest second) to be 00:07:25 UTC, which fits with G O R T's nearest-minute value, above.
The reason G O R T and I come up with a different subsolar longitude for the same time (00:07:00 UTC) is because of that pesky -7.42 minutes in the equation of time: although that time is after midnight at Greenwich, the real sun is still 42 seconds short of crossing the midnight line. That shifts the calculated subsolar point from the eastern to the western hemisphere. 7.42 minutes is equivalent to 1.855 degrees, which is exactly the difference between my calculated longitude of 179:53:42W and G O R T's of 178:15:00E.
My question is therefore based on this research and on my past experience with celestial navigation. I imagine that, as vital as the Equation of Time might be to the problem, it would be incorporated into pyephem's calculations, since a mean day is the input to pyephem's API. Seeing nowhere in these snippet solutions where an EoT value is to be specified in the pyephem API, my assumption is that it is implemented internally and transparently. I am not comfortable with this assumption, and so I have posted this question. Clarification would benefit the confidence of users, particularly newbies such as myself.
Update 12-20-2019:
I suspect the answer is yes, pyephem accounts for EoT, but it does not call it that? The way ephem, libastro, takes into account some other effect or relationship probably answers my question(s). I am reviewing:
https://rhodesmill.org/pyephem/radec
I need to read it very slowly, while drawing some pictures, and while waiting for an astronomy book so I can catch up on a very much misplaced education on this matter. Perhaps the term Equation of Time only has meaning in the narrow context of reconciling the solar day with a mean-day metric as experienced on Earth, while pyephem solves in a broader context, using more broadly applicable terminology (of which I need to be re-educated) that subsumes such effects as the Equation of Time? Or am I only displaying my ignorance? Until I can more competently write my own answer, please do contribute any helpful comments or answers that can steer my study.
I think that your question, stated more briefly, is: does the libastro library that underlies PyEphem assume that the Earth’s orbit is a circle along which the Earth travels at a uniform rate? Because if it assumes a circular orbit and uniform rate for the Earth, then a correction ­— the Equation of Time — would need to appear for the fact that the Earth in fact varies its speed along its orbit.
I suggest that you can answer this question for yourself experimentally. If PyEphem assumes uniform circular motion for the Earth, then the number of degrees traveled by the Sun each day will be the same. Try looping over a long series of days. For the same time each day, ask the Sun for its right ascension and declination, and then use separation() to check the angle traveled between those points.
If the angle traveled by the Sun is the same each day, then PyEphem is modeling the Sun’s motion very poorly and you will need to apply an Equation of Time correction to get its true position.
But if the daily angle is varying — small in July, large in January — then PyEphem must be modeling the Earth's motion more accurately. If you dig into the source code, you will find that it uses the VSOP87 model to predict where the Earth and Sun are. Your own experiments should show how the model behaves as the Sun travels the sky through the year.

Different results using JCoord and GeoTools

I have been trying to convert easting and northing values to lat/lon using JCoord and GeoTools. The problem is that I am getting different results from each library for the same easting and northing.
The code I am using is the code provided in the main answer and GeoTools answer provided in this question.
convert latitude and longitude to northing and easting in java?
The easting I am using is : 393339
The Northing I am using is : 806179
The coordinates Jcoord is providing are (57.14645296506957, -2.111698674790966)
The coordinates GeoTools is providing are [57.146449494619105, 2.111714868502565]
They seem to diverge around the 4th decimal place, and I'm wondering which one is right.
Thanks
Assuming that these are OS eastings and northings (which seems likely based on your lat/lon values), they are accurate to 1 m (as six-digit grid references). Based on the values given by this calculator, a degree of latitude is around 100 km, so the 4th decimal place is roughly 10 m, or about the accuracy you can expect.
To get more precision out of the calculation, you need to check the ToWGS84 parameters being used in each calculation; for GeoTools you can query the projection to find this value, and I expect JCoord has a similar operation.
Note that in GeoTools the ToWGS84 parameter may vary depending on which referencing factory you are using; I believe gt-epsg-hsql is more accurate than gt-epsg-wkt.
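A third opinion is easy to get from pyproj (a sketch, assuming pyproj is installed). Note that the longitude should come out negative, since these coordinates are west of Greenwich (around Aberdeen), so the positive sign in the GeoTools output above is worth double-checking:

```python
from pyproj import Transformer

# OSGB36 / British National Grid -> WGS84
transformer = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)
lon, lat = transformer.transform(393339, 806179)
print(lat, lon)  # roughly 57.1464, -2.1117, matching JCoord to ~4 decimal places
```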

How to generate a geographical heat map in MATLAB? (worldwide, country-level granularity)

I want to create a geographical heat map like the following in MATLAB:
Each color is based on a list of countries, with a percentage associated with each of them:
Country with Codes: % of Hits
United States (US): 36.29%
India (IN): 18.24%
United Kingdom (GB): 12.93%
Spain(ES): 8.22%
Australia (AU): 3.32%
Canada (CA): 3.05%
Germany (DE): 2.49%
Netherlands (NL): 1.66%
Israel (IL): 1.39%
China (CN): 0.83%
How can I do it in MATLAB?
You should refer to some geoscience toolboxes, like this one.
Also, this kind of visualization is trivial in D3.js, so you could output the data to JSON or similar and plot it with D3.js.
Maybe this can help? Figure 1 shows some results similar to your needs:
http://www4.fe.uc.pt/spatial/doc/lecture2.pdf
Please note that when I use the Google API, I save the results as a very high resolution image, which is good enough for scientific publication.

How do you figure out what the neighboring zipcodes are?

I have a situation that's similar to what goes on in a job search engine where you type in the zipcode where you're searching for a job and the app returns jobs in that zipcode as well as in zipcodes that are 5, 10, 15, 20 or 25 miles from that zipcode, depending on preferences set by the user.
How would you calculate the neighboring locations for a zipcode?
You need to get a list of zip codes with associated longitude / latitude coordinates. Google it - there are plenty of providers.
Then take a look at this question for an algorithm for calculating the distance.
I don't know if you can count on geonames.org to be around for the life of your app but you could use a web service like theirs to avoid reinventing the wheel.
http://www.geonames.org/export/web-services.html
I wouldn't calculate it; I would store it as a fixed table in the database (to change only when the allocation of ZIP codes in a country changes). Make a relationship "is_neighbor_zip" that holds pairs (smaller, larger). To determine whether two codes are neighboring, check the table for the specific pair. If you want all neighboring ZIPs, it might be better to make the table symmetric.
You need to use a GIS database and ask it for ZIP codes that are nearby your current location.
You cannot simply take the ZIP code number and apply some mathematical calculations to find other nearby ZIP codes. ZIP codes are not as geographically scattered as area codes in the US, but they are not a coordinate system.
The only exception is that the ZIP+4 codes are sub-sections of the larger ZIP code. You can assume that any ZIP+4 codes that have the same ZIP code are close to each other.
I used to work on rationalizing the ZIP code handling at a company, here are some practical notes I made:
Testing ZIP codes
Hopefully has other useful info.
Whenever you create a zipcode, geocode it (e.g. with the Google Geocoder API, saving the latitude and longitude); then look up the haversine formula, which calculates the distance (as the crow flies) from a reference point, which could itself be geocoded if it is a town or zipcode.
To clarify some more:
When you are retrieving records based on their location, you need to compare each longitude and latitude DECIMAL with a reference point (your user's geocoded postcode or town name).
You can query:
SELECT * FROM photos p WHERE p.lat BETWEEN 50 AND 60 AND p.long BETWEEN -10 AND 2
to find all UK photos etc., because the UK lies roughly between 50 and 60 degrees latitude and between -10 and +2 degrees longitude.
If you want to find the distance, you will need to look up the haversine formula and plug in your reference values.
Hope this clears things up a little bit more; leave a comment if you need details.
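The haversine formula the answers keep telling you to google is short enough to write out. A minimal sketch (the London and Paris coordinates are approximate, for illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

d = haversine_km(51.5074, -0.1278, 48.8566, 2.3522)  # London -> Paris
print(round(d))  # roughly 343 km
```

For the "jobs within N miles" search, you would typically pre-filter candidates with a cheap bounding-box query in SQL (as sketched above) and then apply haversine to the survivors.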