Grib2 to PostGIS raster -- anyone get this to work? - postgresql

I have an application for which I need to import U.S. National Weather Service surface analyses, which are distributed as grib2 files. I want to pull those into PostGIS 2.0 rasters, do some calculations and modeling, and display the data and model results in GeoServer.
Since grib2 is a GDAL-supported format, the supplied raster2pgsql utility should be able to slurp a grib2 right into PostGIS-compatible SQL, and once it's there, GeoServer ought to be able to handle it. However, I'm running into problems which have no obvious solutions -- not obvious to me, at any rate! Raster2pgsql runs, apparently without errors, producing SQL, and running the SQL creates what looks very much like a raster. But GeoServer can't display it -- the bounds, in particular, come out looking weird (0,0 -1,-1) and "preview layer" just throws a NullPointerException.
Has anyone been down this road already? I've got issues as basic as not knowing what the SRID should be for the data (4326, perhaps?). I don't expect anyone to debug my problems for me but if someone has already got this toolchain working, or part of it, I can plug known-good things in and see what I can discover.
TIA,
rw
Updated: Per Mike, here is the coordinate-system stuff from one of the files; I elided the other 749 bands in the output from "gdalinfo". Note that the filename is different -- by running "gdalinfo" on my original file I found out that something was wrong with it; gdalinfo couldn't read it. New (35MB!) file here.
Gdalinfo output:
Driver: GRIB/GRIdded Binary (.grb)
Files: ruc2.t00z.bgrb13anl.grib2
Size is 451, 337
Coordinate System is:
PROJCS["unnamed",
GEOGCS["Coordinate System imported from GRIB file",
DATUM["unknown",
SPHEROID["Sphere",6371229,0]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433]],
PROJECTION["Lambert_Conformal_Conic_2SP"],
PARAMETER["standard_parallel_1",25],
PARAMETER["standard_parallel_2",25],
PARAMETER["latitude_of_origin",0],
PARAMETER["central_meridian",265],
PARAMETER["false_easting",0],
PARAMETER["false_northing",0]]
Origin = (-3332155.288903323933482,6830293.833488883450627)
Pixel Size = (13545.000000000000000,-13545.000000000000000)
Corner Coordinates:
Upper Left (-3332155.289, 6830293.833) (139d51'22.04"W, 54d10'20.71"N)
Lower Left (-3332155.289, 2265628.833) (126d 6'34.06"W, 16d 9'49.48"N)
Upper Right ( 2776639.711, 6830293.833) ( 57d12'21.76"W, 55d27'10.73"N)
Lower Right ( 2776639.711, 2265628.833) ( 68d56'16.73"W, 17d11'55.33"N)
Center ( -277757.789, 4547961.333) ( 98d 8'30.73"W, 39d54'5.40"N)
Band 1 Block=451x1 Type=Float64, ColorInterp=Undefined
Description = 1[-] HYBL="Hybrid level"
Metadata:
GRIB_UNIT=[Pa]
GRIB_COMMENT=Pressure [Pa]
GRIB_ELEMENT=PRES
[Etc., Etc., for all 750 bands]

I hope this helps, at least for those coming to this thread.
Bear in mind that while GeoServer is capable of loading raster data from PostGIS, the default PostGIS store is ONLY available for vector data; that's why you get those odd bounds (-1 -1 0 0).
You'll have to add the ImageMosaicJDBC plugin to your GeoServer installation; follow the steps here:
http://docs.geoserver.org/latest/en/user/tutorials/imagemosaic-jdbc/imagemosaic-jdbc_tutorial.html

Got an excellent answer to my problem here. Putting it in as a separate answer.
He recommended using gdalwarp to pull the GRIB2 file into a known SRID, thus:
gdalwarp -t_srs EPSG:4326 original_file.grib2 4326_file.grib2
Then raster2pgsql works just fine, e.g.:
raster2pgsql -M -a 4326_file.grib2 some_sql.sql
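For completeness, one way to do the load itself (database and table names below are placeholders, not from the thread) is to pipe raster2pgsql's SQL straight into psql:
# -s sets the SRID, -I builds a GiST spatial index, -M vacuums/analyzes after the load
raster2pgsql -s 4326 -I -M 4326_file.grib2 public.sfc_analysis | psql -d weather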

Related

Is it possible to merge raster bands from several folders using GDAL?

I have two folders, each containing about 15 000 .tif files. Each file in the first folder is a raster with 5 bands, named AA_"number", so it looks like
AA_1.tif,
AA_2.tif,
...,
AA_15000.tif.
Each file in the second folder is a raster with 2 bands, named BB_"number", and looks like
BB_1.tif,
BB_2.tif,
...,
BB_15000.tif.
My goal is to combine bands 1-3 from the first file in folder AA with band 1 from the first file in folder BB to create a 4-band raster, and to repeat this for every pair, producing 15 000 4-band rasters. After doing some research and testing things out in QGIS, I believe the Merge tool from GDAL could solve this task, but I have not been able to make it find the right files in the different folders. And as I have 2x 15 000 files, it is not possible to do this selection manually. Does anyone know a smart solution to this, preferably using GDAL or QGIS?
There are many ways to do this, and it really depends on the exact use case, like the type of analysis/visualization that needs to be done on the result.
With this many files, it could for example be nice to merge them using a VRT. That avoids creating redundant data, though whether it's actually the best solution depends on that use case. Just stacking them into a new TIFF file would of course also work.
Unfortunately, creating such a VRT using gdalbuildvrt / gdal.BuildVRT is not possible with multi-band inputs.
If your inputs are homogeneous in terms of properties, it should be fairly simple to set up a template where you fill in the file locations and write the VRT to disk (see the sketch after the example below). For inputs with heterogeneous properties it might still be possible, but you'll have to be careful to take everything into account.
Conceptually such a VRT would look something like:
<VRTDataset rasterXSize="..." rasterYSize="...">
  <SRS>...</SRS>
  <GeoTransform>....</GeoTransform>
  <VRTRasterBand dataType="..." band="1">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/aa_folder/aa_file1.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
  <VRTRasterBand dataType="..." band="2">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/aa_folder/aa_file1.tif</SourceFilename>
      <SourceBand>2</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
  <VRTRasterBand dataType="..." band="3">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/aa_folder/aa_file1.tif</SourceFilename>
      <SourceBand>3</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
  <VRTRasterBand dataType="..." band="4">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/bb_folder/bb_file1.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>
You can first use gdalbuildvrt on some of your files to find all the properties that need to be filled in, like projection, pixel dimensions etc. That will work, but gdalbuildvrt will only be able to take the first band from the inputs. If all bands have homogeneous properties (like nodata value etc), that should be fine as a reference.
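Here is a minimal Python sketch of that templating approach. The folder layout, file names, and output names are the hypothetical ones from the question, and it assumes all inputs share the same size, projection, geotransform, and data type, which you should verify first:
from osgeo import gdal

N_PAIRS = 15000  # number of AA_*/BB_* file pairs

BAND = """  <VRTRasterBand dataType="{dtype}" band="{band}">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">{src}</SourceFilename>
      <SourceBand>{src_band}</SourceBand>
    </ComplexSource>
  </VRTRasterBand>
"""

# Read the shared properties once from a reference file
# (assumes AA and BB rasters share the same data type).
ref = gdal.Open("aa_folder/AA_1.tif")
xsize, ysize = ref.RasterXSize, ref.RasterYSize
srs_wkt = ref.GetProjection()
geotransform = ", ".join(str(v) for v in ref.GetGeoTransform())
dtype = gdal.GetDataTypeName(ref.GetRasterBand(1).DataType)
ref = None  # close the dataset

for i in range(1, N_PAIRS + 1):
    aa, bb = f"aa_folder/AA_{i}.tif", f"bb_folder/BB_{i}.tif"
    # Output bands 1-3 come from AA, output band 4 from band 1 of BB.
    sources = [(aa, 1), (aa, 2), (aa, 3), (bb, 1)]
    bands = "".join(
        BAND.format(dtype=dtype, band=n, src=src, src_band=sb)
        for n, (src, sb) in enumerate(sources, start=1))
    with open(f"merged_{i}.vrt", "w") as f:
        f.write(f'<VRTDataset rasterXSize="{xsize}" rasterYSize="{ysize}">\n'
                f'  <SRS>{srs_wkt}</SRS>\n'
                f'  <GeoTransform>{geotransform}</GeoTransform>\n'
                f'{bands}</VRTDataset>\n')
Each merged_i.vrt can then be used like any other raster, or materialized with gdal_translate if you really need 15 000 physical 4-band TIFFs.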

osmnx boundaries and admin_level

I hope someone here can help me retrieve the correct administrative level(s) from OSM. I am using the following code, but admin_level seems to be ignored:
tags = {"boundary":"administrative","admin_level":"4" }
gdf = ox.geometries.geometries_from_bbox(51.5, 51.0, 11.7, 11.2, tags)
gdf.shape
The bounding box seems to be used as a polygon to intersect with all the boundaries in the OSM database. The first tag is working, because only administrative boundaries are returned, but the filter on level is ignored (gdf["admin_level"].head() shows level 6).
I would like to understand what I am doing wrong, and how I can use this package better; it seems like a very useful library.
Thanks,
Gijs
Result using the bounding box:
OK, rereading the documentation made me realize that osmnx uses [OR] statements and not [AND] statements as I was presuming; removing the boundary tag from the query indeed gives only admin_level=4 results.
tags (dict) – Dict of tags used for finding objects in the selected area. Results returned are the union, not the intersection of each individual tag.
Some additional code: https://i.stack.imgur.com/Fw840.png
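For reference, here is a minimal sketch of that workaround, using the same bounding box and osmnx 1.x geometries call as above; the follow-up pandas filter is an assumption for when you still want to AND-combine tags:
import osmnx as ox

# Tags in the dict are OR'ed, so request only admin_level here.
tags = {"admin_level": "4"}
gdf = ox.geometries.geometries_from_bbox(51.5, 51.0, 11.7, 11.2, tags)

# To AND-combine tags, filter the resulting GeoDataFrame instead,
# e.g. keep only rows that are also tagged boundary=administrative
# (assumes the "boundary" column is present in the result):
gdf = gdf[gdf["boundary"] == "administrative"]
print(gdf.shape)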

Why does fastText's test of a model return only 1 example when my test file contains 135?

I'm trying to test the model (model.bin) I've made with fastText on a test file (test.txt). This test file contains 135 labelled examples. I expect fastText to test my model on that many examples, but instead it tests it on only 1. Where does this problem come from?
I've already done the same thing with another model and another test file, and everything worked nicely.
This is how I test my model; model_baby.bin is the model and test.data.txt is my test file.
./fasttext test model_baby.bin test.data.txt
N 1
P@1 1
R@1 0.0164
Number of examples: 1
And here is an extract from my test file:
__label__4.0 I love the fact you can hide your stuff. Only down is that the straps to hold it at midpoint and bottom could be better designed for your car. It's got plenty of room which is great. __label__5.0 This hid our ipad wonderfully. Especially for those quick stops where we all had jump out and use the restroom. It zipped, folded and held all our stuff for the kids in the back seat. __label__3.0
As I have more than 1 labelled example in my test file, I expect the output "Number of examples:" to be more than 1, but the actual value is 1.
From the official documentation (https://fasttext.cc/docs/en/supervised-tutorial.html): Each line of the text file contains a list of labels, followed by the corresponding document. All the labels start with the __label__ prefix, which is how fastText recognizes what is a label and what is a word.
Your extract is hard to make sense of; I think it should look like this:
__label__4.0 I love the fact you can hide your stuff. Only down is that the straps to hold it at midpoint and bottom could be better designed for your car. It's got plenty of room which is great.
__label__5.0 This hid our ipad wonderfully. Especially for those quick stops where we all had jump out and use the restroom. It zipped, folded and held all our stuff for the kids in the back seat.
__label__3.0 ...
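If your test file really does contain all the examples on one line, a small sketch like this can split it (file names as in the question; the regex assumes each example starts with a single __label__ tag and that __label__ never occurs inside the review text itself):
import re

# Read the one-line test file.
with open("test.data.txt") as f:
    text = f.read()

# Start a new line before every __label__ marker except the first.
fixed = re.sub(r"\s+(?=__label__)", "\n", text).strip()

with open("test.data.fixed.txt", "w") as f:
    f.write(fixed + "\n")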

How to read info on voltage/beam energy, imaging mode, acquisition date/timestamp, etc. from image meta-data? (Tags)

DM scripting beginner here, almost no programming skills.
I would like to know the commands to access all the metadata of DM images/spectra.
I realized that all my STEM images at 80 kV taken between two dates (let's say 02.11.2017-05.04.2019) have the scale calibration wrong by the same factor (the scale of all such images needs to be multiplied by 1.21).
I would like to write a script which multiplies the scale value by that factor, but only for images taken in scanning mode at 80 kV during that period, applied either to all images in a folder (including subfolders) or to all images open in DM, and which then saves the new scale value.
I checked this website http://digitalmicrograph-scripting.tavernmaker.de/other%20resources/Old-DMHelp/AllFunctions.html but only found how to read the scale value (ImageGetDimensionCalibration). I have a general idea of how to write the script based on other scripts, once I find out how to access the metadata.
If anyone can write the whole script for me I would greatly appreciate your effort.
All general meta-data is organized in the image tag-structure
You can see this if you open the Image Display Info of an image (via the menu, or by pressing CTRL + D) and then browse to the "Tags" section:
All the info on the right consists of image tags, and they are organized in a hierarchical tree.
What this tree looks like, and what information is written where, is totally open and will depend on which GMS version you are using, how the hardware is configured, etc. Custom scripts might also alter this information.
So for a scripting start, open the data you want to modify and have a look in this tree.
Hint: The following mini-script can be useful. It opens a tag-browsing window for the front-most image, but as a modeless dialog (i.e. you can keep it open and interact with other parts):
GetFrontImage().ImageGetTagGroup().TagGroupOpenBrowserWindow(0)
The information you need to check against is most probably found in the Microscope Info sub-tree. Usually, all information gathered from the microscope during acquisition is stored here. What is there will depend on your system and how it is set up.
The information on the STEM image acquisition - as far as the scanning engine and detector are concerned - is most probably in the DigiScan sub-tree.
The Data Bar sub-tree usually contains date and time of creation etc.
Calibration values are not stored in the image tag-structure
What you will not find in this tag-structure is the image calibration, i.e. the values actually used by DM to display calibrated values. These values are "one level up" so to speak here:
This is important to know in the following for your script, because you will need different commands for both the "meta-data" from the tags, and the "calibration" you want to change.
Accessing meta-data by script
The script-commands you need to read from the tags are all described in the F1 help documentation here:
Essentially, you need a command to get the "root" TagGroup of an image, which is ImageGetTagGroup() and then you traverse within this tree.
This might seem confusing - because there are a lot of slightly different commands for the different types of stored tags - but the essential bits are easy:
All "Paths" through the tree are just the individual names (typed exactly)
For each "branch" you have to use a single colon :
The commands to set/get a tag-value all require as input the "root" tagGroup object and the "path" as a string. The get commands require a variable of matching type to store the value in, the set commands need the value which should be written.
The get commands themselves return true or false depending on whether or not the tag-path could be found and the value could be read.
So the following script would read the "Imaging Mode" from the tags of the image shown in the example above:
string mode
GetFrontImage().ImageGetTagGroup().TagGroupGetTagAsString( "Microscope Info:Imaging Mode", mode )
OKDialog( "Mode: " + mode )
and in a little more verbose form:
string mode // variable to hold the value
image img // variable for the image
string path // variable/constant to specify the where
TagGroup tg // variable to hold the "tagGroup" object
img := GetFrontImage() // Use the selected image
tg = img.ImageGetTagGroup() // From the image get the tags (root)
path = "Microscope Info:Imaging Mode" // specify the path
if ( tg.TagGroupGetTagAsString( path, mode ) )
OKDialog( "Mode: " + mode )
else
Throw( "Tag not found" )
If the tag is not a string but a number, you will need the corresponding commands, e.g. TagGroupGetTagAsNumber().
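For example, here is a minimal sketch reading the high tension as a number; the tag path "Microscope Info:Voltage" is an assumption, so verify the exact path in your own tag tree with the browser window shown above:
number voltage // variable to hold the numeric value
image img := GetFrontImage() // use the selected image
// "Microscope Info:Voltage" is an assumed path - check your tag tree
if ( img.ImageGetTagGroup().TagGroupGetTagAsNumber( "Microscope Info:Voltage", voltage ) )
OKDialog( "Voltage: " + voltage )
else
Throw( "Voltage tag not found" )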

postgis shape file import problems

Hi, I'm trying to import a shape file from
http://www.nyc.gov/html/dcp/html/bytes/bytesarchive.shtml
into a PostGIS database. The above file creates MULTIPOLYGONs when I import it using shp2pgsql.
Then I'm trying to simply determine whether lat/long points are contained in my multipolygons.
However, my selects are not working, and when I print out the points of my the_geom column they seem to be very broken.
select st_astext(geom) from (select (st_dumppoints(the_geom)).* from nybb where borocode =1) foo;
gives the result...
st_astext
------------------------------------------
POINT(1007193.83859999 257820.786899999)
POINT(1007209.40620001 257829.435100004)
POINT(1007244.8654 257833.326199993)
POINT(1007283.3496 257839.812399998)
POINT(1007299.3502 257851.488900006)
POINT(1007320.1081 257869.218500003)
POINT(1007356.64669999 257891.055800006)
POINT(1007385.6197 257901.432999998)
POINT(1007421.94509999 257894.084000006)
POINT(1007516.85959999 257890.406100005)
POINT(1007582.59110001 257884.7861)
POINT(1007639.02150001 257877.217199996)
POINT(1007701.29170001 257872.893099993)
...
For points in NYC, this is very off. What am I doing wrong?
The points are not off. The spatial data referred to is NOT in lat/long; this is why the numbers differ from what you expect. If you need the data in long/lat, it must be reprojected. See more here: http://postgis.refractions.net/news/20020108/
The projection of the data seems to be the NAD_1983_StatePlane_New_York_Long_Island_FIPS_3104_Feet coordinate system (according to the metadata -- see the excerpt below).
<spref>
<horizsys>
<planar>
<planci>
<plance Sync="TRUE">coordinate pair</plance>
<coordrep>
<absres Sync="TRUE">0.000000</absres>
<ordres Sync="TRUE">0.000000</ordres>
</coordrep>
<plandu Sync="TRUE">survey feet</plandu>
</planci>
<mapproj>
<mapprojn Sync="TRUE">Lambert Conformal Conic</mapprojn>
<lambertc>
<stdparll Sync="TRUE">40.666667</stdparll>
<stdparll Sync="TRUE">41.033333</stdparll>
<longcm Sync="TRUE">-74.000000</longcm>
<latprjo Sync="TRUE">40.166667</latprjo>
<feast Sync="TRUE">984250.000000</feast>
<fnorth Sync="TRUE">0.000000</fnorth>
</lambertc>
</mapproj>
</planar>
<geodetic>
<horizdn Sync="TRUE">North American Datum of 1983</horizdn>
<ellips Sync="TRUE">Geodetic Reference System 80</ellips>
<semiaxis Sync="TRUE">6378137.000000</semiaxis>
<denflat Sync="TRUE">298.257222</denflat>
</geodetic>
<cordsysn>
<geogcsn Sync="TRUE">GCS_North_American_1983</geogcsn>
<projcsn Sync="TRUE">NAD_1983_StatePlane_New_York_Long_Island_FIPS_3104_Feet</projcsn>
</cordsysn>
</horizsys>
</spref>
If you work much with spatial data, I suggest you read more about map projections.
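That projection corresponds to EPSG:2263 (NAD83 / New York Long Island, ftUS); matching that code to the metadata is an assumption you should verify. With it, a minimal PostGIS sketch (table and column names as in the question) would be:
-- If the data was imported without -s, stamp the correct SRID first:
SELECT UpdateGeometrySRID('nybb', 'the_geom', 2263);

-- Test a lat/long point by transforming it into the data's projection
-- (transforming the point rather than the polygons keeps a spatial index usable):
SELECT ST_Contains(
         the_geom,
         ST_Transform(ST_SetSRID(ST_MakePoint(-73.97, 40.78), 4326), 2263))
FROM nybb
WHERE borocode = 1;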
I think this is not an issue with PostGIS. I checked the input ESRI shape file nybb.shp with the AvisMap Free Viewer, and as you can see, the points are weird in the file itself:
However, there is something interesting in the nybb.shp.xml metadata file:
<spdom>
<bounding>
<westbc Sync="TRUE">-74.257465</westbc>
<eastbc Sync="TRUE">-73.699450</eastbc>
<northbc Sync="TRUE">40.915808</northbc>
<southbc Sync="TRUE">40.495805</southbc>
</bounding>
<lboundng>
<leftbc Sync="TRUE">913090.770096</leftbc>
<rightbc Sync="TRUE">1067317.219904</rightbc>
<bottombc Sync="TRUE">120053.526313</bottombc>
<topbc Sync="TRUE">272932.050103</topbc>
</lboundng>
</spdom>
I am not familiar with that toolkit (ESRI ArcCatalog), but most probably you need to rescale your points after import using that metadata.