PostGIS shapefile import problems - PostgreSQL

Hi, I'm trying to import a shapefile from
http://www.nyc.gov/html/dcp/html/bytes/bytesarchive.shtml
into a PostGIS database. The above file creates MULTIPOLYGONs when I import it using shp2pgsql.
Then I'm trying to simply determine whether lat/long points are contained in my multipolygons.
However, my SELECTs are not working, and when I print out the points of my the_geom column the values look very broken.
select st_astext(geom) from (select (st_dumppoints(the_geom)).* from nybb where borocode =1) foo;
gives the result...
st_astext
------------------------------------------
POINT(1007193.83859999 257820.786899999)
POINT(1007209.40620001 257829.435100004)
POINT(1007244.8654 257833.326199993)
POINT(1007283.3496 257839.812399998)
POINT(1007299.3502 257851.488900006)
POINT(1007320.1081 257869.218500003)
POINT(1007356.64669999 257891.055800006)
POINT(1007385.6197 257901.432999998)
POINT(1007421.94509999 257894.084000006)
POINT(1007516.85959999 257890.406100005)
POINT(1007582.59110001 257884.7861)
POINT(1007639.02150001 257877.217199996)
POINT(1007701.29170001 257872.893099993)
...
For points in NYC, this is very far off. What am I doing wrong?

The points are not off. The spatial data you are referring to is NOT in lat/long; that is why the numbers are different from what you expect. If you need it in long/lat it must be reprojected. See more here: http://postgis.refractions.net/news/20020108/
The data seems to be in the NAD_1983_StatePlane_New_York_Long_Island_FIPS_3104_Feet coordinate system (according to the metadata; see the excerpt below).
<spref>
<horizsys>
<planar>
<planci>
<plance Sync="TRUE">coordinate pair</plance>
<coordrep>
<absres Sync="TRUE">0.000000</absres>
<ordres Sync="TRUE">0.000000</ordres>
</coordrep>
<plandu Sync="TRUE">survey feet</plandu>
</planci>
<mapproj>
<mapprojn Sync="TRUE">Lambert Conformal Conic</mapprojn>
<lambertc>
<stdparll Sync="TRUE">40.666667</stdparll>
<stdparll Sync="TRUE">41.033333</stdparll>
<longcm Sync="TRUE">-74.000000</longcm>
<latprjo Sync="TRUE">40.166667</latprjo>
<feast Sync="TRUE">984250.000000</feast>
<fnorth Sync="TRUE">0.000000</fnorth>
</lambertc>
</mapproj>
</planar>
<geodetic>
<horizdn Sync="TRUE">North American Datum of 1983</horizdn>
<ellips Sync="TRUE">Geodetic Reference System 80</ellips>
<semiaxis Sync="TRUE">6378137.000000</semiaxis>
<denflat Sync="TRUE">298.257222</denflat>
</geodetic>
<cordsysn>
<geogcsn Sync="TRUE">GCS_North_American_1983</geogcsn>
<projcsn Sync="TRUE">NAD_1983_StatePlane_New_York_Long_Island_FIPS_3104_Feet</projcsn>
</cordsysn>
</horizsys>
</spref>
If you work much with spatial data, I suggest you read more about map projections.
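As a concrete sketch (assuming the shapefile was loaded with its native SRID, which for NAD83 / New York Long Island ftUS is EPSG:2263, e.g. via shp2pgsql -s 2263), you can either reproject the geometry to long/lat or transform your test point into the data's projection; the point below is just an arbitrary Manhattan-area coordinate for illustration:
-- reproject the borough geometry to WGS84 long/lat
SELECT ST_AsText(ST_Transform(the_geom, 4326)) FROM nybb WHERE borocode = 1;
-- or transform a long/lat test point into the data's projection for the containment check
SELECT ST_Contains(the_geom, ST_Transform(ST_SetSRID(ST_MakePoint(-73.97, 40.78), 4326), 2263))
FROM nybb WHERE borocode = 1;
If the table was loaded without an SRID, set it first (or reload with shp2pgsql -s 2263), since ST_Transform needs to know the source projection.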

I don't think this is an issue with PostGIS. I checked the input ESRI shapefile nybb.shp with the AvisMap Free Viewer and, as you can see, the points look just as odd there:
However, there is something interesting in the nybb.shp.xml metadata file:
<spdom>
<bounding>
<westbc Sync="TRUE">-74.257465</westbc>
<eastbc Sync="TRUE">-73.699450</eastbc>
<northbc Sync="TRUE">40.915808</northbc>
<southbc Sync="TRUE">40.495805</southbc>
</bounding>
<lboundng>
<leftbc Sync="TRUE">913090.770096</leftbc>
<rightbc Sync="TRUE">1067317.219904</rightbc>
<bottombc Sync="TRUE">120053.526313</bottombc>
<topbc Sync="TRUE">272932.050103</topbc>
</lboundng>
</spdom>
I am not familiar with that toolkit (ESRI ArcCatalog), but most probably you need to rescale your points after import using that metadata.
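As a quick sanity check (a sketch reusing the table and column names from the question), you can compare the imported extent against the lboundng values above; if they match, the import itself is fine and only the projection differs from lat/long:
SELECT ST_Extent(the_geom) FROM nybb;
-- expected roughly: BOX(913090.770096 120053.526313,1067317.219904 272932.050103)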

Related

osmnx boundaries and admin_level

I hope someone here can help me to retrieve the correct administration level(s) from OSM. I am using the following code, but admin_level seems to be ignored:
tags = {"boundary":"administrative","admin_level":"4" }
gdf = ox.geometries.geometries_from_bbox(51.5, 51.0, 11.7, 11.2, tags)
gdf.shape
The bounding box seems to be used as a polygon to intersect with all the boundaries in the OSM database. The first tag is working, because only administrative boundaries are returned, but the filter on the level is ignored (gdf["admin_level"].head() shows level 6).
I would like to understand what I am doing wrong, and how I can use this package better; it seems like a very useful library.
Thanks,
Gijs
Result using the bounding box:
OK, rereading the documentation made me realize that osmnx combines tags with [OR], not [AND] as I was presuming; removing the boundary tag from the query indeed gives only admin_level=4 results.
tags (dict) – Dict of tags used for finding objects in the selected area. Results returned are the union, not the intersection of each individual tag.
Some additional code: https://i.stack.imgur.com/Fw840.png

How can I get a list of bridges with their location (latitude and longitude) from an OSM file?

Maybe this query is a bit trivial, or perhaps laborious, but for a project I need to obtain the bridges that exist in an OSM file together with their locations (latitude and longitude).
Reading the OpenStreetMap wiki, I see that there is a procedure using Osmosis, but I do not know whether I will actually get the information in the following form:
Name of the bridge | latitude | longitude
bin/osmosis.bat --rx brandenburg.osm.bz2 --bp file="city.poly" --tf accept-ways highway=motorway_link,motorway --way-key-value keyValueList="bridge.yes" --used-node --write-xml brdg_autob.osm
Thanks in advance
Pablo
The output will be OSM XML and not plaintext.
Also, most bridges in OSM are mapped as ways. A way consists of multiple lat/lons represented as nodes. If you need a single lat,lon pair then you have to calculate the bridge center yourself.
Additionally, not all bridges are tagged as bridge=yes. See bridge in the OSM wiki for a list of commonly used tags, such as bridge=viaduct, bridge=aqueduct, bridge=boardwalk and so on.
You won't get exactly the format you described. However, with a little work you can transform the OSM XML into your format; see the sketch below for one way to get a single point per bridge.
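For example, if you load the filtered extract into PostGIS with osm2pgsql (not part of your Osmosis workflow, just one option), the default schema keeps ways as lines in planet_osm_line with geometries in EPSG:3857; a midpoint per bridge could then be computed roughly like this (table and column names are osm2pgsql defaults, so treat them as assumptions):
-- one point per bridge way: the midpoint measured along the line, reprojected to lat/lon
-- (ST_LineInterpolatePoint is the PostGIS 2.1+ name; older versions call it ST_Line_Interpolate_Point)
SELECT name,
       ST_Y(ST_Transform(ST_LineInterpolatePoint(way, 0.5), 4326)) AS latitude,
       ST_X(ST_Transform(ST_LineInterpolatePoint(way, 0.5), 4326)) AS longitude
FROM planet_osm_line
WHERE bridge IS NOT NULL AND bridge <> 'no';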

How can I search for elements within a polygon with Overpass?

I am new to Overpass API and GIS in general.
Is there an easy way to export all buildings in a specific region using coordinates to specify the polygon? I couldn't find a solution using the wiki and google so far.
I have large sets of coordinates that define some medium-voltage grids.
Or is there another tool I could use?
I want to use the polygon coordinates of the exported buildings in MATLAB.
Thanks for your help!
Overpass API provides the (poly: ) filter to query objects inside a given polygon. See the documentation in the wiki for details.
Buildings in a given polygon can be queried as follows:
way[building](poly:"50.7 7.1 50.7 7.12 50.71 7.11");
(._;>;);
out meta;
Due to a recent memory limitation, you might have to either add a [maxsize: xxx] setting:
[maxsize:2073741824];
way[building](poly:"50.7 7.1 50.7 7.12 50.71 7.11");
(._;>;);
out;
or resort to the following workaround to force another evaluation sequence:
way(poly: "50.7 7.1 50.7 7.12 50.71 7.11");
way._[building];
(._;>;);
out meta;

geom format in PostGIS to use for drawing in Google Maps on Android

Greetings. A friend passed me a transport route, and he gave it to me in the following format:
0102000020E610000049020000F2D077B7B23A53C0F03504C7651428C07E703E75AC3A53C07E37DDB2431428C07E8AE3C...
He told me it was a MultiLineString. I used the PostGIS function:
SELECT ST_AsEWKT('0102000020E6100000810200 .....');
With this I should supposedly be able to get the WKT format, from which I can extract the coordinates and use them with the Google Maps API to draw a path on Android, but when I use it the result comes back empty. Please help.
I need to get something like MULTILINESTRING((0 0,1 1,1 2),(2 3,3 2,5 4)) ... so I can get lat/lng coordinates and draw it with the Google Maps API on Android.
To summarize: what I have is the value
0102000020E6100000810200 ..... and I would like to draw it on Android with the Google Maps API; that is why I am trying to get the coordinates with that function.
What you have is well-known binary (WKB), but in an ASCII (hex) representation; it is explained in the docs along with its companion format, well-known text (WKT).
If you do something like:
select st_setsrid(st_makepoint(50, -2),4326);
you will see 0101000020E6100000000000000000494000000000000000C0, similar to what you have.
You can insert these directly into a db with
create table test (g geometry);
insert into test(g) values(ST_GeomFromEWKB(E'\\x0101000020E6100000000000000000494000000000000000C0'));
where the E'\\x prefix indicates that you are inserting a hex string; see the binary format docs.
If you now do
select g from test;
you will get your WKB back and if you do
select st_astext(g) from test;
you will see the more human-readable WKT format.
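You can also decode the hex string directly, without a table, by casting it to geometry; using the complete example value from above:
-- cast the hex EWKB straight to geometry and print it as WKT
SELECT ST_AsText('0101000020E6100000000000000000494000000000000000C0'::geometry);
-- returns POINT(50 -2)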
The best way to bulk load data in this format is to use:
COPY table_name FROM 'your_file.csv' CSV;
where your WKB values appear, as is and unquoted, in your_file.csv.

Grib2 to PostGIS raster -- anyone get this to work?

I have an application for which I need to import U.S. National Weather Service surface analyses, which are distributed as grib2 files. I want to pull those into PostGIS 2.0 rasters, do some calculations and modeling, and display the data and model results in GeoServer.
Since grib2 is a GDAL-supported format, the supplied raster2pgsql utility should be able to slurp a grib2 right into PostGIS-compatible SQL, and once it's there, GeoServer ought to be able to handle it. However, I'm running into problems which have no obvious solutions -- not obvious to me, at any rate! Raster2pgsql runs, apparently without errors, producing SQL, and running the SQL creates what looks very much like a raster. But GeoServer can't display it -- the bounds, in particular, come out looking weird (0,0 -1,-1) and "preview layer" just throws a NullPointerException.
Has anyone been down this road already? I've got issues as basic as not knowing what the SRID should be for the data (4326, perhaps?). I don't expect anyone to debug my problems for me but if someone has already got this toolchain working, or part of it, I can plug known-good things in and see what I can discover.
TIA,
rw
Updated: Per Mike, here is the coordinate-system stuff from one of the files; I elided the other 749 bands in the output from "gdalinfo". Note that the filename is different: I found out by running "gdalinfo" on my original file that something was wrong with it, since gdalinfo couldn't read it. New (35MB!) file here.
Gdalinfo output:
Driver: GRIB/GRIdded Binary (.grb)
Files: ruc2.t00z.bgrb13anl.grib2
Size is 451, 337
Coordinate System is:
PROJCS["unnamed",
GEOGCS["Coordinate System imported from GRIB file",
DATUM["unknown",
SPHEROID["Sphere",6371229,0]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433]],
PROJECTION["Lambert_Conformal_Conic_2SP"],
PARAMETER["standard_parallel_1",25],
PARAMETER["standard_parallel_2",25],
PARAMETER["latitude_of_origin",0],
PARAMETER["central_meridian",265],
PARAMETER["false_easting",0],
PARAMETER["false_northing",0]]
Origin = (-3332155.288903323933482,6830293.833488883450627)
Pixel Size = (13545.000000000000000,-13545.000000000000000)
Corner Coordinates:
Upper Left (-3332155.289, 6830293.833) (139d51'22.04"W, 54d10'20.71"N)
Lower Left (-3332155.289, 2265628.833) (126d 6'34.06"W, 16d 9'49.48"N)
Upper Right ( 2776639.711, 6830293.833) ( 57d12'21.76"W, 55d27'10.73"N)
Lower Right ( 2776639.711, 2265628.833) ( 68d56'16.73"W, 17d11'55.33"N)
Center ( -277757.789, 4547961.333) ( 98d 8'30.73"W, 39d54'5.40"N)
Band 1 Block=451x1 Type=Float64, ColorInterp=Undefined
Description = 1[-] HYBL="Hybrid level"
Metadata:
GRIB_UNIT=[Pa]
GRIB_COMMENT=Pressure [Pa]
GRIB_ELEMENT=PRES
[Etc., Etc., for all 750 bands]
I hope this helps, at least for those coming to this thread.
Bear in mind that while GeoServer is capable of serving raster data from PostGIS, its default PostGIS store only handles vector data; that's why you get those odd bounds (-1 -1 0 0).
You'll have to add the ImageMosaicJDBC plugin to your GeoServer installation; follow the steps here:
http://docs.geoserver.org/latest/en/user/tutorials/imagemosaic-jdbc/imagemosaic-jdbc_tutorial.html
Got an excellent answer to my problem here. Putting it in as a separate answer.
He recommended using gdalwarp to pull the GRIB2 file into a known SRID, thus:
gdalwarp -t_srs EPSG:4326 original_file.grib2 4326_file.grib2
Then raster2pgsql works just fine. Note that the last argument is the target table name and the generated SQL goes to stdout, so redirect it to a file (or pipe it straight into psql), e.g.
raster2pgsql -M -a 4326_file.grib2 grib_data > some_sql.sql
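After loading the generated SQL (e.g. psql -d your_db -f some_sql.sql), a quick sanity check confirms the SRID and dimensions came through; the table name grib_data here is just a placeholder matching the command above:
-- verify the imported raster's SRID, size, and extent
SELECT ST_SRID(rast) AS srid,
       ST_Width(rast) AS width,
       ST_Height(rast) AS height,
       ST_Envelope(rast)::box2d AS extent
FROM grib_data
LIMIT 1;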