How to create a new SRID in PostgreSQL/PostGIS?

I have a rather exotic local spatial reference system, and a lot of data stored in an old legacy system. Now I want to import this data into my PostgreSQL/PostGIS database. On the client side I'm using the JavaScript OpenLayers 3 library (if that matters); on the server side I'm used to storing geometry data with SRID 3857, so my tables with layer data have constraints like these:
CONSTRAINT enforce_dims_geom_layer_1_ CHECK (st_ndims(geom) = 2),
CONSTRAINT enforce_srid_geom_layer_1_ CHECK (st_srid(geom) = 3857)
So, given this legacy data with coordinates in a local reference system, how can I approach the problem and end up with a projection definition like this:
+proj=longlat +ellps=bessel +towgs84=595.48,121.69,515.35,4.115,-2.9383,0.853,-3.408 +no_defs

Have a look at the public.spatial_ref_sys table. That is where SRIDs are defined, and you can insert a row for your new SRID there. The proj4text column holds the projection definition.
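A minimal sketch of such an insert, reusing the proj4 string from the question; 990001 is just a placeholder for an unused SRID value, and the srtext (WKT) column is left empty here, although filling it in as well is a good idea:
INSERT INTO spatial_ref_sys (srid, auth_name, auth_srid, srtext, proj4text)
VALUES (
    990001,     -- any unused SRID value; 990001 is a placeholder
    'custom',   -- auth_name is free-form for local definitions
    990001,
    '',         -- srtext (WKT) left empty in this sketch
    '+proj=longlat +ellps=bessel +towgs84=595.48,121.69,515.35,4.115,-2.9383,0.853,-3.408 +no_defs'
);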

The website https://epsg.io will generate an INSERT statement for your SRID if you can find the page for your desired EPSG code. Once you find the page for the code you want, scroll down to "Export" and, on the left below it, set the format to "PostGIS". You can then "Copy TEXT" and paste the statement into psql or whatever client you use to interact with your database.

Related

ST_Transform not working correctly with non-standard SRID

I have two tables with geometry data. One table has MULTIPOLYGON geometry in a non-standard SRID and the other has POINT geometry in 4326. My spatial_ref_sys table has a correct entry for my SRID. I want to use ST_Contains on geometries from these tables, so I use ST_Transform to convert the geometry from one table to the SRID of the other, but the function returns geometry located under Africa (all of the geometry actually lies in Europe). I should also mention that I took the proj description from QGIS, and in that program everything works correctly. I use PostGIS 3.2. Here is my code for ST_Contains:
select st_contains(st_transform(geom_multipolygon_in_4326, SRID_from_another_table), geom_point_in_non_standart_srid);
Here is my proj:
+proj=tmerc +lat_0=0 +lon_0=30 +k=1 +x_0=-10000 +y_0=-5540000 +ellps=krass +towgs84=23.92,-141.27,-80.9,0,0.35,0.82,-0.12000000004786 +units=m +no_defs
However, when I use PostGIS 2.2 everything works correctly with an identical proj definition. I don't understand what the problem is. The only workaround I have found is creating a layer in QGIS and then importing it into the database, but that is not an option for me, because the point geometry comes from a query to Google and I still need to transform it using ST_Transform.
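One commonly reported difference between PostGIS 2.x (PROJ 4) and PostGIS 3.x (PROJ 6+) is that the srtext (WKT) entry in spatial_ref_sys can take precedence over proj4text, so a +towgs84 datum shift present only in proj4text may effectively be ignored; this is only a possible lead, not a confirmed diagnosis. A quick check, with 990001 standing in for the custom SRID:
SELECT srid, auth_name, srtext, proj4text
FROM spatial_ref_sys
WHERE srid = 990001;   -- placeholder: substitute your custom SRID
If srtext is empty or does not carry the same datum shift as proj4text, making the two consistent is worth trying.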

Get PostGIS geometry field on Drill

I have a table with a geometry column, and when I query it using PostGIS the records show up correctly (PostGIS query screenshot). The problem is when I execute the query using Apache Drill: every column shows up fine except the geometry one, which comes back as null (Drill query screenshot).
Reviewing the logs, I see the following error:
WARN o.a.d.e.store.jdbc.JdbcRecordReader - Ignoring column that is
unsupported. org.apache.drill.common.exceptions.UserException:
UNSUPPORTED_OPERATION ERROR: A column you queried has a data type that
is not currently supported by the JDBC storage plugin. The column's
name was geom_multipolygon and its JDBC data type was OTHER
I tried creating the Drill storage plugin with postgis-jdbc-2.2.1.jar and postgresql-42.1.4.jar, but the same error is shown.
If I use:
cast(geom_multipolygon as varchar(255))
it shows the varchar representation of the geometry. Another option is getting the MULTIPOLYGON text and transforming it to Drill binary using ST_GeomFromText(geom), but we need the binary format directly from PostGIS, so those approaches won't work for us.
We have seen this page: https://github.com/k255/drill-gis/issues/1 but the proposed solution doesn't work for us, so I hope there is another way to achieve this.
UPDATE: I finally found a way for Drill to show the geo fields: change the column's data type in PostGIS from geometry to bytea. It seems to be a compatibility issue. This way we can perform geospatial queries in Drill, but in PostGIS those fields are no longer geometries, so they cannot be spatially indexed or treated as such.
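A sketch of the conversion described in the update, assuming a hypothetical table name my_table; ST_AsBinary produces the WKB bytes that end up in the bytea column:
ALTER TABLE my_table                          -- hypothetical table name
    ALTER COLUMN geom_multipolygon TYPE bytea
    USING ST_AsBinary(geom_multipolygon);
As noted above, after this change PostGIS no longer sees the column as a geometry, so spatial indexes and functions cannot be applied to it directly.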

How to get all points along a way from (osm)PostGIS?

I have imported OpenStreetMap data into Postgres with the GIS extension using the tool
osm2pgsql (-s option)
Of course, I now have the following tables:
planet_osm_point
planet_osm_ways
....
Within planet_osm_ways I have a column called way of type geometry(LineString, 4326), with content like the following:
"0102000020E6100000070000005E70BCF1A49F2540D3D226987B134840896764EB749F25403B5DCC858013484040D1860D609F2540C426327381134840CE50DCF1269F2540EF552B137E1348405AAB2CC02D9E2540F978324976134840D66F26A60B9D2540CE8877256E1348403CA81F2FFF9C2540BC1D86FB6D134840"
What is that? How can I get all the points along this way?
Thanks a lot
That's the hex-encoded extended well-known binary (EWKB) representation of a LINESTRING.
There are several ways to get the points along the way. To get the individual coordinates as points, use ST_DumpPoints. Or, to simply output the geometry in another human-readable format (WKT, EWKT, GeoJSON, GML, etc.), see the relevant section of the manual.
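A minimal sketch with ST_DumpPoints, assuming the planet_osm_ways table and way column from the question and a hypothetical id to pick a single way:
SELECT (dp).path[1] AS point_index,
       ST_AsText((dp).geom) AS point_wkt
FROM (
    SELECT ST_DumpPoints(way) AS dp
    FROM planet_osm_ways
    WHERE id = 12345   -- hypothetical way id; adjust to your schema
) AS sub;
Each row of the result is one vertex of the LINESTRING, in order.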

PostGIS geography query returns a string value

I have a strange issue. The lonlat column in my app works well on the development server: its output is in the form POINT(X Y). But when I move the data to the production server, the output is strange!
ActionView::Template::Error (undefined method `lon' for "0101000020E6100000541B9C887E7A52C02920ED7F80614440":String):
The lonlat value, which is encoded with SRID 4326, is being read as a string. I am almost certain the data was corrupted while migrating it from development to production, because this was not a problem before the migration.
Does anyone know what about the database schema or column may cause this issue?
A geometry field stores its data as WKB. To see the WKT representation you need to change your query to something like
select ST_AsText(the_geom) as geometry from your_table;
However, I don't know why your development environment has some kind of implicit conversion between WKB binary data and WKT strings. What versions of Postgres and PostGIS are you using?
What language is your app server written in?
Is that ActiveRecord you're using?
I suggest you try something like
float ST_X(geometry a_point);
to make sure you can read the data properly and determine whether the problem is in the data field or somewhere else.
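For example, a quick check along those lines, assuming a hypothetical table name and that lonlat can be cast to geometry:
SELECT ST_X(lonlat::geometry) AS lon,
       ST_Y(lonlat::geometry) AS lat
FROM my_table      -- hypothetical table name
LIMIT 5;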
I would also try doing the pg_dump again in a single step if you determine that the problem is with the geometry column.
You can use pg_dump with the option
--exclude-table-data=table_pattern
--exclude-table-data=schema.table_pattern
This dumps the whole schema definition but excludes the data of the matching tables, so you can bring over only the table data you actually need.
It turns out that when I killed the connection to the server to migrate the data, Rails did not set the schema search path (and therefore did not discover the PostGIS extension) upon reconnecting. I had to restart the server to solve the problem.

How to insert LineStrings into a PostGIS database with Python, psycopg2 and ppygis

I am trying to insert LineStrings into a local Postgres/PostGIS database. I use Python 2.7, psycopg2 and ppygis.
Every time I insert in a loop, only some of the records end up in the table. I tried to track down the problem with mogrify, but I see no failure.
polyline = []
for row in positions:
    lat = row[0]
    lon = row[1]
    point = ppygis.Point(lon, lat, srid=4326)
    polyline.append(point)
linestring = ppygis.LineString(polyline, srid=4326)
self.cursor.execute("BEGIN")
self.cursor.execute("INSERT INTO gtrack_4326 (car, polyline) VALUES (%s, %s);", ("TEST_car", linestring))
self.cursor.execute("COMMIT")
Using cursor.mogrify results in strings like this:
INSERT INTO gtrack_4326 (car,polyline) VALUES ('TEST_car', '0102000020e610000018000000aefab72638502940140c42d4d899484055a4c2d84250294056a824a1e3994840585b0c795f50294085cda55df1994840edca78a57650294069595249f8994840ec78dd6cbd502940828debdff5994840745314f93f5129407396fecaef994840e1f6bafbd25129404eab329de7994840da588979565229407a562d44e2994840ebc9fca36f522940c2af4797ed9948403bd164b5af5229407a90f9dbf99948407adbf1cb05532940818af4ec039a484062928087585329402834ff9e0e9a4840e8bb5b59a2532940b1ec38341b9a4840dcb28d89de532940afe94141299a484084d3275e0a54294019b1aab9379a484080ca42853454294053509b82469a48408df8043f6054294063844b22569a48406d3e09c7875429406dfbc33b659a4840aa5a77989b542940c20e08196d9a48401b56a7b9cb542940a0a0b9f3699a4840cf2d742502552940192543e9669a484045ac0f351b552940fdb0efd46d9a48406891ed7c3f552940450a0a28799a4840d0189c77525529405f7b6649809a4840');
But if I look into the database, I see a lot of records without geometry data in the second column. I don't understand why mogrify shows data in every column, while in the database nearly 50% of the rows have nothing in the geometry column.
First, psycopg2 does its own transaction management, so you should generally write:
self.cursor.execute(
    "INSERT INTO gtrack_4326 (car, polyline) VALUES (%s, %s);",
    ("TEST_car", linestring),
)
self.conn.commit()
See the psycopg2 docs.
Second, consider loading data in batches with COPY. See COPY in the psycopg2 docs.
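A minimal sketch of a server-side COPY, assuming a hypothetical CSV file readable by the PostgreSQL server process and geometry values supplied as WKT or hex EWKB text:
COPY gtrack_4326 (car, polyline)
FROM '/tmp/tracks.csv'   -- hypothetical path on the database server
WITH (FORMAT csv);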
Also consider setting log_statement = 'all' in postgresql.conf, along with a suitable log_line_prefix, then restarting the PostgreSQL server. Examine the logs and see if you can tell what is doing the bogus inserts.
If practical, add a CHECK and/or NOT NULL constraint to the geometry column so that any incorrect INSERTs will fail and report an error to the program doing the insert. This might help you diagnose the problem.
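A sketch of such constraints; the constraint name is arbitrary:
ALTER TABLE gtrack_4326
    ALTER COLUMN polyline SET NOT NULL;
ALTER TABLE gtrack_4326
    ADD CONSTRAINT enforce_srid_polyline CHECK (ST_SRID(polyline) = 4326);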
How did you determine that 50% of the rows have no geometry data? Be warned that clients like pgAdmin III show a blank cell if it contains too much data, so a value can appear to be NULL when it isn't. You can also view the GIS data directly in a program like Quantum GIS.
With an SQL client, a better diagnostic to determine if a linestring is really there is to get the number of points in the linestring:
SELECT car, ST_NumPoints(polyline) FROM gtrack_4326;
If the counts come back NULL, then your assessment is correct and the geometries really are missing. Otherwise, the data are simply too large to display in your client application.
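A hedged follow-up check, counting rows whose geometry really is NULL regardless of how the client renders large values:
SELECT count(*) AS total_rows,
       count(*) FILTER (WHERE polyline IS NULL) AS null_geometries
FROM gtrack_4326;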