Converting lat-long to PostGIS geometry without querying the database - postgresql

I have a table in PostgreSQL with a PostGIS geometry(Point, 4326) column (location, SRID 4326), and I have a Python application that uses SQLAlchemy to update the table (the rest of the columns) without any problem.
Now I need to update the location column. I know I can use the proper text representation of a given location to update the column with SQLAlchemy, without needing GeoAlchemy; for instance, I can update the column with the value: '0101000020E6100000AEAC7EB61F835DC0241CC418A2F74040'
which corresponds to lat: 33.9346343, long: -118.0488106.
The question is: is there a way to compute this '0101000020E6100000AEAC7EB61F835DC0241CC418A2F74040' in Python, taking (33.9346343, -118.0488106) as input, without querying the database? Or any other way to update the column using a proper text input?
I know I can use SQLAlchemy to execute this query:
select st_setsrid(st_makepoint(-118.0488106, 33.9346343),4326)
and obtain the value to update the column, but I want to avoid that.
Thanks in advance!

The solution to this problem is easier than it seems. To update the field from text with the input lat-long, all I needed to do was define the SRID in the text assignment:
location = 'SRID=4326;POINT(-118.0488106 33.9346343)'
This updates the geometry(Point, 4326) column properly, and when you select from the table the value of the column is the expected one:
"0101000020E6100000AEAC7EB61F835DC0241CC418A2F74040"
Thanks guys!

Related

How can I add SRID 4326 (Spatial Types) to Workbench when adding columns?

When I add a column of type POINT in the EER Diagram, is there anything I can do in that diagram so that when I generate the scripts automatically, SRID 4326 is attached to the CREATE TABLE script? If I don't set that number, it defaults to zero (flat), but I need 4326 (sphere).
If that's not possible, does it mean I cannot synchronise my model with my server automatically and have to add these changes manually every time?
I couldn't figure this out either. I believe adding a SRID to a column is currently not supported by MySQL Workbench.
To check that it is indeed not supported, I did the following:
Added a SRID to a column of an existing DB
Reversed engineered a script of this DB (using Workbench)
Checked the script if it included the set SRID for the column
Was disappointed that it didn't...
The "good" news is though, that as it is not supported, MySQL Workbench won't pick up on a missing SRID on a column when synchronizing sources.
This means that once you set the SRID on a column yourself, it won't cause any problems when synchronizing in the future.
Note that in order to set a SRID on a column, there can't be a (spatial) index on that column. Therefore, you must remove the index, set the SRID and then add the index back.
Below is a short and simple script which I used to do this. Don't forget to adapt it to your use case:
DROP INDEX `my_idx` ON my_table;
ALTER TABLE my_table MODIFY COLUMN my_column POINT NOT NULL SRID 4326;
ALTER TABLE my_table ADD SPATIAL INDEX `my_idx` (`my_column`) VISIBLE;

On Google Data Studio, using PostgreSQL data, how do I "SELECT * ..." but for camelCase columns?

On Google Data Studio, I cannot create a chart from Postgres data if the table columns are in camelCase. I have data in PostgreSQL that I want to build charts from. Integrating it as a data source works fine, but I have a problem when creating a chart.
After creating a chart and selecting a data source, I try to add a column, which results in this error:
Error with SQL statement: ERROR: column "columnname" does not exist Hint: Perhaps you meant to reference the column "table.columnName". Position: 8
It just so happens that all my columns are in camelCase. Is there no way around this? Surely this is a basic question that has been resolved.
When connecting to your data source, try using 'Custom query' instead of selecting a table from your database. Then manually write your SQL query, aliasing your camelCase column names to lower-case names. Worked for me.
example:
SELECT
  "camelCaseColA" AS cola,
  "camelCaseColB" AS colb,
  "camelCaseColC" AS colc
FROM
  tableName

"Attributes specified for column are incompatible with existing column definition"

It's been a while.
Using DB2 10 for z/OS, I've been asked to change a specific column in a table from decimal(7,2) to decimal(7,4). Sounds easy, right?
alter table MySchema.MyTable
alter column myColumn
set data type decimal(7,4);
But, DB2 responds with this error: "Attributes specified for column 'MYCOLUMN' are incompatible with existing column definition."
I had thought that converting from decimal(7,2) to decimal(7,4) would be pretty straightforward, but DB2 disagrees.
Outside of dropping the table and recreating it from scratch, what alternatives do I have?
Thanks in advance!
Dave
The reason Db2 doesn't like that change is that you're going from 99999.99 to 999.9999.
Is that really what you want? Going from (7,2) to (9,4) would just add two more decimal places without losing any data and should be allowed by the DB.
Db2 for i gives a warning, but allows you to ignore the warning...
Create a new column with ALTER ... ADD COLUMN of the right type, use an UPDATE to populate it, then ALTER ... DROP COLUMN the old column and RENAME COLUMN to give the new column the original column's name.

How to update a JSONB column using knex.js, bookshelf.js

I have a JSONB column in a PostgreSQL database like {lat: value, lon: value}. I want to change one specific value at a time, e.g. lat, but I am not sure how to achieve this using bookshelf.js or knex.js. I tried the jsonb_set() method from the Postgres documentation, but I am not sure I used it correctly. Can somebody please suggest how I can do this, or what the correct syntax is? Thanks.
AFAIK the only knex-based thing that supports writing to and extracting data from PostgreSQL jsonb columns is the objection.js ORM.
With plain knex you need to use raw to write references:
knex('table').update({
  jsonbColumn: knex.raw(`jsonb_set(??, '{lat}', ?)`, ['jsonbColumn', newLatValue])
})
You can check generated SQL here https://runkit.com/embed/44ifdhzxejf1
Originally answered in: https://github.com/tgriesser/knex/issues/2264
More examples of how to use jsonb_set with knex can be found in the following answers:
How to update a jsonb column's field in PostgreSQL?
What is the best way to use PostgreSQL JSON types with NodeJS
Jsonb field update using knex.js
return knex("tablename").update({
jsonbkey: knex.raw(`
jsonb_set(jsonbkey, '{city}','"Ayodhya"')
`)
}).where({"id" :2020})
The jsonbkey will be the column name, whose datatype is jsonb.
The tablename is the name of your table.
The city is the object key.
If there are multiple levels of nesting, separate the path elements with commas, like '{city,id}'.
let result = await db().raw(
  `UPDATE widget
   SET name = ?,
       jsonCol = jsonCol::jsonb || ?::jsonb
   WHERE id = ?`,
  [name, JSON.stringify(newJsonData), id]
);
This knex query updates a json column by overriding the specific keys supplied in the value on the right-hand side of the || operator. DO NOT forget to typecast the values with ::jsonb.

How to add geometry column using pgAdmin

I'm using a database created in PostgreSQL. Its schema has two tables, and I want to add a geometry column to one of them.
The problem is that I created the PostGIS extension (CREATE EXTENSION postgis;) for the database, but I'm not able to add a column of this data type (geometry) using pgAdmin.
To do this with pgAdmin's "New Column..." dialog: if you can't find geometry, you might be able to find public.geometry instead (if PostGIS was installed in that schema, which is normal).
However, I advise against using pgAdmin for creating geometry columns, as it does not understand typmods used to define the geometry type and SRID.
The best way is using DDL to directly manipulate the table, e.g.:
ALTER TABLE locations ADD COLUMN geom geometry(PointZ,4326);
to add a geom column of XYZ points (long, lat, alt).