JSONB PostgreSQL with jOOQ 3.10

How can I write a String variable to a PostgreSQL JSONB column without generated classes, using jOOQ 3.10?
dsl.insertInto(table, Arrays.asList(DSL.field("configuration")))
   .values(data.getConfiguration())
   .execute();
I have a JSON string in data.getConfiguration(), but I get this exception:
org.postgresql.util.PSQLException: ERROR: column "configuration" is of type jsonb but expression is of type character varying

The answer is the same as for your previous question. Write a data type binding (or better: upgrade your jOOQ version!).
DSL.field(name("jsonb_column"),
    SQLDataType.VARCHAR.asConvertedDataType(new MyJSONBBinding()));
The manual link in my previous answer shows how to do exactly that.
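For reference, here is a minimal sketch of what such a MyJSONBBinding could look like, adapted from the data type binding example in the jOOQ manual (which binds ::json via Gson); the class name comes from the snippet above, everything else is illustrative:

import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.sql.Types;
import java.util.Objects;
import org.jooq.*;
import org.jooq.conf.ParamType;
import org.jooq.impl.DSL;

// Binds a Java String to a PostgreSQL jsonb column by appending an
// explicit ::jsonb cast to every bind variable.
public class MyJSONBBinding implements Binding<Object, String> {

    // Trivial Object <-> String conversion between the JDBC type and the user type
    @Override
    public Converter<Object, String> converter() {
        return new Converter<Object, String>() {
            @Override
            public String from(Object dbObject) {
                return dbObject == null ? null : dbObject.toString();
            }
            @Override
            public Object to(String userObject) {
                return userObject;
            }
            @Override
            public Class<Object> fromType() { return Object.class; }
            @Override
            public Class<String> toType() { return String.class; }
        };
    }

    // Render the bind variable as ?::jsonb so PostgreSQL accepts the varchar
    @Override
    public void sql(BindingSQLContext<String> ctx) throws SQLException {
        if (ctx.render().paramType() == ParamType.INLINED)
            ctx.render().visit(DSL.inline(ctx.convert(converter()).value())).sql("::jsonb");
        else
            ctx.render().sql("?::jsonb");
    }

    @Override
    public void register(BindingRegisterContext<String> ctx) throws SQLException {
        ctx.statement().registerOutParameter(ctx.index(), Types.VARCHAR);
    }

    @Override
    public void set(BindingSetStatementContext<String> ctx) throws SQLException {
        ctx.statement().setString(ctx.index(),
            Objects.toString(ctx.convert(converter()).value(), null));
    }

    @Override
    public void set(BindingSetSQLOutputContext<String> ctx) throws SQLException {
        throw new SQLFeatureNotSupportedException();
    }

    @Override
    public void get(BindingGetResultSetContext<String> ctx) throws SQLException {
        ctx.convert(converter()).value(ctx.resultSet().getString(ctx.index()));
    }

    @Override
    public void get(BindingGetStatementContext<String> ctx) throws SQLException {
        ctx.convert(converter()).value(ctx.statement().getString(ctx.index()));
    }

    @Override
    public void get(BindingGetSQLOutputContext<String> ctx) throws SQLException {
        throw new SQLFeatureNotSupportedException();
    }
}

With that in place, the original insert works unchanged, since every bind variable for that field is now rendered as ?::jsonb.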

Related

Pentaho select UUID type field as UUID and insert into other table as UUID

I am new to Pentaho, and my research ended up with nothing.
I have a table that has a field with type UUID.
I need to copy it to another table, where the field has the same UUID type.
The Table input step has this SQL, in which I am explicitly casting the field to UUID.
However, when I check the input fields on the Table output step, the field shows as String; how do I cast it to UUID?
The same goes for the output table: it was created with datatype UUID for that field, but the Table output step shows the datatype as String.
On the run, it throws an error clearly showing that it cannot convert the value to UUID:
2022/02/11 10:42:45 - Table output.0 - Caused by: org.postgresql.util.PSQLException: ERROR: column "qr_uuid" is of type uuid but expression is of type character varying
I am using Postgres 13 and Pentaho 9.
I have found an old thread dealing with the same problem, and there are some hints toward the answer. You'll need to edit the database connection for Postgres and, in the Options menu, add the parameter stringtype with the value unspecified:
How insert UUID values in PostgreSQL table via Kettle?
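That option is just the pgjdbc connection parameter stringtype=unspecified, which makes the driver send strings as untyped parameters so that PostgreSQL infers the target type (here uuid) from the column. A minimal plain-JDBC sketch of the same behaviour, with placeholder connection details and table name:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Properties;

public class StringTypeDemo {
    public static void main(String[] args) throws SQLException {
        Properties props = new Properties();
        props.setProperty("user", "my_user");         // placeholder credentials
        props.setProperty("password", "my_password");
        // Send strings as untyped parameters; the server casts them to uuid itself
        props.setProperty("stringtype", "unspecified");

        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/my_db", props)) {
            PreparedStatement stmt =
                conn.prepareStatement("INSERT INTO my_table (qr_uuid) VALUES (?)");
            stmt.setString(1, "123e4567-e89b-12d3-a456-426614174000");
            stmt.executeUpdate();
        }
    }
}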

PySpark's write stringtype argument doesn't deal with null values

I am trying to write a dataset's data into a Postgres DB using the JDBC driver.
my_df.write.format('jdbc').mode('append')\
    .option('driver', 'org.postgresql.Driver')\
    .option('url', 'my_url')\
    .option('dbtable', 'my_dbtable')\
    .option('user', 'my_user').save()
Apparently PySpark sends all string values as text by default (even for uuid columns) and throws this error:
Caused by: org.postgresql.util.PSQLException: ERROR: column "id" is of type uuid but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
To overcome that issue, I had to set this connection property:
'stringtype': 'unspecified'
But that solution does not work on NULL values and throws this error:
Caused by: org.postgresql.util.PSQLException: ERROR: column "id" is of type uuid but expression is of type character
This basically means that it tries to insert the NULL value as character. Separating the dataset into two datasets (as @Karuhanga suggested here: Pyspark nullable uuid type uuid but expression is of type character varying) is not possible in my case. Has anyone faced this issue and found a solution that does not fix just a specific column?
Instead of putting a null value in the uuid column, fill in the all-zero UUID as a default, e.g. (assuming the column is named id, as in the error message):
my_df = my_df.fillna({'id': '00000000-0000-0000-0000-000000000000'})

Can jOOQ alias a Liquibase JSONB data type for H2/PostgreSQL

I'm using this H2 feature to create an alias for JSONB in the JDBC URL:
spring.datasource.url: jdbc:h2:mem:testdb;INIT=create domain if not exists jsonb as text;MODE=PostgreSQL
But jOOQ's codegen Liquibase support (the generator pointed at the Liquibase files) doesn't recognize the JSONB column type, and I keep getting:
Reason: liquibase.exception.DatabaseException: Unknown data type: "JSONB";
Is there a way to tell the generator to alias this data type to TEXT?
You can try to init your H2 with CREATE TYPE "JSONB" AS json instead.
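Plugged into the INIT clause already used in the question, that could look like this (a sketch, assuming an H2 version that supports the JSON data type):
spring.datasource.url: jdbc:h2:mem:testdb;INIT=create type "JSONB" as json;MODE=PostgreSQL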

Npgsql.PostgresException: Column cannot be cast automatically to type bytea

Using EF Core for PostgreSQL, I have an entity with a field of type byte, but I decided to change it to type byte[]. When I ran migrations, applying the generated migration file threw the following exception:
Npgsql.PostgresException (0x80004005): 42804: column "Logo" cannot be
cast automatically to type bytea
I have searched the internet for a solution, but all I found were similar problems with other datatypes, not byte arrays. Please help.
The error says exactly what is happening... In some cases PostgreSQL allows for column type changes (e.g. int -> bigint), but in many cases where such a change is non-trivial or potentially destructive, it refuses to do so automatically. In this specific case, this happens because Npgsql maps your CLR byte field to PostgreSQL smallint (a 2-byte type), since PostgreSQL lacks a 1-byte data type. So PostgreSQL refuses to cast from smallint to bytea, which makes sense.
However, you can still do a migration by writing the data conversion yourself, from smallint to bytea. To do so, edit the generated migration, find the ALTER COLUMN ... ALTER TYPE statement and add a USING clause. As the PostgreSQL docs say, this allows you to provide the new value for the column based on the existing column (or even other columns). Specifically for converting an int (or smallint) to a bytea, use the following:
ALTER TABLE tab ALTER COLUMN col TYPE BYTEA USING set_byte(E'0', 0, col);
If your existing column happens to contain values larger than a single byte (which should not be an issue for you), they will get truncated. Obviously, test the data coming out of this carefully.

Explicit jsonb type cast in Squeryl

I'm using Squeryl 0.9.5-7 and Postgres 9.4 with the jsonb datatype, and I want to insert some data:
case class Log(id: String, meta: String) //meta will contain json
val logs = table[Log]
logs.insert(Log(randomId, "{\"Hi\": \"I'm a json!\"}"))
But I got a type cast error that says: "Column meta has jsonb type but expression has character varying type. Rewrite the expression or convert its type."
How can I explicitly cast my String field to jsonb, so that the raw SQL parameter looks like ?::jsonb?
It would also be interesting to know how to write JSON queries such as #> or ->> with Squeryl.
I found a better way that doesn't require creating a CAST in the database.
If you're using Squeryl 0.9.6, you can add an explicit cast in your schema, which tells Squeryl to explicitly cast your string to jsonb:
on(logs)(s => declare(
s.meta is (dbType("jsonb").explicitCast) // enables explicit casting into jsonb
))
With Squeryl 0.9.6 you can register support for your own custom types; here's an example. For non-standard operators, take a look at custom functions.
In my experience with Postgres 9.4, reading works fine as a Squeryl String, but inserting fails with:
ERROR: column "****" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 166
errorCode: 0, sqlState: 42804
So the solution I found is to create an AS ASSIGNMENT cast in my Postgres database, and that does the trick. (AS ASSIGNMENT lets PostgreSQL apply the cast implicitly when a varchar value is assigned to a json column, which is exactly the INSERT case here.)
CREATE CAST (VARCHAR AS JSON)
WITH INOUT
AS ASSIGNMENT;