How to use Kotlin's Ktorm to perform a WHERE clause operation on a custom Postgres "object type" while avoiding "PSQLException: ERROR: operator does not exist"

Whilst performing a WHERE clause operation on a custom Postgres "object type", I ended up with the following PSQLException.
Language: Kotlin.
ORM library: Ktorm.
Exception
org.postgresql.util.PSQLException: ERROR: operator does not exist: rate = character varying
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
I have followed the official Ktorm guides here, but there is no mention of custom Postgres types. Any pointers/help would be highly appreciated. See the code below to reproduce:
Thank you.
Example test that would produce the above exception
internal class SuppliersInstanceDAOTest {
    @Test
    fun shouldReturnInstanceSequence() {
        val database = Database.connect("jdbc:postgresql://localhost:5432/mydb", user = "postgres", password = "superpassword")
        val instanceDate: LocalDate = LocalDate.of(2019, 4, 1)
        database.withSchemaTransaction("suppliers") {
            database.from(SuppliersInstanceTable)
                .select(SuppliersInstanceTable.instanceSeq)
                .whereWithConditions {
                    // The following line causes "ERROR: operator does not exist: rate = character varying"
                    it += SuppliersInstanceTable.rate eq Rate.DAILY
                }.asIterable()
                .first()
                .getInt(1)
        }
    }
}
Schema
-- Note the special custom enum object type here that I cannot do anything about
CREATE TYPE suppliers.rate AS ENUM
    ('Daily', 'Byweekly');

CREATE TABLE suppliers.instance
(
    rate suppliers.rate NOT NULL,
    instance_value integer NOT NULL
)
TABLESPACE pg_default;
Kotlin Ktorm entities and bindings
enum class Rate(val value: String) {
    DAILY("Daily"),
    BIWEEKLY("Byweekly")
}

interface SuppliersInstance : Entity<SuppliersInstance> {
    companion object : Entity.Factory<SuppliersInstance>()
    val rate: Rate
    val instanceSeq: Int
}

object SuppliersInstanceTable : Table<SuppliersInstance>("instance") {
    val rate = enum("rate", typeRef<Rate>()).primaryKey().bindTo { it.rate } // <-- Suspect
    //val rate = enum<Rate>("rate", typeRef()).primaryKey().bindTo { it.rate } // Failed too
    val instanceSeq = int("instance_value").primaryKey().bindTo { it.instanceSeq }
}

After seeking help from the maintainers of Ktorm, it turns out newer versions of Ktorm support native PostgreSQL enum object types. In my case I needed pgEnum instead of the default Ktorm enum function, which converts the enums to varchar and causes the clash of types in PostgreSQL:
Reference here for pgEnum
However, note that at the time of writing Ktorm's pgEnum function is only available in Ktorm v3.2.x+. The latest version available in the JCenter and Maven Central repositories under the old coordinates is v3.1.0, because there is also a group name change from me.liuwj.ktorm to org.ktorm in the latest versions. So upgrading also means changing the group name in your dependencies to match the new coordinates in the Maven repositories. This was a seamless upgrade for my project, and the new pgEnum worked in my use case.
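As an illustration, here is a minimal sketch of what the coordinate change could look like in a Gradle (Kotlin DSL) build file. The artifact names and the 3.2.0 version are assumptions for illustration; pgEnum ships in the PostgreSQL support module:

dependencies {
    // Old coordinates (v3.1.0 and below):
    // implementation("me.liuwj.ktorm:ktorm-core:3.1.0")

    // New coordinates under the renamed group (v3.2.x+, version assumed for illustration)
    implementation("org.ktorm:ktorm-core:3.2.0")
    // pgEnum comes from the PostgreSQL support module
    implementation("org.ktorm:ktorm-support-postgresql:3.2.0")
}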
For my code examples above, this would mean swapping this
object SuppliersInstanceTable : Table<SuppliersInstance>("instance") {
    val rate = enum("rate", typeRef<Rate>()).primaryKey().bindTo { it.rate } // <---
    ...
}
For
object SuppliersInstanceTable : Table<SuppliersInstance>("instance") {
    val rate = pgEnum<Rate>("rate").primaryKey().bindTo { it.rate } // <---
    ...
}
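With that change the comparison is sent to Postgres as the enum type instead of varchar, so the WHERE clause from the test above works unchanged. A minimal usage sketch, assuming the pgEnum-based table definition above:

val seq = database
    .from(SuppliersInstanceTable)
    .select(SuppliersInstanceTable.instanceSeq)
    // The parameter is now sent as the suppliers.rate enum type, so the operator matches
    .where { SuppliersInstanceTable.rate eq Rate.DAILY }
    .map { it.getInt(1) }
    .first()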

Related

The trait `FromSql<'_>` is not implemented for `Uuid` in tokio-postgres in Rust

I am trying to use UUID as my primary key in Postgres, and I am getting the trait FromSql<'_> is not implemented for Uuid in tokio-postgres.
First I tried to use tokio-pg-mapper, but it showed the same compile error.
So I tried a different approach and implemented From on the struct to convert it directly from a Row:
impl From<Row> for Shop {
    fn from(row: Row) -> Self {
        Self {
            id: row.get("id"), // id is the UUID column
            name: row.get("name"),
            address: row.get("address")
        }
    }
}
I am still getting the trait FromSql<'_> is not implemented for Uuid in tokio-postgres.
I know that it is asking me to implement FromSql for the Uuid type, but I looked into the tokio-postgres docs and found that it is already implemented there.
Did I miss something?
uuid = "0.8"
tokio-postgres = "0.7.2"
To activate the UUID support I needed to declare the feature in my Cargo.toml file; then it started working both with tokio-pg-mapper and with my custom solution:
tokio-postgres = {version="0.7.2", features=["with-uuid-0_8"]}

AWS CDK AppSync code-first input types

To avoid duplication of data structures, I wanted to reuse a type definition on an input type like this:
export const DeviceStatus = new ObjectType('DeviceStatus', {
    definition: {
        time: timestamp,
        firmwareVersion: string
    },
});

export const DeviceStatusInput = new InputType('DeviceStatusInput', {
    definition: {
        tenantId: id_required,
        deviceId: id_required,
        // Reuse of DeviceStatus field definition
        status: DeviceStatus.attribute()
    }
});
There is no compile error, since the return type of DeviceStatus.attribute() fits, and this works for ObjectType inheritance.
From my perspective this should work, but deploying results in a nasty "Internal Error creating Schema" error.
Of course I could move the whole definition into an object and reuse it, but that seems weird. Is there any good solution for this in the code-first approach?
It seems to be invalid to reference an object type in an input type.
I recommend viewing "Can you make a GraphQL type both an input and output type?"
Probably the best you can do is to create a convenience method that creates both the object type and the input type from a single definition.

Building a function to add checks to the Amazon Deequ framework

Using the Amazon Deequ library, I'm trying to build a function that takes three parameters: the Check object, a string telling which constraint needs to be run, and another string that provides the constraint criteria. I have a bunch of checks that I want to read from a MySQL table. My intention is to iterate through all the checks I get from the MySQL table, build a Check object using the function described above, and run the checks on a source DataFrame.
Here is an example of Amazon Deequ:
https://towardsdatascience.com/automated-data-quality-testing-at-scale-using-apache-spark-93bb1e2c5cd0
So the function call looks something like this:
var _check = build_check_object_function(check_object, "hasSize", "10000")
This function should add a new hasSize check to the check_object and return that.
The part where I'm stuck is how to translate the hasSize string to the hasSize function.
var _check = Check(CheckLevel.Error, "Data Validation Check")
val listOfFunctions = _check.getClass.getMethods.filter(!_.getName().contains('$'))

for (function <- listOfFunctions) {
    if (function.getName().toLowerCase().contains(row(2).asInstanceOf[String].toLowerCase())) {
        _check = _check.function(row(3))
    } else {
        println("Not a match")
    }
}
Here is the error that I'm getting:
<console>:38: error: value function is not a member of com.amazon.deequ.checks.Check
if( function.getName().toLowerCase().contains(row(2).asInstanceOf[String].toLowerCase())) {_check = _check.function(row(3))
You can either use runtime reflection or build a thin translation layer between your database and the Deequ declarations.
I would suggest you go with translating the database constraint/check strings explicitly to Deequ declarations, e.g.:
if (constraint == "hasSize") {
    // as a Constraint
    Constraint.sizeConstraint(_ <= 10)
    // as a Check
    Check(CheckLevel.Error, "name").hasSize(_ <= 10)
}

Slick insert not working while trying to return inserted row

My goal here is to retrieve the Board entity upon insert. If the entity exists, then I just want to return the existing object (which coincides with the argument of the add method); otherwise I'd like to return the new row inserted in the database.
I am using Play 2.7 with Slick 3.2 and MySQL 5.7.
The implementation is based on this answer which is more than insightful.
Also, from Essential Slick:

exec(messages returning messages +=
    Message("Dave", "So... what do we do now?"))
DAO code
@Singleton
class SlickDao @Inject()(db: Database, implicit val playDefaultContext: ExecutionContext) extends MyDao {
    override def add(board: Board): Future[Board] = {
        val insert = Boards
            .filter(b => b.id === board.id).exists.result.flatMap { exists =>
                if (!exists) Boards returning Boards += board
                else DBIO.successful(board) // no-op - return specified board
            }.transactionally
        db.run(insert)
    }
}
EDIT: also tried replacing the += part with
Boards returning Boards.map(_.id) into { (b, boardId) => b.copy(id = boardId) } += board
and this does not work either
The table definition is the following:
object Board {
    val Boards: TableQuery[BoardTable] = TableQuery[BoardTable]

    class BoardTable(tag: Tag) extends Table[BoardRow](tag, "BOARDS") {
        // columns
        def id = column[String]("ID", O.Length(128))
        def x = column[String]("X")
        def y = column[Option[Int]]("Y")
        // foreign key definitions
        // .....
        // primary key definitions
        def pk = primaryKey("PK_BOARDS", (id, y))
        // default projection
        def * = (id, x, y).mapTo[BoardRow]
    }
}
I would expect that there would be a new row in the table, but although the exists query gets executed

select exists(select `ID`, `X`, `Y`
    from `BOARDS`
    where (`ID` = '92f10c23-2087-409a-9c4f-eb2d4d6c841f'));

and the result is false, there is no insert.
There is no logging in the database either that any insert statements were received (I am referring to the general_log file).
So first of all, the problem with the query execution was a mishandling of the futures that the DAO produced: I was assigning the insert statement to a future, but this future was never submitted to an execution context. My bad, even more so because I did not mention it in the description of the problem.
But once this was fixed, I could see the actual error in the logs of my application. The stack trace was the following:
slick.SlickException: This DBMS allows only a single column to be returned from an INSERT, and that column must be an AutoInc column.
    at slick.jdbc.JdbcStatementBuilderComponent$JdbcCompiledInsert.buildReturnColumns(JdbcStatementBuilderComponent.scala:67)
    at slick.jdbc.JdbcActionComponent$ReturningInsertActionComposerImpl.x$17$lzycompute(JdbcActionComponent.scala:659)
    at slick.jdbc.JdbcActionComponent$ReturningInsertActionComposerImpl.x$17(JdbcActionComponent.scala:659)
    at slick.jdbc.JdbcActionComponent$ReturningInsertActionComposerImpl.keyColumns$lzycompute(JdbcActionComponent.scala:659)
    at slick.jdbc.JdbcActionComponent$ReturningInsertActionComposerImpl.keyColumns(JdbcActionComponent.scala:659)
So this is a MySQL thing at its core. I had to redesign my schema to make this retrieval-after-insert possible. The redesign includes introducing a dedicated primary key (completely unrelated to the business logic) which is also an AutoInc column, as the stack trace prescribes.
In the end the solution became too involved, so I decided instead to use the actual argument of the add method as the return value if the insert was successful. The implementation of the add method ended up being something like this:
override def add(board: Board): Future[Board] = {
    db.run(Boards.insertOrUpdate(board).map(_ => board))
}
while there was some appropriate Future error handling in the controller which was invoking the underlying repo.
If you're lucky enough not to be using MySQL with Slick, I suppose you might be able to do this without a dedicated AutoInc primary key. If not, then I suppose this is a one-way road.

Allowing nulls on a record type in F#

I'm trying to make a type extension for MongoDB C# Driver that will return an option type instead of null when trying to execute a query that yields 0 results.
I've run into a few issues on the way, but right now there is just one thing in the way.
Here is the code:
[<Extension>]
type Utils () =
    [<Extension>]
    static member inline tryFindOne(x: MongoCollection<'T>, query) =
        match x.FindOne(query) with
        | null -> None
        | value -> Some value

[<CLIMutable>]
type Entity =
    { Id : ObjectId; Name : string }
    static member Create(name) =
        { Id = ObjectId(); Name = name }
The problem, of course, is that to the F# compiler the record type Entity does not conform to the type constraint of the extension method ('T : null), but I have to have the constraint to be able to pattern match against nulls. It's sort of nonsensical, because the type Entity is very much "nullable" for interop purposes and will be returned as null whenever you query a MongoDB collection and get 0 results. I tried to set the [<AllowNullLiteral>] attribute, but unfortunately it only works with classes. So alas, I'm stuck. I could make Entity into a class instead, but I think records are more idiomatic F#.
I think the following should work:
[<Extension>]
type Utils () =
    [<Extension>]
    static member inline tryFindOne(x: MongoCollection<'T>, query) =
        let theOne = x.FindOne(query)
        if (box theOne = null) then None else Some(theOne)
I have borrowed the idea from Sergey Tihon's post here:
https://sergeytihon.wordpress.com/2013/04/10/f-null-trick/
An alternative way to do the null check without boxing:
[<Extension>]
type Utils () =
    [<Extension>]
    static member inline tryFindOne(x: MongoCollection<Entity>, query) =
        let theOne = x.FindOne(query)
        if obj.ReferenceEquals(theOne, null) then None else Some(theOne)