R2DBC and enum (PostgreSQL)

Update 15/08/2020: It looks like enum support was added on Jun 16 (see the R2DBC commit).
Does R2DBC support PostgreSQL enums? I checked their Git page, but it doesn't mention anything about it. If it does, how can enums be used (INSERT, SELECT)?
Let's say we have a PostgreSQL enum:
CREATE TYPE mood AS ENUM ('UNKNOWN', 'HAPPY', 'SAD', ...);
and a Java class:
@Data
public class Person {
    private String name;
    private Mood mood;
    // ...
    enum Mood { UNKNOWN, HAPPY, SAD, ... }
}
I tried:
// insert
var person = ...;
client.insert()
    .table("people")
    .using(person)
    .then()
    .subscribe(System.out::println);

// select
var query = "SELECT * FROM people";
client.execute(query)
    .as(Person.class)
    .fetch().all()
    .subscribe(System.out::println);
But I'm getting error messages:
# on insert
WARN [reactor-tcp-epoll-1] (Loggers.java:294) - Error: SEVERITY_LOCALIZED=ERROR, SEVERITY_NON_LOCALIZED=ERROR, CODE=42804, MESSAGE=column "mood" is of type mood but expression is of type character varying, HINT=You will need to rewrite or cast the expression., POSITION=61, FILE=parse_target.c, LINE=591, ROUTINE=transformAssignedExpr
# on select
ERROR [reactor-tcp-epoll-1] (Loggers.java:319) - [id: 0x8581acdb, L:/127.0.0.1:39726 ! R:127.0.0.1/127.0.0.1:5432] Error was received while reading the incoming data. The connection will be closed.
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.data.mapping.MappingException: Could not read property private ...
I found a similar post, but had no luck solving my problem with it; maybe I was applying it wrong.
Any help or tips are welcome.

Tested with org.springframework.data:spring-data-r2dbc:1.0.0.RELEASE and io.r2dbc:r2dbc-postgresql:0.8.1.RELEASE.
This is a Kotlin version.
Define an enum class:
enum class Mood {
    UNKNOWN,
    HAPPY,
    SAD
}
Create a custom codec:
// imports for io.r2dbc:r2dbc-postgresql 0.8.x
import io.netty.buffer.ByteBuf
import io.netty.buffer.ByteBufAllocator
import io.r2dbc.postgresql.client.Parameter
import io.r2dbc.postgresql.codec.Codec
import io.r2dbc.postgresql.message.Format
import io.r2dbc.postgresql.util.ByteBufUtils

class MoodCodec(private val allocator: ByteBufAllocator) : Codec<Mood> {

    override fun canEncodeNull(type: Class<*>): Boolean = false

    override fun canEncode(value: Any): Boolean = value is Mood

    override fun encode(value: Any): Parameter {
        return Parameter(Format.FORMAT_TEXT, oid) {
            ByteBufUtils.encode(allocator, (value as Mood).name)
        }
    }

    override fun canDecode(dataType: Int, format: Format, type: Class<*>): Boolean = dataType == oid

    override fun decode(buffer: ByteBuf?, dataType: Int, format: Format, type: Class<out Mood>): Mood? {
        buffer ?: return null
        return Mood.valueOf(ByteBufUtils.decode(buffer))
    }

    override fun type(): Class<*> = Mood::class.java

    override fun encodeNull(): Parameter =
        Parameter(Format.FORMAT_TEXT, oid, Parameter.NULL_VALUE)

    companion object {
        // Get from `select oid from pg_type where typname = 'mood'`
        private const val oid = YOUR_ENUM_OID
    }
}
Register the codec
You may need to change runtimeOnly("io.r2dbc:r2dbc-postgresql") to implementation("io.r2dbc:r2dbc-postgresql") so that the driver's classes are available at compile time.
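For reference, a minimal sketch of that change in build.gradle.kts (an assumption-based example; dependency versions are taken to come from Spring Boot's dependency management):

// build.gradle.kts
dependencies {
    // implementation instead of runtimeOnly, so that Codec, Parameter, etc.
    // are visible to MoodCodec at compile time
    implementation("io.r2dbc:r2dbc-postgresql")
}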
@Configuration
@EnableR2dbcRepositories
class AppConfig : AbstractR2dbcConfiguration() {
    override fun connectionFactory(): ConnectionFactory = PostgresqlConnectionConfiguration.builder()
        .port(5432) // Add your config here.
        .codecRegistrar { _, allocator, registry ->
            registry.addFirst(MoodCodec(allocator))
            Mono.empty()
        }.build()
        .let { PostgresqlConnectionFactory(it) }
}
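With the codec registered, the INSERT and SELECT from the question should work unchanged. A minimal Kotlin usage sketch (Person, client, and the people table are assumptions mirroring the question's setup):

// Sketch only: Person and client (DatabaseClient) mirror the question.
val person = Person(name = "Alice", mood = Mood.HAPPY)

// insert: MoodCodec encodes Mood as the Postgres enum type
client.insert()
    .table("people")
    .using(person)
    .then()
    .subscribe()

// select: MoodCodec decodes the mood column back into Mood
client.execute("SELECT * FROM people")
    .`as`(Person::class.java)
    .fetch().all()
    .subscribe { println(it) }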

I used the approach below for Spring Boot 2.6.4 + r2dbc-postgresql 0.8.11, adding a customizer rather than creating the connection factory myself.
Thanks @Hantsy for pointing out EnumCodec. I added it via a customizer so it plays nicely with the existing autoconfiguration. Also, Spring Data kept converting my enum to a string until I added the converter.
Hopefully these can provide a little help to others.
Step 1: Register EnumCodec with a connection factory builder customizer, as an extension
It is possible to register multiple enums; just repeat the withEnum() call (see the comment in the code below).
/**
 * Use the customizer to add EnumCodec to R2DBC
 */
@Bean
public ConnectionFactoryOptionsBuilderCustomizer connectionFactoryOptionsBuilderCustomizer() {
    return builder -> {
        builder.option(Option.valueOf("extensions"),
                List.of(EnumCodec.builder()
                        .withEnum("enum_foo", FooEnum.class)
                        .withRegistrationPriority(RegistrationPriority.FIRST)
                        .build()));
        logger.info("Adding enum to R2DBC postgresql extensions: {}", builder);
    };
}
Step 2: Implement a Spring Data converter by extending EnumWriteSupport
public class FooWritingConverter extends EnumWriteSupport<Foo> {
}
Step 3: Register converters so that Spring Data won't always convert enums to strings.
This step is a slightly enhanced version of R2dbcDataAutoConfiguration in the spring-boot-autoconfigure project.
/**
 * Register converters to make sure Spring Data treats enums correctly
 */
@Bean
public R2dbcCustomConversions r2dbcCustomConversions(DatabaseClient databaseClient) {
    logger.info("Apply R2DBC custom conversions");
    R2dbcDialect dialect = DialectResolver.getDialect(databaseClient.getConnectionFactory());
    List<Object> converters = new ArrayList<>(dialect.getConverters());
    converters.addAll(R2dbcCustomConversions.STORE_CONVERTERS);
    return new R2dbcCustomConversions(
            CustomConversions.StoreConversions.of(dialect.getSimpleTypeHolder(), converters),
            List.of(
                    new FooWritingConverter()
            ));
}
Steps 1 and 3 can be added to your application class or any other valid configuration class.

Check my article about Postgres-specific features supported in R2DBC.
There are two options:
1. Use a custom Postgres enum type with a Java enum type, and register EnumCodec in the connection factory builder.
2. Use a textual data type (such as varchar) with a Java enum type; Spring Data R2DBC will convert between them directly, as sketched below.
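A minimal Kotlin sketch of the second option (an assumption-based example: the people table's mood column is declared as varchar rather than the Postgres enum type, and Person mirrors the question's class):

import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table

// Assumes mood is declared as varchar in the table, not as the Postgres enum type.
// Spring Data R2DBC's default conversions then write Mood.name as text and read
// it back into the enum, with no custom codec or EnumCodec required.
@Table("people")
data class Person(
    @Id val id: Long? = null,
    val name: String,
    val mood: Mood
)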

Related

Typescript - Get uninitialized properties after compilation

I am currently writing a wrapper around socket.io. Coming from a very object-oriented background, I want to implement the concept of models in my framework/wrapper.
If you happen to know socket.io, you might know that you get the data associated with an event as a parameter. I have implemented a custom routing system where the route handler receives the data in an express.js-like request object.
The idea is to have model classes that look something like this:
class XRequestModel {
    @v.String({ message: 'The username must be a string!' })
    public userName: string;
}
And the route event might look something like this:
@RouteConfig({ route: '/something', model: XRequestModel })
class XEvent extends Route {
    public on(req: Request<XRequestModel>, res: Response) {
        // Handle Event
    }
}
And to complete the example, here is what the request object might look like:
class Request<T> {
    public data: T;
}
Now, generics in TypeScript are very limited: since type information is removed after compilation, I cannot use the generic Request parameter (which is the type of the model) to get metadata from the model. The metadata, in this case, is the validation decorator. To overcome this issue, I pass a reference to the model class to the RouteConfig of the RouteEvent, which is used internally and allows me to create instances of the model, get its properties, and so on...
The idea here is to give the handler of a route a request object with pre-validated, typesafe data.
What is holding me back is the fact that uninitialized properties get removed by TypeScript after compilation, so I cannot get the metadata of the model. Initializing the class properties would solve this:
class XRequestModel {
    @v.String({ message: 'The username must be a string!' })
    public userName: string = '';
}
But I think this makes for some very verbose syntax, and I don't want to force the user of this wrapper to initialize all the model properties.
An implementation side note:
The user of the framework has to register the classes with a 'main' class, and from there I can get the Route class via decorator reflection.
When I try to get the properties of the model without initialized properties (the first model example):
// Here route.config.model refers to the model from the RouteConfig
Object.getOwnPropertyNames(new route.config.model());
>>> []
Here is what I get with initialized properties:
Object.getOwnPropertyNames(new route.config.model());
>>> [ 'userName' ]
Here is a link to the GitHub repository: https://github.com/FetzenRndy/SRocket
Note that models are not implemented in this repo yet.
Basically, my question is: how can I get the properties of a class that has uninitialized properties after compilation?
The problem is that if no initialization happens, no code is emitted for the fields, so at runtime the field does not exist on the object until a value is assigned to it.
The simplest solution would be to initialize all fields, even if you do so with just null:
class XRequestModel {
    public userName: string = null;
    public name: string = null;
}

var keys = Object.getOwnPropertyNames(new XRequestModel());
console.log(keys); // [ 'userName', 'name' ]
If this is not a workable solution for you, you can create a decorator that adds to a static field on the class, and then walk up the prototype chain to collect all fields:
function Prop(): PropertyDecorator {
    return (target: Object, propertyKey: string): void => {
        let props: string[];
        if (target.hasOwnProperty("__props__")) {
            props = (target as any)["__props__"];
        } else {
            props = (target as any)["__props__"] = [];
        }
        props.push(propertyKey);
    };
}

class XRequestModelBase {
    @Prop()
    public baseName: string;
}

class XRequestModel extends XRequestModelBase {
    @Prop()
    public userName: string;

    @Prop()
    public name: string;
}

function getAllProps(cls: new (...args: any[]) => any): string[] {
    let result: string[] = [];
    let prototype = cls.prototype;
    while (prototype != null) {
        let props: string[] = prototype["__props__"];
        if (props) {
            result.push(...props);
        }
        prototype = Object.getPrototypeOf(prototype);
    }
    return result;
}

var keys = getAllProps(XRequestModel);
console.log(keys);

Neo4j 3.0.3 Stored procedures in Scala

Is there any sample Scala code available for creating stored procedures in Neo4j 3.0.3?
I have been trying to create a simple Scala-based stored procedure. Below is the error message I get when I copy my Scala JAR file to the neo4j-plugins directory and start the Neo4j server:
=================
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.impl.proc.Procedures@1ac0223' was successfully initialized, but failed to start. Please see attached cause exception.
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:444)
at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:107)
at org.neo4j.kernel.impl.factory.GraphDatabaseFacadeFactory.newFacade(GraphDatabaseFacadeFactory.java:140)
... 10 more
Caused by: org.neo4j.kernel.api.exceptions.ProcedureException: Unable to find a usable public no-argument constructor in the class `neoscala`. Please add a valid, public constructor, recompile the class and try again.
=================
The Scala class that I have used is:
package neoproc

import org.neo4j.graphdb.GraphDatabaseService
import org.neo4j.procedure.Procedure
import javax.ws.rs.core.{Context, Response}

class neoscala(@Context db: GraphDatabaseService) {
  @Procedure
  def alice(): String = {
    String.valueOf(db.execute("MATCH (n:User) return n"))
  }
}
Your Scala class declares a constructor with a GraphDatabaseService argument, and the exception tells you that it only wants a no-argument constructor.
It's documented in both
the user documentation:
Only static fields and @Context-annotated fields are allowed in Procedure classes.
the Javadoc:
The procedure method itself can contain arbitrary Java code - but in order to work with the underlying graph, it must have access to the graph API. This is done by declaring fields in the procedure class, and annotating them with the Context annotation. Fields declared this way are automatically injected with the requested resource. This is how procedures gain access to APIs to do work with.
All fields in the class containing the procedure declaration must either be static; or it must be public, non-final and annotated with Context.
Apparently it's not possible to create a class with a public field in Scala, so you'll have to create a parent Java class with the public field, and extend it with your Scala class:
// ProcedureAdapter.java
public abstract class ScalaProcedureAdapter {
    @Context
    public GraphDatabaseService db;
}

// neoscala.scala
class neoscala extends ScalaProcedureAdapter {
    // ...
}
Here is the solution for this:
We will create the class in Scala:
class FullTextIndex extends JavaHelper {

  @Procedure("example.search")
  @PerformsWrites
  def search(@Name("label") label: String,
             @Name("query") query: String): Stream[SearchHit] = {
    // declare your method
    val nodes: Stream[Node] = db.index.forNodes(indexName(label)).query(query).stream
    val newFunction: java.util.function.Function[Node, SearchHit] = (node: Node) => new SearchHit(node)
    nodes.map {
      newFunction
    }
  }

  private def indexName(label: String): String = {
    "label-" + label
  }
}
Procedures in Neo4j always return their result as a Stream, a Java 8 feature, so we will also use a Java class to return the final result and to define the public fields.
We will create a Java class for the result:
public class JavaHelper {

    @Context
    public GraphDatabaseService db;

    @Context
    public Log log;

    public static class SearchHit {
        // your result code here
    }
}
You can refer to the Knoldus blog post on Neo4j user-defined procedures for creating and storing Neo4j procedures with Scala. There you will also find sample code in a GitHub repository.

Automatically wrapping/converting JavaBeans into case classes

We are using Kryo to communicate between a Scala application and a Java application. Since the class definitions have to be used from Java (and we don't want to include the Scala library as a dependency in the Java application), we are using JavaBeans to define the transfer objects.
However, using JavaBeans directly in Scala is a bit of a hassle: no pattern matching, having to use new, etc. What we're doing right now is defining extractors and apply methods in separate objects on the Scala side to make these classes nicer to work with.
Since most of what we need is boilerplate, we are wondering if there is a way of doing this automatically. For example, we have this JavaBean (there are about 20+ different message types):
public class HandshakeRequest extends Request {
    private String gatewayId;
    private String requestId;
    private Date timestamp = new Date();

    public HandshakeRequest(String gatewayId, String requestId) {
        this.gatewayId = gatewayId;
        this.requestId = requestId;
    }

    public String getGatewayId() { return gatewayId; }
    public String getRequestId() { return requestId; }
    public Date getTimestamp() { return timestamp; }

    private HandshakeRequest() { /* For Kryo */ }
}
This is an example of the object we use to bridge to Scala:
object Handshake {
  def unapply(msg: HandshakeRequest): Option[(DateTime, String, String)] = {
    (
      new DateTime(msg.getTimestamp.getTime),
      msg.getRequestId,
      msg.getGatewayId
    ).some
  }

  def apply(gatewayId: String, requestId: String) = new HandshakeRequest(gatewayId, requestId)
}
Since all of our objects have the timestamp, it is also part of the boilerplate. We'd like some way (perhaps a macro?) to automatically generate the unapply and apply methods (and ideally, the whole object itself).
Does anyone know of an easy way to accomplish this?
Since there were no answers, I came up with this: https://github.com/yetu/scala-beanutils.
I've started a project, https://github.com/limansky/beanpuree, which allows conversions from beans to case classes. I'm going to add some more features, like automatic type conversion between Java number classes and Scala classes.

How to support embedded maps (with custom value types) in MongoDB GORM?

I would like to have an embedded document referred to by a map (as in class A below). The environment is Grails + GORM + MongoDB.
Is that possible, and if yes, how?
class A { // fails with IllegalArgumentException occurred when processing request: can't serialize class X in line 234 of org.bson.BasicBSONEncoder
    static mapWith = "mongo"
    Map<String, X> map = new HashMap<String, X>()
}

class B { // works
    static mapWith = "mongo"
    List<X> list = new ArrayList<X>()
}

class C { // works with primitive type values
    static mapWith = "mongo"
    Map<String, String> map = new HashMap<String, String>()
}

class X {
    String data
    public X(String data) {
        this.data = data
    }
}
The embedding works perfectly, as Art Hanzel advised.
However, your problem comes from the fact that you try to use Map genericity as a sort of constraint:
Map<String, X>
The problem is that Grails doesn't cope well with this syntax, first because Groovy doesn't enforce genericity.
However, the MongoDB plugin offers a very powerful feature that lets you define custom types as domain class properties: see here.
In your case you could have:
class A {
    static mapWith = "mongo"
    MyClass map = new MyClass()
}
Then, in your src/java folder for example, you could implement a
class MyClass extends HashMap<String, X> { }
Then, of course, you have to define a special AbstractMappingAwareCustomTypeMarshaller to specify how to read and write the property in the DB.
An additional step could also be to add a custom validator to class A to check the validity of data...
The MongoDB Grails plugin documentation describes how to make embedded documents:
class Foo {
    Address address
    List otherAddresses
    static embedded = ['address', 'otherAddresses']
}
Off the top of my head, you should be able to access these via the object graph. I don't see any reason why you shouldn't.
myFoo.address.myAddressProperty...

Mapping custom types in the ScalaQuery O/R framework

In his comparison of ScalaQuery and Squeryl, Stefan Zeiger (author of ScalaQuery) says in the third bullet-point:
ScalaQuery comes with support for a basic set of JDBC types and can be
extended with DBMS- or application-specific types.
I have been unable to find examples or explanations for how to actually do this, however. I am trying to write a ScalaQuery schema for a Postgres database, in which some columns are of custom enum types that I created within Postgres.
For example, I have an enum type called gender, with possible values male and female. This is NOT a Java enum persisted to the database as an integer. Rather, it is a custom Postgres type defined within the DBMS. Postgres stores those with a special 4-byte data structure rather than as a primitive.
How could I incorporate Postgres columns of type gender into a ScalaQuery schema?
(I would also appreciate comments, if you think a different strongly-typed O/R approach would be better suited for the task. I have already looked at Squeryl, and do not believe it can handle custom types unless they are persisted as primitives in the DBMS.)
import org.scalaquery.ql.{MappedTypeMapper => Mapper}

object TypeMapper {
  type Stamp = java.sql.Timestamp

  val joda2Stamp =
    Mapper.base[JodaTime, Stamp](
      dt => new Stamp(dt.getMillis),
      ts => new JodaTime(ts.getTime))
}
and then, for example, in your DAO (or wherever you run queries), use it:
import TypeMapper._
implicit val j2Stamp = joda2Stamp // applies the conversion automatically
You'll need to experiment to achieve the same for enums and Postgres' enum storage type. I tend not to bother, preferring to go with Java enums and storing them as a primitive type.
For example:
public enum CardType implements ILabel {
    V("Visa"),
    M("MasterCard"),
    D("Discover"),
    A("American Express");

    private CardType(String label) { this.label = label; }

    public String getLabel() { return this.label; }

    final String label;

    public static List<String> asList() {
        return EnumHelper.asList(CardType.class);
    }

    public static Map<String, String> asMap() {
        return EnumHelper.asMap(CardType.class);
    }
}
and then store it as char(1) in the DB, a la Orders.insert(cardType = cardType.toString), or you could create a type mapper for the enum-String conversion and omit the enum.toString on inserts...