Alfresco webscript Scala controller not being executed

I am experimenting with writing a Scala webscript controller in Alfresco, and I have the following Maven dependencies:
<!--scala dependencies -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-compiler</artifactId>
    <version>LATEST</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>LATEST</version>
</dependency>
<dependency>
    <groupId>com.typesafe.scala-logging</groupId>
    <artifactId>scala-logging-slf4j_2.11</artifactId>
    <version>2.1.2</version>
</dependency>
And the following profile (enabled):
<!--Scala profile-->
<profile>
    <id>include-scala</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <version>2.14.3</version>
                <configuration>
                    <charset>UTF-8</charset>
                    <jvmArgs>
                        <jvmArg>-Xmx1024m</jvmArg>
                    </jvmArgs>
                </configuration>
                <executions>
                    <execution>
                        <id>compile</id>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                        <phase>compile</phase>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                        <phase>test-compile</phase>
                    </execution>
                    <execution>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>
The controller:
import org.slf4j.LoggerFactory
import com.typesafe.scalalogging.slf4j.Logger
import dk.openesdh.repo.services.cases.CaseService
import dk.openesdh.repo.services.documents.DocumentService
import org.alfresco.service.cmr.repository.{InvalidNodeRefException, NodeRef}
import org.springframework.extensions.webscripts.{Cache, DeclarativeWebScript, Status, WebScriptRequest}

class DocumentCaseContainers(val caseService: CaseService, val documentService: DocumentService) extends DeclarativeWebScript {

  protected override def executeImpl(req: WebScriptRequest, status: Status, cache: Cache): java.util.Map[String, Object] = {
    val templateArgs = req.getServiceMatch.getTemplateVars
    val logger = Logger(LoggerFactory.getLogger(classOf[DocumentCaseContainers]))
    try {
      // Rebuild the document NodeRef from the URI template variables
      val storeType: String = templateArgs.get("store_type")
      val storeId: String = templateArgs.get("store_id")
      val id: String = templateArgs.get("id")
      val docNodeRefStr = s"$storeType://$storeId/$id"
      val documentNode: NodeRef = new NodeRef(docNodeRefStr)

      val caseNodeRef = documentService.getCaseNodeRef(documentNode)
      val caseDocumentNodeRef = caseService.getDocumentsFolder(caseNodeRef)

      // DeclarativeWebScript expects a java.util.Map as the template model
      val model: java.util.Map[String, Object] = new java.util.HashMap[String, Object]()
      model.put("caseNodeRef", caseNodeRef)
      model.put("caseDocumentNodeRef", caseDocumentNodeRef)
      model
    }
    catch {
      case inre: InvalidNodeRefException =>
        logger.error(inre.getMessage)
        null
    }
  }
}
The result is that I get null for the variables in the returned map, and moreover the logger statement isn't printed in the logs.
I am currently using the Alfresco Maven SDK 2.0 beta-4 release and running this from within IntelliJ IDEA.

Related

Is it possible to generate Q classes with Gradle (Kotlin DSL) for Kotlin MongoDB Documents?

I have a project with Maven, Kotlin, QueryDSL, Spring Boot and MongoDB. It works quite well, but I thought that migrating to Gradle could speed up building it. Everything was fine until I began moving the module with QueryDSL. It turned out that I cannot generate Q-classes for Kotlin classes annotated with @Document.
So, is there a way to solve this?
Document example (placed in /src/main/kotlin/com/company, in the kotlin source directory):
package ...

import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document

@Document(collection = "myDocument")
data class MyDocument(
    val smth: String
)
Maven (the part responsible for the generation):
<plugin>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-plugin</artifactId>
<version>${kotlin.version}</version>
<configuration>
<args>
<arg>-Werror</arg>
</args>
<annotationProcessors>
org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
</annotationProcessors>
<compilerPlugins>
<plugin>spring</plugin>
</compilerPlugins>
</configuration>
<dependencies>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-noarg</artifactId>
<version>${kotlin.version}</version>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-allopen</artifactId>
<version>${kotlin.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<id>compile</id>
<phase>compile</phase>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/main/java</sourceDir>
</sourceDirs>
</configuration>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>kapt</id>
<goals>
<goal>kapt</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/main/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
<execution>
<id>test-compile</id>
<phase>test-compile</phase>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
</sourceDirs>
</configuration>
<goals>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>
For Gradle + Kotlin, AFAIU we have to use kapt to generate the Q-classes, like this:
kapt("com.querydsl:querydsl-apt:4.2.1:jpa")
but it does not work for me. My new build.gradle.kts:
import org.jetbrains.kotlin.gradle.plugin.KotlinSourceSet
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

plugins {
    id("org.springframework.boot") version "2.2.0.RELEASE"
    id("io.spring.dependency-management") version "1.0.8.RELEASE"
    kotlin("jvm") version "1.3.50"
    kotlin("kapt") version "1.3.50"
    kotlin("plugin.jpa") version "1.3.50"
    id("org.jetbrains.kotlin.plugin.spring") version "1.3.21"
}

apply(plugin = "kotlin")
apply(plugin = "kotlin-kapt")
apply(plugin = "kotlin-jpa")
apply(plugin = "org.springframework.boot")
apply(plugin = "io.spring.dependency-management")

group = "com.example"
version = "0.0.1-SNAPSHOT"
java.sourceCompatibility = JavaVersion.VERSION_1_8

repositories {
    mavenCentral()
}

dependencies {
    implementation("org.springframework.boot:spring-boot-starter-data-jpa")
    implementation("com.querydsl:querydsl-jpa")
    implementation("com.querydsl:querydsl-apt")
    kapt("com.querydsl:querydsl-apt:4.2.1:jpa")
    kapt("org.springframework.boot:spring-boot-starter-data-jpa")
    kapt("org.springframework.boot:spring-boot-configuration-processor")
    kapt("org.springframework.data:spring-data-mongodb:2.2.0.RELEASE")
    implementation("org.springframework.boot:spring-boot-starter-data-mongodb")
    implementation("org.jetbrains.kotlin:kotlin-reflect")
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
    testImplementation("org.springframework.boot:spring-boot-starter-test") {
        exclude(group = "org.junit.vintage", module = "junit-vintage-engine")
    }
}

//sourceSets { main["kotlin"].srcDirs += [generated] }
//val querydslSrcDir = "src/main/generated"

tasks.withType<Test> {
    useJUnitPlatform()
}

tasks.withType<KotlinCompile> {
    kotlinOptions {
        freeCompilerArgs = listOf("-Xjsr305=strict")
        jvmTarget = "1.8"
    }
}
In Maven I can specify the annotation processor precisely (org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor), but in Gradle I cannot figure out how to achieve this.
You should add implementation("com.querydsl:querydsl-mongodb") and kapt("com.querydsl:querydsl-apt") to the dependencies section.
Then add the following after the dependencies section:
kapt {
annotationProcessor("org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor")
}
Also, don't forget to remove the JPA dependencies.
This is a working example I created.
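Put together, the relevant build.gradle.kts pieces after those changes might look like the sketch below (versions are resolved by the Spring dependency-management plugin here, as in the question; pin them explicitly if you prefer):
dependencies {
    implementation("org.springframework.boot:spring-boot-starter-data-mongodb")
    // QueryDSL for MongoDB, with its annotation processor run through kapt
    implementation("com.querydsl:querydsl-mongodb")
    kapt("com.querydsl:querydsl-apt")
}

kapt {
    // Spring Data's MongoDB-aware processor generates Q-classes for @Document classes
    annotationProcessor("org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor")
}
The JPA-flavoured dependencies from the original build (querydsl-jpa and querydsl-apt:4.2.1:jpa) are the ones to drop.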

java.lang.NoSuchFieldError: ajc$cflowCounter$0

Hi, I am using the AspectJ Maven plugin and have woven the classes successfully at compile time; however, I am getting the following issue:
java.lang.NoSuchFieldError: ajc$cflowCounter$0
The pointcut is as follows:
@Pointcut("execution(* *(..)) && cflowbelow(execution(* com.x.*..*(..)))")
pom.xml
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.11</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.runtime.version}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj.runtime.version}</version>
</dependency>
</dependencies>
<configuration>
<complianceLevel>${maven.compiler.target}</complianceLevel>
<source>${maven.compiler.target}</source>
<target>${maven.compiler.target}</target>
<showWeaveInfo>true</showWeaveInfo>
<verbose>true</verbose>
<Xlint>ignore</Xlint>
<encoding>${project.build.sourceEncoding}</encoding>
<forceAjcCompile>true</forceAjcCompile>
<sources />
<weaveDirectories>
<weaveDirectory>${project.build.directory}/classes</weaveDirectory>
</weaveDirectories>
</configuration>
<executions>
<execution>
<phase>process-classes</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
Error log:
SEVERE: Exception sending context initialized event to listener instance of class [org.springframework.web.context.ContextLoaderListener]
java.lang.NoSuchFieldError: ajc$cflowCounter$0
at com.x.util.PSMVPropertiesUtil.processProperties(PSMVPropertiesUtil.java)
at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:86)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:284)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:164)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:693)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:531)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:409)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:291)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:103)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4745)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5207)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1419)
So PSMVPropertiesUtil is loaded and picked up at the start of the application:
protected void processProperties(ConfigurableListableBeanFactory beanFactory, Properties props) throws BeansException {
    super.processProperties(beanFactory, props);
    propertiesMap = new HashMap<String, String>();
    for (Object key : props.keySet()) {
        String keyStr = key.toString();
        String valueStr = resolvePlaceholder(keyStr, props, springSystemPropertiesMode);
        propertiesMap.put(keyStr, valueStr);
    }
}
What is causing this? How to solve this issue?

Webjars not working on vert.x application

What is the proper way to configure WebJars in a Vert.x application? I have this simple app:
class WebVerticle : CoroutineVerticle() {
    override suspend fun start() {
        val router = Router.router(vertx)

        // Serve static resources (by default from the webroot directory on the classpath)
        router.route("/").handler(StaticHandler.create())

        val json = ConfigRetriever.create(vertx).getConfigAwait()
        val port = json.getInteger("port")
        try {
            vertx.createHttpServer().requestHandler(router).listenAwait(port)
            println("HTTP server started on port $port - redeploy enabled")
        } catch (ex: Exception) {
            error("Could not spawn web server at port $port")
        }
    }
}
pom.xml
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-web</artifactId>
<version>${vertx.version}</version>
</dependency>
<dependency>
<groupId>org.webjars</groupId>
<artifactId>vue</artifactId>
<version>2.5.16</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.fabric8</groupId>
<artifactId>vertx-maven-plugin</artifactId>
<version>1.0.13</version>
<executions>
<execution>
<id>vmp</id>
<goals>
<goal>package</goal>
<goal>initialize</goal>
</goals>
</execution>
</executions>
<configuration>
<redeploy>true</redeploy>
</configuration>
</plugin>
</plugins>
</build>
The file structure is as follows:
src/main/resources/webroot
| index.html
When I hit localhost:8888 it works, but localhost:8888/vue/vue.js does not.
Is there something else that I need to configure?
Change the router configuration to:
router.route().handler(StaticHandler.create("META-INF/resources"))
Then point your browser to:
http://localhost:8888/webjars/vue/2.5.16/vue.js
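If you also want to keep serving index.html from webroot, one possible variation (a sketch, not part of the answer above) is to mount a dedicated route for the webjar contents next to the default handler inside start():
// Webjar contents live under META-INF/resources/webjars inside the webjar artifacts
router.route("/webjars/*").handler(StaticHandler.create("META-INF/resources/webjars"))
// The default handler still serves src/main/resources/webroot (index.html)
router.route().handler(StaticHandler.create())
With either setup, note that the webjar version is part of the path, so the script ends up at /webjars/vue/2.5.16/vue.js rather than /vue/vue.js.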

Kafka: Error serializing Avro message with Schema Registry

I'm trying to send ProducerRecords of my custom type to Kafka, but I'm getting the error:
Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.lang.IllegalArgumentException: Unsupported Avro type. Supported types are null, Boolean, Integer, Long, Float, Double, String, byte[] and IndexedRecord
I set up the schemas in the Schema Registry:
GET
http://localhost:8081/subjects/documentCreations-key/versions/3
Response:
{
  "subject": "documentCreations-key",
  "version": 3,
  "id": 1,
  "schema": "\"string\""
}
GET
http://localhost:8081/subjects/documentCreations-value/versions/4
Response:
{
  "subject": "documentCreations-value",
  "version": 4,
  "id": 23,
  "schema": "{\"type\":\"record\",\"name\":\"Document\",\"namespace\":\"com.bade\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"},{\"name\":\"path\",\"type\":\"string\"}]}"
}
Here is my Scala class:
class Document(val name: java.lang.String,
val title: java.lang.String,
val path: java.lang.String)
And the part with KafkaProducer:
import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord, RecordMetadata}

class MyKafkaProducer {
  val props = new Properties()
  props.put("bootstrap.servers", "localhost:9092")
  props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer")
  props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer")
  props.put("schema.registry.url", "http://localhost:8081")

  private val producer = new KafkaProducer[java.lang.String, Document](props)

  def sendCreateDocumentMessage(document: Document): RecordMetadata = {
    val documentRecord = new ProducerRecord[java.lang.String, Document](SharedConfig.documentCreationsTopic, document.name, document)
    producer.send(documentRecord).get()
  }
}
What am I missing? I see that I can implement SpecificRecord for my class, but I didn't see that as necessary in the books/tutorials that I've been reading.
Thanks!
EDITED: Fixed class name
Answering my own question. Apparently, (de)serialization is not done automatically (via reflection or something); you have to generate the class from the Avro schema file. Posting my pom.xml in case it is helpful to someone:
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<!--force java 8-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.3.1</version>
</plugin>
<plugin>
<!-- Build an executable JAR -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.0.2</version>
<configuration>
<archive>
<manifest>
<mainClass>Main</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.avro</groupId>
<artifactId>avro-maven-plugin</artifactId>
<version>${avro.version}</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>schema</goal>
<goal>protocol</goal>
<goal>idl-protocol</goal>
</goals>
<configuration>
<sourceDirectory>src/main/avro
</sourceDirectory>
</configuration>
</execution>
</executions>
</plugin>
<!--force discovery of generated classes-->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<id>add-source</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>target/generated-sources/avro</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>confluent</id>
<url>http://packages.confluent.io/maven/</url>
</repository>
</repositories>
<properties>
<kafka.version>1.0.0</kafka.version>
<confluent.version>4.0.0</confluent.version>
<avro.version>1.8.2</avro.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.12</artifactId>
<version>${kafka.version}</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>${kafka.version}</version>
</dependency>
<dependency>
<groupId>io.confluent</groupId>
<artifactId>kafka-avro-serializer</artifactId>
<version>${confluent.version}</version>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>${avro.version}</version>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro-maven-plugin</artifactId>
<version>${avro.version}</version>
</dependency>
</dependencies>
I build it with the following mvn command:
mvn clean:clean avro:schema compiler:compile scala:compile jar:jar
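For reference, the Avro schema file that the avro-maven-plugin picks up from src/main/avro (the file name, e.g. document.avsc, is up to you) is just the registered value schema shown above, pretty-printed:
{
  "type": "record",
  "name": "Document",
  "namespace": "com.bade",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "path", "type": "string"}
  ]
}
The schema goal then generates a com.bade.Document class extending SpecificRecordBase, which satisfies the IndexedRecord requirement mentioned in the error message above.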

jOOQ code generation Keys.java member ordering

For the given jOOQ Maven code generation:
<profile>
<id>code-generation</id>
<build>
<plugins>
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>${jooq.version}</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.4-1200-jdbc41</version>
</dependency>
</dependencies>
<configuration>
<jdbc>
<driver>org.postgresql.Driver</driver>
<url>${postgres.connection_string}</url>
<user>${postgres.user}</user>
<password></password>
</jdbc>
<generator>
<name>org.jooq.util.JavaGenerator</name>
<database>
<name>org.jooq.util.postgres.PostgresDatabase</name>
<includes>.*</includes>
<excludes></excludes>
<inputSchema>public</inputSchema>
</database>
<target>
<packageName>${packagename_123}</packageName>
<directory>src/main/java/</directory>
</target>
</generator>
</configuration>
</plugin>
</plugins>
</build>
</profile>
The generated Keys.java contains the following on my local developer machine:
public static final UniqueKey<BillItemRecord> BILL_ITEM_PKEY = UniqueKeys0.BILL_ITEM_PKEY;
public static final UniqueKey<BillingHistoryRecord> BILLING_HISTORY_PKEY = UniqueKeys0.BILLING_HISTORY_PKEY;
On my CI machine, the same Keys.java file has the two lines sorted in the opposite order:
public static final UniqueKey<BillingHistoryRecord> BILLING_HISTORY_PKEY = UniqueKeys0.BILLING_HISTORY_PKEY;
public static final UniqueKey<BillItemRecord> BILL_ITEM_PKEY = UniqueKeys0.BILL_ITEM_PKEY;
Why are these two lines being sorted differently?