Scaladoc: #group tag not showing in API documentation - scala

I'm trying to organise the members of a class in my library API documentation using `@groupname` and `@group` tags, but it doesn't work (I'm using sbt 0.13.11).
My toy build.sbt:
name := "test"

scalaVersion := "2.10.5"
My toy code src/main/scala/test.scala:
package test
/** Test class
  *
  * @groupname print Printer
  * @groupname throw Thrower
  */
class TestClass {
  /** @group print */
  def trivialPrint: Unit = print("Hello")

  /** @group throw */
  def trivialError: Unit = throw new Exception("Hello")
}
Running sbt doc generates API documentation in which both of my functions end up in the generic "Value Members" group of the class (cf. screenshot). What am I doing wrong?

Prior to Scala 2.11 you have to explicitly ask for Scaladoc grouping support in your build.sbt:
name := "test"

scalaVersion := "2.10.5"

scalacOptions += "-groups"
You could scope it with `in (Compile, doc)`, but I'm not sure it matters much.
Like most things related to Scaladoc this is essentially undocumented, but it works.

At least for Scala 2.11.x, it seems like we do still need to ask for it specifically. Consider the following in your build.sbt:
/* Normal scalac options */
scalacOptions := Seq(
  "-deprecation",
  "-Ypartial-unification",
  "-Ywarn-value-discard",
  "-Ywarn-unused-import",
  "-Ywarn-dead-code",
  "-Ywarn-numeric-widen"
)

/* Only invoked when you run `doc` in sbt */
scalacOptions in (Compile, doc) += "-groups"
And then your example as you have it should work.
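For reference, Scaladoc's grouping support also includes `@groupprio` (sort order) and `@groupdesc` (a description rendered under the group heading). A sketch extending the toy class (package declaration omitted so the snippet stands alone; the priorities and descriptions here are just illustrative):

```scala
/** Test class
  *
  * @groupname print Printer
  * @groupprio print 10
  * @groupdesc print Members that write to standard output.
  * @groupname throw Thrower
  * @groupprio throw 20
  */
class TestClass {
  /** @group print */
  def trivialPrint: Unit = print("Hello")

  /** @group throw */
  def trivialError: Unit = throw new Exception("Hello")
}
```

With `-groups` enabled, Scaladoc should then render "Printer" and "Thrower" sections instead of one flat "Value Members" list.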

As per the other answers, the key is the `-groups` flag, which for Maven is passed as `<arg>-groups</arg>`. Here is the Maven version:
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
    <execution>
      <id>Scaladoc</id>
      <goals>
        <goal>doc</goal>
      </goals>
      <phase>prepare-package</phase>
      <configuration>
        <args>
          <arg>-no-link-warnings</arg>
          <arg>-groups</arg>
        </args>
      </configuration>
    </execution>
  </executions>
</plugin>

Related

an openapi client jar, generated for scala-akka by openapi-generator-maven-plugin, causes canBuildFromIterableViewMapLike() not found, under Zeppelin

I'm generating a Scala client jar with org.openapitools:openapi-generator-maven-plugin so that my Zeppelin notebook can execute the sample of code below.
openapi-generator-maven-plugin offers a few generators for Scala, and recommends choosing scala-akka.
It requires producing a rather involved piece of client code:
%spark
import java.util.Map
import fr.ecoemploi.application.etude.swagger.invoker._
import fr.ecoemploi.application.etude.swagger.model.Commune
import fr.ecoemploi.application.etude.swagger.api.CogControllerApi
import fr.ecoemploi.application.etude.swagger.invoker.CollectionFormats._
import fr.ecoemploi.application.etude.swagger.invoker.ApiKeyLocations._

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future
import scala.util.{Failure, Success}

implicit val system = ActorSystem("system")
implicit val executionContext = system.dispatcher
implicit val materializer: ActorMaterializer = ActorMaterializer()

val apiInvoker = ApiInvoker()
val cogService = CogControllerApi("http://localhost:9090")

val request = cogService.obtenirCommunes(2022)
val response = apiInvoker.execute(request)

response.onComplete {
  case Success(ApiResponse(code, content, headers)) =>
    System.out.println(s"Status code: $code")
    System.out.println(s"Response headers: ${headers.mkString(", ")}")
    System.out.println(s"Response body: $content")

  case Failure(error @ ApiError(code, message, responseContent, cause, headers)) =>
    System.err.println("Exception when calling CogControllerApi#obtenirCommunes")
    System.err.println(s"Status code: $code")
    System.err.println(s"Reason: $responseContent")
    System.err.println(s"Response headers: ${headers.mkString(", ")}")
    error.printStackTrace()

  case Failure(exception) =>
    System.err.println("Exception when calling CogControllerApi#obtenirCommunes")
    exception.printStackTrace()
}
With eventually a run failure under Zeppelin:
java.lang.NoSuchMethodError: 'scala.collection.generic.CanBuildFrom scala.collection.compat.package$.canBuildFromIterableViewMapLike()'
at fr.ecoemploi.application.etude.swagger.invoker.ApiInvoker.makeUri(ApiInvoker.scala:213)
at fr.ecoemploi.application.etude.swagger.invoker.ApiInvoker.execute(ApiInvoker.scala:226)
... 44 elided
Here is how far I've got.
The generated openapi-client jar has this content:
jar tf openapi-client-1.0.0.jar
META-INF/
META-INF/MANIFEST.MF
fr/
fr/ecoemploi/
fr/ecoemploi/application/
fr/ecoemploi/application/etude/
fr/ecoemploi/application/etude/swagger/
fr/ecoemploi/application/etude/swagger/model/
fr/ecoemploi/application/etude/swagger/api/
fr/ecoemploi/application/etude/swagger/invoker/
fr/ecoemploi/application/etude/swagger/model/Commune$.class
fr/ecoemploi/application/etude/swagger/model/Commune.class
fr/ecoemploi/application/etude/swagger/api/CogControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/ComptesCollectivitesControllerApi.class
fr/ecoemploi/application/etude/swagger/api/ComptesCollectivitesControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/EnumsSerializers$.class
fr/ecoemploi/application/etude/swagger/api/EnumsSerializers.class
fr/ecoemploi/application/etude/swagger/api/CovidControllerApi.class
fr/ecoemploi/application/etude/swagger/api/EquipementControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/CovidControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/EnumsSerializers$EnumNameSerializer.class
fr/ecoemploi/application/etude/swagger/api/IntercoControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/ActivitesControllerApi.class
fr/ecoemploi/application/etude/swagger/api/EnumsSerializers$EnumNameSerializer$$anonfun$deserialize$1.class
fr/ecoemploi/application/etude/swagger/api/ActivitesControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/AssociationsControllerApi$.class
fr/ecoemploi/application/etude/swagger/api/IntercoControllerApi.class
fr/ecoemploi/application/etude/swagger/api/EnumsSerializers$EnumNameSerializer$$anonfun$serialize$1.class
fr/ecoemploi/application/etude/swagger/api/EquipementControllerApi.class
fr/ecoemploi/application/etude/swagger/api/CogControllerApi.class
fr/ecoemploi/application/etude/swagger/api/AssociationsControllerApi.class
fr/ecoemploi/application/etude/swagger/invoker/ApiRequest$.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormat.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$LocalDateSerializer$.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$DateTimeSerializer$$anonfun$$lessinit$greater$1.class
fr/ecoemploi/application/etude/swagger/invoker/NumericValue.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyCredentials.class
fr/ecoemploi/application/etude/swagger/invoker/CustomContentTypes.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyValue.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$DateTimeSerializer$$anonfun$$lessinit$greater$1$$anonfun$apply$1.class
fr/ecoemploi/application/etude/swagger/invoker/ApiError$$anonfun$$lessinit$greater$1.class
fr/ecoemploi/application/etude/swagger/invoker/Credentials.class
fr/ecoemploi/application/etude/swagger/invoker/ParametersMap$.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats$CSV$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyCredentials$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiModel.class
fr/ecoemploi/application/etude/swagger/invoker/ApiMethods.class
fr/ecoemploi/application/etude/swagger/invoker/ParametersMap.class
fr/ecoemploi/application/etude/swagger/invoker/BasicCredentials.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$DateTimeSerializer$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiMethod.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyLocations$HEADER$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyValue$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiInvoker$$anonfun$unmarshallApiResponse$4.class
fr/ecoemploi/application/etude/swagger/invoker/ApiSettings.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyLocations$QUERY$.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats$SSV$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiInvoker$.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$LocalDateSerializer$$anonfun$$lessinit$greater$2.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$DateTimeSerializer$$anonfun$$lessinit$greater$1$$anonfun$apply$2.class
fr/ecoemploi/application/etude/swagger/invoker/ApiResponse.class
fr/ecoemploi/application/etude/swagger/invoker/ApiRequest.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyLocation.class
fr/ecoemploi/application/etude/swagger/invoker/ApiInvoker$ApiRequestImprovements.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats$PIPES$.class
fr/ecoemploi/application/etude/swagger/invoker/UnitJSONSupport.class
fr/ecoemploi/application/etude/swagger/invoker/BearerToken$.class
fr/ecoemploi/application/etude/swagger/invoker/BasicCredentials$.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$.class
fr/ecoemploi/application/etude/swagger/invoker/NumericValue$.class
fr/ecoemploi/application/etude/swagger/invoker/MergedArrayFormat.class
fr/ecoemploi/application/etude/swagger/invoker/ApiResponse$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyLocations$COOKIE$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiMethod$.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$LocalDateSerializer$$anonfun$$lessinit$greater$2$$anonfun$apply$4.class
fr/ecoemploi/application/etude/swagger/invoker/BearerToken.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyLocations.class
fr/ecoemploi/application/etude/swagger/invoker/ResponseState$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiInvoker.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats$TSV$.class
fr/ecoemploi/application/etude/swagger/invoker/ArrayValues.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats$.class
fr/ecoemploi/application/etude/swagger/invoker/ResponseState$Error$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiError$.class
fr/ecoemploi/application/etude/swagger/invoker/ParametersMap$ParametersMapImprovements.class
fr/ecoemploi/application/etude/swagger/invoker/ApiMethods$.class
fr/ecoemploi/application/etude/swagger/invoker/CollectionFormats$MULTI$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiInvoker$ApiMethodExtensions.class
fr/ecoemploi/application/etude/swagger/invoker/ApiKeyLocations$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiReturnWithHeaders.class
fr/ecoemploi/application/etude/swagger/invoker/ApiSettings$.class
fr/ecoemploi/application/etude/swagger/invoker/ResponseState.class
fr/ecoemploi/application/etude/swagger/invoker/ApiError.class
fr/ecoemploi/application/etude/swagger/invoker/ResponseState$Success$.class
fr/ecoemploi/application/etude/swagger/invoker/ApiError$$anonfun$$lessinit$greater$2.class
fr/ecoemploi/application/etude/swagger/invoker/ArrayValues$.class
fr/ecoemploi/application/etude/swagger/invoker/Serializers$LocalDateSerializer$$anonfun$$lessinit$greater$2$$anonfun$apply$3.class
reference.conf
META-INF/maven/
META-INF/maven/org.openapitools/
META-INF/maven/org.openapitools/openapi-client/
META-INF/maven/org.openapitools/openapi-client/pom.xml
META-INF/maven/org.openapitools/openapi-client/pom.properties
The lib folder that accompanies the generated sources when they are compiled has this content:
akka-actor_2.12-2.6.12.jar json4s-core_2.12-3.6.7.jar scalatest-core_2.12-3.2.3.jar
akka-http_2.12-10.2.3.jar json4s-ext_2.12-3.6.7.jar scalatest-diagrams_2.12-3.2.3.jar
akka-http-core_2.12-10.2.3.jar json4s-jackson_2.12-3.6.7.jar scalatest-featurespec_2.12-3.2.3.jar
akka-http-json4s_2.12-1.27.0.jar json4s-scalap_2.12-3.6.7.jar scalatest-flatspec_2.12-3.2.3.jar
akka-parsing_2.12-10.2.3.jar junit-4-13_2.12-3.2.3.0.jar scalatest-freespec_2.12-3.2.3.jar
akka-protobuf-v3_2.12-2.6.12.jar junit-4.13.jar scalatest-funspec_2.12-3.2.3.jar
akka-stream_2.12-2.6.12.jar paranamer-2.8.jar scalatest-funsuite_2.12-3.2.3.jar
config-1.4.1.jar reactive-streams-1.0.3.jar scalatest-matchers-core_2.12-3.2.3.jar
hamcrest-core-1.3.jar scala-collection-compat_2.12-2.4.1.jar scalatest-mustmatchers_2.12-3.2.3.jar
hpack-1.0.2.jar scalactic_2.12-3.2.3.jar scalatest-propspec_2.12-3.2.3.jar
jackson-annotations-2.9.0.jar scala-java8-compat_2.12-0.8.0.jar scalatest-refspec_2.12-3.2.3.jar
jackson-core-2.9.8.jar scala-library-2.12.13.jar scalatest-shouldmatchers_2.12-3.2.3.jar
jackson-databind-2.9.8.jar scala-parser-combinators_2.12-1.1.2.jar scalatest-wordspec_2.12-3.2.3.jar
joda-convert-2.2.0.jar scala-reflect-2.12.12.jar scala-xml_2.12-1.2.0.jar
joda-time-2.10.1.jar scalatest_2.12-3.2.3.jar ssl-config-core_2.12-0.4.2.jar
json4s-ast_2.12-3.6.7.jar scalatest-compatible-3.2.3.jar
It appears that in the spark.jars attribute of the Zeppelin Scala interpreter, I should add not only
openapi-client-1.0.0.jar, but also some of the jars listed above.
But not all of them, of course...
I have added these to the spark.jars attribute of Zeppelin:
spark.jars /home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/openapi-client-1.0.0.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-actor_2.12-2.6.12.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-http_2.12-10.2.3.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-http-core_2.12-10.2.3.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-http-json4s_2.12-1.27.0.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-parsing_2.12-10.2.3.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-protobuf-v3_2.12-2.6.12.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/akka-stream_2.12-2.6.12.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/config-1.4.1.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/hamcrest-core-1.3.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/hpack-1.0.2.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/paranamer-2.8.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/reactive-streams-1.0.3.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/jackson-annotations-2.9.0.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/jackson-core-2.9.8.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/jackson-databind-2.9.8.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/joda-convert-2.2.0.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/joda-time-2.10.1.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/json4s-ast_2.12-3.6.7.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/json4s-core_2.12-3.6.7.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/json4s-ext_2.12-3.6.7.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/json4s-jackson_2.12-3.6.7.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/json4s-scalap_2.12-3.6.7.jar,
/home/lebihan/dev/Java/comptes-france/metier-et-gestion/dev/GenerationOpenAPI/target/generated-sources/swagger/scala/target/lib/ssl-config-core_2.12-0.4.2.jar
Maybe I shouldn't use scala-akka for the client generator?
Or am I going about the generation the wrong way, and Apache Zeppelin requires a generator selection other than scala-akka for the openapi-client.jar to use with it?
Epilogue
Scala is a hell of incompatible jars, even within its 2.12.x or 2.13.x lines.
It turned out that Spark (used by Apache Zeppelin) was loading incompatible Scala jars for the scala.collection.compat package, some coming in through com.google.protobuf, as far as I read.
So the workaround is to create a fat jar that relocates these classes elsewhere by renaming their packages.
The openapi-client.jar created that way is then accepted.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass></mainClass>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/maven/**</exclude>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <relocations>
          <relocation>
            <pattern>com</pattern>
            <shadedPattern>repackaged.com.google.common</shadedPattern>
            <includes>
              <include>com.google.common.**</include>
            </includes>
          </relocation>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>com.shaded.protobuf</shadedPattern>
            <includes>
              <include>com.google.protobuf.**</include>
            </includes>
          </relocation>
          <relocation>
            <pattern>scala.collection.compat</pattern>
            <shadedPattern>scala.shaded.compat</shadedPattern>
            <includes>
              <include>scala.collection.compat.**</include>
            </includes>
          </relocation>
          <relocation>
            <pattern>shapeless</pattern>
            <shadedPattern>shapelessshaded</shadedPattern>
            <includes>
              <include>shapeless.**</include>
            </includes>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
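As an aside, when chasing this kind of NoSuchMethodError it can help to ask the JVM where it actually loaded a class from, to see which version of a conflicting jar won. A small diagnostic sketch using the standard `getProtectionDomain` JDK API (not part of the generated client; can be pasted into a Zeppelin paragraph):

```scala
// Return the jar (or directory) a class was loaded from.
// Classes loaded by the bootstrap classloader have no code source.
def sourceOf(cls: Class[_]): String = {
  val src = cls.getProtectionDomain.getCodeSource
  if (src == null) "bootstrap classloader (no code source)"
  else src.getLocation.toString
}

// e.g. check which scala-collection-compat jar is actually on the classpath
println(sourceOf(classOf[scala.Option[_]]))
```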

EBean enhancement issue [POJO not enhanced]

I am using Ebean and Vert.x for my cron jobs,
but for some reason my entities are not being enhanced by ebean-maven-plugin.
Here is what I am using:
<plugin>
  <groupId>io.ebean</groupId>
  <artifactId>ebean-maven-plugin</artifactId>
  <version>12.1.12</version>
  <executions>
    <execution>
      <id>main</id>
      <phase>process-classes</phase>
      <configuration>
        <transformArgs>debug=1</transformArgs>
      </configuration>
      <goals>
        <goal>enhance</goal>
      </goals>
    </execution>
  </executions>
</plugin>
My entity:
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
@Entity
@Table(name = "targeting_locations")
public class TargetingLocations {

  @Id
  @Column(name = "id")
  public Long id;

  // other properties
}
Here is the error code:
2020-04-15 19:39:26,671 i.e.s.d.BeanDescriptorManager - Error in deployment
java.lang.IllegalStateException: Bean class com.xxx.model.TargetingLocations is not enhanced?
at io.ebeaninternal.server.deploy.BeanDescriptorManager.setEntityBeanClass(BeanDescriptorManager.java:1414)
at io.ebeaninternal.server.deploy.BeanDescriptorManager.createByteCode(BeanDescriptorManager.java:1286)
at io.ebeaninternal.server.deploy.BeanDescriptorManager.readDeployAssociations(BeanDescriptorManager.java:1208)
at io.ebeaninternal.server.deploy.BeanDescriptorManager.readEntityDeploymentAssociations(BeanDescriptorManager.java:711)
From different posts, could not really figure out what is causing this issue.
Your domain class should extend io.ebean.Model.
The rest looks fine.
I had the same problem recently and changed from ebean-maven-plugin to tiles-maven-plugin; it will enhance your Ebean entities. This is the plugin:
<plugin>
  <groupId>io.repaint.maven</groupId>
  <artifactId>tiles-maven-plugin</artifactId>
  <version>2.10</version>
  <extensions>true</extensions>
  <configuration>
    <tiles>
      <tile>org.avaje.tile:java-compile:1.1</tile>
      <tile>io.ebean.tile:enhancement:5.3</tile>
    </tiles>
  </configuration>
</plugin>
By using tiles-maven-plugin you only need this ebean dependency:
<dependency>
  <groupId>io.ebean</groupId>
  <artifactId>ebean</artifactId>
  <version>11.22.4</version>
</dependency>
Hope it helps. Thank you.

Is it possible to generate Q classes by gradle (Kotlin-DSL) for Kotlin MongoDB Documents?

I have a project with Maven, Kotlin, QueryDSL, Spring Boot and MongoDB. It works quite well, but I thought that migrating to Gradle could speed up building it. Everything was fine until I began moving the module with QueryDSL. It turned out that I cannot generate Q-classes for Kotlin classes annotated with @Document.
So is there a way to solve it?
Document example (placed in /src/main/kotlin/com/company, i.e. in the kotlin directory):
package ...
import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document
@Document(collection = "myDocument")
data class MyDocument(
  val smth: String
)
Maven (the piece responsible for the generation):
<plugin>
  <groupId>org.jetbrains.kotlin</groupId>
  <artifactId>kotlin-maven-plugin</artifactId>
  <version>${kotlin.version}</version>
  <configuration>
    <args>
      <arg>-Werror</arg>
    </args>
    <annotationProcessors>
      org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
    </annotationProcessors>
    <compilerPlugins>
      <plugin>spring</plugin>
    </compilerPlugins>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.jetbrains.kotlin</groupId>
      <artifactId>kotlin-maven-noarg</artifactId>
      <version>${kotlin.version}</version>
    </dependency>
    <dependency>
      <groupId>org.jetbrains.kotlin</groupId>
      <artifactId>kotlin-maven-allopen</artifactId>
      <version>${kotlin.version}</version>
    </dependency>
  </dependencies>
  <executions>
    <execution>
      <id>compile</id>
      <phase>compile</phase>
      <configuration>
        <sourceDirs>
          <sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
          <sourceDir>${project.basedir}/src/main/java</sourceDir>
        </sourceDirs>
      </configuration>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
    <execution>
      <id>kapt</id>
      <goals>
        <goal>kapt</goal>
      </goals>
      <configuration>
        <sourceDirs>
          <sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
          <sourceDir>${project.basedir}/src/main/java</sourceDir>
        </sourceDirs>
      </configuration>
    </execution>
    <execution>
      <id>test-compile</id>
      <phase>test-compile</phase>
      <configuration>
        <sourceDirs>
          <sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
        </sourceDirs>
      </configuration>
      <goals>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
For Gradle + Kotlin, AFAIU we have to use kapt to generate Q-classes in this way:
kapt("com.querydsl:querydsl-apt:4.2.1:jpa")
but it does not work for me, my new build.gradle.kts:
import org.jetbrains.kotlin.gradle.plugin.KotlinSourceSet
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

plugins {
    id("org.springframework.boot") version "2.2.0.RELEASE"
    id("io.spring.dependency-management") version "1.0.8.RELEASE"
    kotlin("jvm") version "1.3.50"
    kotlin("kapt") version "1.3.50"
    kotlin("plugin.jpa") version "1.3.50"
    id("org.jetbrains.kotlin.plugin.spring") version "1.3.21"
}

apply(plugin = "kotlin")
apply(plugin = "kotlin-kapt")
apply(plugin = "kotlin-jpa")
apply(plugin = "org.springframework.boot")
apply(plugin = "io.spring.dependency-management")

group = "com.example"
version = "0.0.1-SNAPSHOT"
java.sourceCompatibility = JavaVersion.VERSION_1_8

repositories {
    mavenCentral()
}

dependencies {
    implementation("org.springframework.boot:spring-boot-starter-data-jpa")
    implementation("com.querydsl:querydsl-jpa")
    implementation("com.querydsl:querydsl-apt")
    kapt("com.querydsl:querydsl-apt:4.2.1:jpa")
    kapt("org.springframework.boot:spring-boot-starter-data-jpa")
    kapt("org.springframework.boot:spring-boot-configuration-processor")
    kapt("org.springframework.data:spring-data-mongodb:2.2.0.RELEASE")
    implementation("org.springframework.boot:spring-boot-starter-data-mongodb")
    implementation("org.jetbrains.kotlin:kotlin-reflect")
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
    testImplementation("org.springframework.boot:spring-boot-starter-test") {
        exclude(group = "org.junit.vintage", module = "junit-vintage-engine")
    }
}

//sourceSets { main["kotlin"].srcDirs += [generated] }
//val querydslSrcDir = "src/main/generated"

tasks.withType<Test> {
    useJUnitPlatform()
}

tasks.withType<KotlinCompile> {
    kotlinOptions {
        freeCompilerArgs = listOf("-Xjsr305=strict")
        jvmTarget = "1.8"
    }
}
In Maven I can set the annotation processor precisely (org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor), but in Gradle I cannot figure out how to achieve this.
You should add implementation("com.querydsl:querydsl-mongodb") and kapt("com.querydsl:querydsl-apt") to the dependencies section.
Then add the following after the dependencies section:
kapt {
    annotationProcessor("org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor")
}
Also, don't forget to remove those JPA dependencies as well.
This is a working example I created.

How does one use JMH from a Scala Maven project in a JUnit test?

I created a complete test project (version in question linked). I have based this project on several things:
The original java repository of the same demo
Generating an archetype, e.g.:
mvn archetype:generate \
  -DinteractiveMode=false \
  -DarchetypeGroupId=org.openjdk.jmh \
  -DarchetypeArtifactId=jmh-scala-benchmark-archetype \
  -DgroupId=org.sample \
  -DartifactId=test \
  -Dversion=1.0
One way to get the test runner working with JMH (otherwise doesn't even try to run): JMH Unable to find the resource: /META-INF/BenchmarkList
Further confirmation in methodology: How to run JMH from inside JUnit tests?
Gradle/idea has similar issues, maybe.
Maybe others ... I've been staring at this for many hours, but I think that is it.
Here is the test:
package com.szatmary.peter
import org.junit.Assert
import org.junit.Test
import org.openjdk.jmh.annotations.Benchmark
import org.openjdk.jmh.annotations.Mode
import org.openjdk.jmh.annotations.Scope
import org.openjdk.jmh.annotations.State
import org.openjdk.jmh.results.BenchmarkResult
import org.openjdk.jmh.results.RunResult
import org.openjdk.jmh.runner.Runner
import org.openjdk.jmh.runner.RunnerException
import org.openjdk.jmh.runner.options.Options
import org.openjdk.jmh.runner.options.OptionsBuilder
import org.openjdk.jmh.runner.options.TimeValue
import org.openjdk.jmh.runner.options.VerboseMode
import java.util
import java.util.concurrent.TimeUnit
import com.szatmary.peter.SampleBenchmarkTest.St
import com.szatmary.peter.SampleBenchmarkTest.St.AVERAGE_EXPECTED_TIME
/**
 * It is recommended to run JMH not from tests but directly from a separate main method.
 * Unit tests and the IDE interfere with the measurements.
 *
 * If your measurements are in seconds / minutes or longer, then running benchmarks from tests
 * will not affect your benchmark results.
 *
 * If your measurements are in milliseconds, microseconds, nanoseconds ... run your
 * benchmarks not from tests but from main code to get better measurements.
 */
object SampleBenchmarkTest {
  @State(Scope.Benchmark) object St {
    private[peter] val AVERAGE_EXPECTED_TIME = 100 // expected max 100 milliseconds
    val app = new App
  }
}

class SampleBenchmarkTest {
  /**
   * Benchmark run with JUnit
   *
   * @throws Exception
   */
  @Test
  @throws[Exception]
  def runTest(): Unit = {
    val opt = initBench
    val results = runBench(opt)
    assertOutputs(results)
  }

  /**
   * JMH benchmarks
   */
  @Benchmark
  def oldWay(st: St.type): Unit = st.app.oldWay

  @Benchmark
  def newWay(st: St.type): Unit = st.app.newWay

  /**
   * Runner options that run all benchmarks in this test class,
   * namely the benchmarks oldWay and newWay.
   *
   * @return
   */
  private def initBench: Options = {
    System.out.println("running " + classOf[SampleBenchmarkTest].getSimpleName + ".*")
    new OptionsBuilder()
      .include(classOf[SampleBenchmarkTest].getSimpleName + ".*")
      .mode(Mode.AverageTime)
      .verbosity(VerboseMode.EXTRA)
      .timeUnit(TimeUnit.MILLISECONDS)
      .warmupTime(TimeValue.seconds(1))
      .measurementTime(TimeValue.milliseconds(1))
      .measurementIterations(2)
      .threads(4)
      .warmupIterations(2)
      .shouldFailOnError(true)
      .shouldDoGC(true)
      .forks(1)
      .build
  }

  /**
   * @param opt
   * @return
   * @throws RunnerException
   */
  @throws[RunnerException]
  private def runBench(opt: Options) = new Runner(opt).run

  /**
   * Assert the benchmark results that are interesting for us:
   * the test mode and the average test time.
   *
   * @param results
   */
  private def assertOutputs(results: util.Collection[RunResult]) = {
    import scala.collection.JavaConversions._
    for (r <- results) {
      for (rr <- r.getBenchmarkResults) {
        val mode = rr.getParams.getMode
        val score = rr.getPrimaryResult.getScore
        val methodName = rr.getPrimaryResult.getLabel
        Assert.assertEquals("Test mode is not average mode. Method = " + methodName, Mode.AverageTime, mode)
        Assert.assertTrue("Benchmark score = " + score + " is higher than " + AVERAGE_EXPECTED_TIME + " " + rr.getScoreUnit + ". Too slow performance!", score < AVERAGE_EXPECTED_TIME)
      }
    }
  }
}
The error from running mvn clean install is:
[INFO] --- exec-maven-plugin:1.6.0:exec (run-benchmarks) # jmh-benchmark-demo ---
Exception in thread "main" java.lang.RuntimeException: ERROR: Unable to find the resource: /META-INF/BenchmarkList
at org.openjdk.jmh.runner.AbstractResourceReader.getReaders(AbstractResourceReader.java:98)
at org.openjdk.jmh.runner.BenchmarkList.find(BenchmarkList.java:122)
at org.openjdk.jmh.runner.Runner.internalRun(Runner.java:263)
at org.openjdk.jmh.runner.Runner.run(Runner.java:209)
at org.openjdk.jmh.Main.main(Main.java:71)
[ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
Unless requested, I won't bother including the other source file in this post (App.scala - it just has two ways of summing over an array), and it is found in the github repo.
For completeness here is my current pom file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.szatmary.peter</groupId>
  <artifactId>jmh-benchmark-demo</artifactId>
  <version>1.0</version>
  <name>JMH benchmark demo</name>

  <prerequisites>
    <maven>3.0</maven>
  </prerequisites>

  <properties>
    <javac.target>1.8</javac.target>
    <scala.version.major>2.11</scala.version.major>
    <scala.version.minor>11</scala.version.minor>
    <scala.version>${scala.version.major}.${scala.version.minor}</scala.version>
    <!--
      Select a JMH benchmark generator to use. Available options:
        default: whatever JMH chooses by default;
        asm: parse bytecode with ASM;
        reflection: load classes and use Reflection over them;
    -->
    <jmh.generator>default</jmh.generator>
    <!-- Name of the benchmark Uber-JAR to generate. -->
    <uberjar.name>benchmarks</uberjar.name>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <jmh.version>1.20</jmh.version>
    <java.version>1.8</java.version>
    <junit.version>4.12</junit.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.openjdk.jmh</groupId>
      <artifactId>jmh-core</artifactId>
      <version>${jmh.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.openjdk.jmh</groupId>
      <artifactId>jmh-generator-annprocess</artifactId>
      <version>${jmh.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>${junit.version}</version>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- 1. Add source directories for both scalac and javac. -->
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <version>1.8</version>
        <executions>
          <execution>
            <id>add-source</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>add-source</goal>
            </goals>
            <configuration>
              <sources>
                <source>${project.basedir}/src/main/scala</source>
                <source>${project.basedir}/target/generated-sources/jmh</source>
              </sources>
            </configuration>
          </execution>
          <execution>
            <id>add-test-source</id>
            <phase>generate-test-sources</phase>
            <goals>
              <goal>add-test-source</goal>
            </goals>
            <configuration>
              <sources>
                <source>${project.basedir}/src/test/scala</source>
                <source>${project.basedir}/target/generated-test-sources/jmh</source>
              </sources>
            </configuration>
          </execution>
        </executions>
      </plugin>
<!--
2. Compile Scala sources
-->
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.3.1</version>
<configuration>
<recompileMode>incremental</recompileMode>
</configuration>
<executions>
<execution>
<phase>process-sources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<!--
4. Compile JMH generated code.
-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<compilerVersion>${javac.target}</compilerVersion>
<source>${javac.target}</source>
<target>${javac.target}</target>
<compilerArgument>-proc:none</compilerArgument>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<execution>
<id>run-benchmarks</id>
<phase>integration-test</phase>
<goals>
<goal>exec</goal>
</goals>
<configuration>
<classpathScope>test</classpathScope>
<executable>java</executable>
<arguments>
<argument>-classpath</argument>
<classpath />
<argument>org.openjdk.jmh.Main</argument>
<argument>.*</argument>
</arguments>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
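Note that the step comments in this pom jump from "2. Compile Scala sources" straight to "4. Compile JMH generated code". In the upstream JMH Scala sample that this pom appears to be based on, the missing step 3 runs the JMH bytecode generator over the compiled classes, and that generator is what writes `/META-INF/BenchmarkList`, the exact resource the error complains about. A hedged sketch of such a step (plugin coordinates and arguments follow the JMH sample pom; since the JMH dependencies above are declared in `test` scope, the phase and directories may need adjusting to the test classpath):

```xml
<!-- 3. Generate JMH benchmark metadata from the compiled bytecode.
     This produces /META-INF/BenchmarkList, which the runner looks up. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>generate-jmh-sources</id>
      <phase>process-classes</phase>
      <goals>
        <goal>java</goal>
      </goals>
      <configuration>
        <includePluginDependencies>true</includePluginDependencies>
        <mainClass>org.openjdk.jmh.generators.bytecode.JmhBytecodeGenerator</mainClass>
        <arguments>
          <argument>${project.build.outputDirectory}</argument>
          <argument>${project.build.directory}/generated-sources/jmh</argument>
          <argument>${project.build.outputDirectory}</argument>
          <argument>${jmh.generator}</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>org.openjdk.jmh</groupId>
      <artifactId>jmh-generator-bytecode</artifactId>
      <version>${jmh.version}</version>
    </dependency>
  </dependencies>
</plugin>
```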

Typesafe Config: reference.conf values not overridden by application.conf

I am using Typesafe Config to provide all external configuration in my Scala project. But when I build a shaded jar with maven-shade-plugin and run it, the application.conf somehow gets packaged into the jar itself, and its values can no longer be overridden by editing the external application.conf.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>${maven.shade.plugin.version}</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>test.myproj.DQJobTrigger</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
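Separately from the override problem, shading with Typesafe Config has a well-known pitfall: every dependency that ships its own `reference.conf` (Akka, for example) contributes a copy, and the shade plugin keeps only one of them unless told to concatenate. The usual remedy is an `AppendingTransformer` next to the existing `ManifestResourceTransformer` (a sketch; this does not by itself fix the application.conf override issue, but it prevents silently dropped defaults):

```xml
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
  <resource>reference.conf</resource>
</transformer>
```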
I am not sure of the behaviour when I load all configs from application.conf itself using ConfigFactory:
import java.io.File
import com.typesafe.config.ConfigFactory

trait Config {
  private val dqConfig = System.getenv("CONF_PATH")
  private val config = ConfigFactory.parseFile(new File(dqConfig))
  private val sparkConfig = config.getConfig("sparkConf")
  private val mysqlConfig = config.getConfig("mysql")
}
where CONF_PATH is set to the path of application.conf when running the jar.
application.conf
sparkConf = {
master = "yarn-client"
}
mysql = {
url = "jdbc:mysql://127.0.0.1:3306/test"
user = "root"
password = "MyCustom98"
driver = "com.mysql.jdbc.Driver"
connectionPool = disabled
keepAliveConnection = true
}
So now, even if I change the properties in the external application.conf, it still picks up the values that were present when the jar was packaged.
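One way to make the external file win over anything bundled into the jar is to layer the configs explicitly: parse the external file first, then fall back to whatever is on the classpath. A sketch along the lines of the `Config` trait above (names are illustrative, not from the original project):

```scala
import java.io.File
import com.typesafe.config.{Config => TConfig, ConfigFactory}

object ConfigLoader {
  // Entries from the external file take precedence; anything missing there
  // is filled in from the classpath (including configs packaged in the jar).
  def load(): TConfig = {
    val external = ConfigFactory.parseFile(new File(System.getenv("CONF_PATH")))
    external.withFallback(ConfigFactory.load()).resolve()
  }
}
```

Alternatively, Typesafe Config natively honours `-Dconfig.file=/path/to/application.conf` at JVM startup, which makes a plain `ConfigFactory.load()` prefer the external file over a bundled `application.conf` without any code changes.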