Overloaded constructor for FlinkKafkaProducer - Scala

I'm trying to use Scala and Flink to publish messages to a Kafka topic. However, when creating the FlinkKafkaProducer object using the code provided in the documentation, the compiler tells me that the constructor cannot be applied. This is the code sample:
val studentProducer = new FlinkKafkaProducer[String](
  "my_topic",                               // target topic
  new SimpleStringSchema(),                 // serialization schema
  properties,                               // producer config
  FlinkKafkaProducer.Semantic.EXACTLY_ONCE) // fault-tolerance
With the following imports:
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, FlinkKafkaProducer}
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
import java.util.Properties
And this is the error I'm getting:
/home/user/Flink/flinkproj/src/main/scala/org/flink/Job.scala:83:27: overloaded method constructor FlinkKafkaProducer with alternatives:
[error] (x$1: String,x$2: org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema[String],x$3: java.util.Properties,x$4: org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.Semantic)org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer[String] <and>
[error] (x$1: String,x$2: org.apache.flink.streaming.util.serialization.KeyedSerializationSchema[String],x$3: java.util.Properties,x$4: java.util.Optional[org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartitioner[String]])org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer[String] <and>
[error] (x$1: String,x$2: org.apache.flink.streaming.util.serialization.KeyedSerializationSchema[String],x$3: java.util.Properties,x$4: org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.Semantic)org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer[String] <and>
[error] (x$1: String,x$2: org.apache.flink.api.common.serialization.SerializationSchema[String],x$3: java.util.Properties,x$4: java.util.Optional[org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartitioner[String]])org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer[String]
[error] cannot be applied to (String, org.apache.flink.streaming.util.serialization.SimpleStringSchema, java.util.Properties, org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.Semantic)
[error] val studentProducer = new FlinkKafkaProducer[String](
The variable properties is an instance of java.util.Properties. I suspect the problem is with the String serializer, but I don't see what is wrong.
The details on the versions are the following, from the build.sbt:
ThisBuild / scalaVersion := "2.11.8"
val flinkVersion = "1.11.3"
val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-clients" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion % "provided")

I believe the documentation is out of date: you need to either provide a KafkaSerializationSchema or a KeyedSerializationSchema, or, if you use a plain SerializationSchema such as SimpleStringSchema, also supply a FlinkKafkaPartitioner wrapped in a java.util.Optional (see the overloads listed in the error).
I don't happen to have an example in Scala, but here's an example in Java showing how to implement a KafkaSerializationSchema that uses an ObjectMapper to write out JSON:
/**
 * A Kafka {@link KafkaSerializationSchema} to serialize {@link ClickEventStatistics}s as JSON.
 */
public class ClickEventStatisticsSerializationSchema implements KafkaSerializationSchema<ClickEventStatistics> {
    private static final ObjectMapper objectMapper = new ObjectMapper();
    private String topic;

    public ClickEventStatisticsSerializationSchema() {
    }

    public ClickEventStatisticsSerializationSchema(String topic) {
        this.topic = topic;
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(
            final ClickEventStatistics message, @Nullable final Long timestamp) {
        try {
            // if topic is null, the default topic will be used
            return new ProducerRecord<>(topic, objectMapper.writeValueAsBytes(message));
        } catch (JsonProcessingException e) {
            throw new IllegalArgumentException("Could not serialize record: " + message, e);
        }
    }
}
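Adapted to the question's setup, an untested Scala sketch of the same idea might look like the following (assuming Flink 1.11; StringKafkaSerializationSchema is a made-up name, not a Flink class):
import java.lang.{Long => JLong}
import java.nio.charset.StandardCharsets
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaProducer, KafkaSerializationSchema}
import org.apache.kafka.clients.producer.ProducerRecord

// Wraps plain string serialization in a KafkaSerializationSchema so that the
// Semantic-accepting constructor overload applies.
class StringKafkaSerializationSchema(topic: String) extends KafkaSerializationSchema[String] {
  override def serialize(element: String, timestamp: JLong): ProducerRecord[Array[Byte], Array[Byte]] =
    new ProducerRecord[Array[Byte], Array[Byte]](topic, element.getBytes(StandardCharsets.UTF_8))
}

val studentProducer = new FlinkKafkaProducer[String](
  "my_topic",
  new StringKafkaSerializationSchema("my_topic"),
  properties,
  FlinkKafkaProducer.Semantic.EXACTLY_ONCE)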

Related

Can't find SttpBackends + "Error occurred in an application involving default arguments."

I'm trying to create an extremely simple Telegram bot in Scala using bot4s. I'm pretty much following the example there. Here's the code:
package info.jjmerelo.BoBot
import cats.instances.future._
import cats.syntax.functor._
import com.bot4s.telegram.api.RequestHandler
import com.bot4s.telegram.api.declarative.Commands
import com.bot4s.telegram.clients.{FutureSttpClient, ScalajHttpClient}
import com.bot4s.telegram.future.{Polling, TelegramBot}
import scala.util.Try
import scala.concurrent.Future
import com.typesafe.scalalogging.Logger
object BoBot extends TelegramBot
  with Polling
  with Commands[Future] {

  implicit val backend = SttpBackends.default
  def token = sys.env("BOBOT_TOKEN")
  override val client: RequestHandler[Future] = new FutureSttpClient(token)
  val log = Logger("BoBot")

  // val lines = scala.io.Source.fromFile("hitos.json").mkString
  // val hitos = JSON.parseFull( lines )
  // val solo_hitos = hitos.getOrElse( hitos )

  onCommand("hey") { implicit msg =>
    log.info("Hello")
    reply("Conseguí que funcionara").void
  }
}
And here's the build.sbt
name := "bobot"
version := "0.0.1"
organization := "info.jjmerelo"
libraryDependencies += "com.bot4s" %% "telegram-core" % "4.4.0-RC2"
val circeVersion = "0.12.3"
libraryDependencies ++= Seq(
  "io.circe" %% "circe-core",
  "io.circe" %% "circe-generic",
  "io.circe" %% "circe-parser"
).map(_ % circeVersion)
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
retrieveManaged := true
Circe is for later
Anyway, I managed to compile most of it, but I still get these two errors:
[info] compiling 2 Scala sources to /home/jmerelo/Asignaturas/cloud-computing/BoBot/target/scala-2.12/classes ...
[error] /home/jmerelo/Asignaturas/cloud-computing/BoBot/src/main/scala/info/jjmerelo/BoBot.scala:21:26: not found: value SttpBackends
[error] implicit val backend = SttpBackends.default
[error] ^
[error] /home/jmerelo/Asignaturas/cloud-computing/BoBot/src/main/scala/info/jjmerelo/BoBot.scala:23:49: could not find implicit value for parameter backend: com.softwaremill.sttp.SttpBackend[scala.concurrent.Future,Nothing]
[error] Error occurred in an application involving default arguments.
[error] override val client: RequestHandler[Future] = new FutureSttpClient(token)
[error] ^
[error] two errors found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 5 s, completed 11 nov. 2020 8:19:38
I can't figure out either of the two. SttpBackends is missing, that's clear, but there's nothing in the example that indicates it's needed or, for that matter, which library should be included. The second one, about the default arguments, I simply can't figure out, even if I define token as a String or change def to val. Any ideas?
Your error messages are related to each other.
The first error tells us that the compiler couldn't find an object SttpBackends, which is expected to have a field of type SttpBackend.
The second tells us that the compiler couldn't find an implicit value of type SttpBackend for constructing a FutureSttpClient, which requires two implicits: an SttpBackend and an ExecutionContext.
class FutureSttpClient(token: _root_.scala.Predef.String,
                       telegramHost: _root_.scala.Predef.String = { /* compiled code */ })
                      (implicit backend: com.softwaremill.sttp.SttpBackend[scala.concurrent.Future, scala.Nothing],
                       ec: scala.concurrent.ExecutionContext)
  extends com.bot4s.telegram.clients.SttpClient[scala.concurrent.Future] { ... }
You can create it yourself, as in the bot4s examples.
If you look for an SttpBackends object in the bot4s repository, you will find this code in the examples:
import com.softwaremill.sttp.okhttp._

object SttpBackends {
  val default: SttpBackend[Future, Nothing] = OkHttpFutureBackend()
}
Add this object to your project to make it compile.
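Note that the OkHttp backend lives in a separate sttp artifact. Assuming the sttp 1.x line that bot4s 4.4.x is built against, the dependency would be something like this (the version is illustrative):
libraryDependencies += "com.softwaremill.sttp" %% "okhttp-backend" % "1.7.2"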

Apache flink (1.9.1) runtime exception when using case classes in scala (2.12.8)

I am using a case class in a Scala (2.12.8) Apache Flink (1.9.1) application. I get the following exception when I run the code below: Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V.
NOTE: I have added the default constructor as per the suggestion (java.lang.NoSuchMethodException for init method in Scala case class), but that does not work in my case.
Here is the complete code
package com.zignallabs
import org.apache.flink.api.scala._
/**
 * Implements the program that reads from an element list, transforms it into tuples, and outputs to the TaskManager.
 */
case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1) // No help when adding a default constructor as per https://stackoverflow.com/questions/51129809/java-lang-nosuchmethodexception-for-init-method-in-scala-case-class
}
object WordCount {
  def main(args: Array[String]): Unit = {
    // set up the execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment

    // get input data
    val input = env.fromElements(" one", "two", "three", "four", "five", "end of test")

    // ***** Line 31 throws the exception
    // Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    //   at com.zignallabs.AddCount.<init>(WordCount.scala:7)
    //   at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
    //   at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
    //   at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
    //   at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
    //   at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
    //   at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
    //   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
    //   at java.lang.Thread.run(Thread.java:748)
    val transform = input.map { w => AddCount(w, 1) } // <- throws the exception

    // execute and print result
    println(transform)
    transform.print()
    transform.printOnTaskManager(" Word")
    env.execute()
  }
}
The runtime exception is:
at com.zignallabs.AddCount.<init>(WordCount.scala:7)
at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
I am building and running Flink locally, using a local Flink cluster with Flink version 1.9.1.
Here is the build.sbt file:
name := "flink191KafkaScala"
version := "0.1-SNAPSHOT"
organization := "com.zignallabs"
scalaVersion := "2.12.8"
val flinkVersion = "1.9.1"
//javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
val http4sVersion = "0.16.6"
resolvers ++= Seq(
  "Local Ivy" at "file:///" + Path.userHome + "/.ivy2/local",
  "Local Ivy Cache" at "file:///" + Path.userHome + "/.ivy2/cache",
  "Local Maven Repository" at "file:///" + Path.userHome + "/.m2/repository",
  "Artifactory Cache" at "https://zignal.artifactoryonline.com/zignal/zignal-repos"
)
val excludeCommonsLogging = ExclusionRule(organization = "commons-logging")
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-clients" % "1.9.1",
  // Upgrade to flink-connector-kafka_2.11
  "org.apache.flink" %% "flink-connector-kafka-0.11" % "1.9.1",
  //"org.scalaj" %% "scalaj-http" % "2.4.2",
  "com.squareup.okhttp3" % "okhttp" % "4.2.2"
)
publishTo := Some("Artifactory Realm" at "https://zignal.artifactoryonline.com/zignal/zignal")
credentials += Credentials("Artifactory Realm", "zignal.artifactoryonline.com", "buildserver", "buildserver")
//mainClass in Compile := Some("com.zignallabs.StoryCounterTopology")
mainClass in Compile := Some("com.zignallabs.WordCount")
scalacOptions ++= Seq(
  "-feature",
  "-unchecked",
  "-deprecation",
  "-language:implicitConversions",
  "-Yresolve-term-conflict:package",
  "-language:postfixOps",
  "-target:jvm-1.8")
lazy val root = project.in(file(".")).configs(IntegrationTest)
If you're using default args for the constructors of a case class, it's much more idiomatic Scala to define them like this:
case class AddCount(firstP: String = "default", count: Int = 1)
This is syntactic sugar that gives you roughly the following for free:
case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1)
  def this(firstP: String) = this(firstP, 1)
  def this(count: Int) = this("default", count)
}
(Roughly, because default arguments don't actually generate auxiliary constructors; the call sites below show the practical difference.)
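For illustration, the call sites with the default-argument version then look like this (a quick sketch; note the named argument in the count-only case):
val a = AddCount()          // AddCount("default", 1)
val b = AddCount("word")    // AddCount("word", 1)
val c = AddCount(count = 5) // AddCount("default", 5); a bare AddCount(5) would not compile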
I am now able to run this application using Scala 2.12. The issue was in the environment: I needed to ensure that no conflicting binaries were present, especially a mix of Scala 2.11 and Scala 2.12 artifacts. (scala.Product.$init$ only exists in the trait encoding introduced in Scala 2.12, so this NoSuchMethodError typically means classes compiled against 2.12 are running on a classpath with a 2.11 scala-library, e.g. from a Flink distribution built for Scala 2.11.)
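If it helps, a quick way to confirm which scala-library actually ends up on the runtime classpath is a one-liner like this:
// Prints the Scala library version loaded at runtime, e.g. "version 2.12.8".
// A 2.11.x value here would explain the NoSuchMethodError above.
println(scala.util.Properties.versionString)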

ambiguous implicit values: match expected type cats.derived.MkShow[A]: show cats:kittens

I'm trying to create a Show instance for my custom Config class.
The build.sbt file is:
name := "circe-demo"
version := "0.1"
scalaVersion := "2.11.12"
resolvers += Resolver.bintrayRepo("ovotech", "maven")
libraryDependencies += "io.circe" %% "circe-core" % "0.11.0"
libraryDependencies += "io.circe" %% "circe-parser" % "0.11.0"
libraryDependencies += "io.circe" %% "circe-generic" % "0.11.0"
libraryDependencies += "org.typelevel" %% "kittens" % "1.2.0"
libraryDependencies ++= Seq(
  "is.cir" %% "ciris-cats",
  "is.cir" %% "ciris-cats-effect",
  "is.cir" %% "ciris-core",
  "is.cir" %% "ciris-enumeratum",
  "is.cir" %% "ciris-refined"
).map(_ % "0.12.1")
The complete code is:
import enumeratum.{Enum, EnumEntry}
sealed abstract class AppEnvironment extends EnumEntry
object AppEnvironment extends Enum[AppEnvironment] {
  case object Local extends AppEnvironment
  case object Testing extends AppEnvironment
  case object Production extends AppEnvironment

  override val values: Vector[AppEnvironment] =
    findValues.toVector
}
import java.net.InetAddress
import scala.concurrent.duration.Duration
final case class ApiConfig(host: InetAddress, port: Int, apiKey: String, timeout: Duration)
import java.net.InetAddress
import cats.Show
import cats.derived.semi
import ciris.config.loader.AppEnvironment.{Local, Production, Testing}
import enumeratum.EnumEntry
import eu.timepit.refined.auto._
import eu.timepit.refined.types.string.NonEmptyString
import scala.concurrent.duration._
final case class Config(appName: NonEmptyString, environment: AppEnvironment, api: ApiConfig)
object Config {
  implicit val showConfig: Show[Config] = {
    implicit val showDuration: Show[Duration] =
      Show.fromToString
    implicit val showInetAddress: Show[InetAddress] =
      Show.fromToString
    implicit def showEnumEntry[E <: EnumEntry]: Show[E] =
      Show.show(_.entryName)

    // Show.show[Config](x => s"api = ${x.api} appName = ${x.appName} environment ${x.environment}")
    semi.show
  }
}
semi.show in the above code produces the error below:
[error] /Users/rajkumar.natarajan/Documents/Coding/kafka_demo/circe-demo/src/main/scala/ciris/config/loader/Config.scala:32:5: ambiguous implicit values:
[error] both value emptyProductDerivedShow in trait MkShowDerivation of type => cats.derived.MkShow[shapeless.HNil]
[error] and method emptyCoproductDerivedShow in trait MkShowDerivation of type => cats.derived.MkShow[shapeless.CNil]
[error] match expected type cats.derived.MkShow[A]
[error] show
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error]
I'm new to functional programming with cats.
How can I resolve this error?
Unfortunately, error reporting when such complicated implicits and macros are involved is far from perfect. The message you see actually means that some implicits required by the real generator (MkShow.genericDerivedShowProduct in this case) have not been found, and the search fell back to some more basic instances where there is an ambiguity. The stuff that is missing is mostly very basic, such as a Show[Int] or a Show[String]. The simplest way to get them all is to import cats.implicits._, but that will also bring in catsStdShowForDuration, which is a Show[Duration]. Since its implementation is really the same as your custom one, it is easier to remove your custom one. One more thing that is missing is a Show[NonEmptyString], and it is easy to create one:
implicit def showNonEmptyString: Show[NonEmptyString] = Show.show(nes => nes)
To sum up, when I define your showConfig as
implicit val showConfig: Show[Config] = {
  import cats.implicits._

  // already provided by cats.implicits._
  // implicit val showDuration: Show[Duration] = Show.fromToString
  implicit val showInetAddress: Show[InetAddress] = Show.fromToString
  implicit def showEnumEntry[E <: EnumEntry]: Show[E] = Show.show(_.entryName)
  implicit def showNonEmptyString: Show[NonEmptyString] = Show.show(nes => nes)

  // Show.show[Config](x => s"api = ${x.api} appName = ${x.appName} environment ${x.environment}")
  semi.show
}
it compiles for me.
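For reference, using the derived instance then looks something like this (the values are illustrative):
import java.net.InetAddress
import scala.concurrent.duration._
import cats.syntax.show._
import eu.timepit.refined.auto._

val config = Config(
  appName = "demo", // the literal is refined to NonEmptyString by refined.auto
  environment = AppEnvironment.Local,
  api = ApiConfig(InetAddress.getLoopbackAddress, 8080, "secret", 10.seconds)
)
println(config.show) // picks up the derived Show[Config] from the companion object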
P.S. Is there any good reason why you put your AppEnvironment under the ciris.* package? Generally, putting your custom code into the packages of a third-party library is an easy way to mess things up.

Cannot Instantiate ZkClient due to invalid ZKStringSerializer reference

I am migrating a Play 2.3.4 app to Play 2.5.4. Along the way I also had to upgrade to Scala 2.11.8 and Kafka 0.9+ to support the updated Play version.
Most of the issues I have worked out, but I cannot figure out a Kafka issue with some code that manages Kafka topics through AdminUtils. The trouble all centers around kafka.utils.ZKStringSerializer.
I am using the org.I0Itec.zkclient package to instantiate a ZkClient object that is passed when constructing a ZkUtils object, but it fails because it cannot resolve my ZKStringSerializer.
Related code is:
import kafka.admin.AdminUtils
import kafka.utils.ZkUtils
import kafka.utils.ZKStringSerializer
import org.I0Itec.zkclient.{ZkClient, ZkConnection}
object Topic {
  def CreateKafkaTopic(topic: String, zookeeperHosts: String, partitionSize: Int, replicationCount: Int, connectionTimeoutMs: Int = 10000, sessionTimeoutMs: Int = 10000): Boolean = {
    var zkSerializer: ZKStringSerializer = ZKStringSerializer
    val zkClient: ZkClient = new ZkClient(zookeeperHosts, connectionTimeoutMs, sessionTimeoutMs, zkSerializer)
    val topicConfig: Properties = new Properties()
    val isSecureKafkaCluster: Boolean = false
    val zkUtils: ZkUtils = new ZkUtils(zkClient, new ZkConnection(zookeeperHosts), isSecureKafkaCluster)
    AdminUtils.createTopic(zkUtils, topic, partitionSize, replicationCount, topicConfig)
    zkClient.close()
  }
}
The above code results in an error saying that ZKStringSerializer is inaccessible from this place.
I found several related posts about creating topics (mostly in Java and before Kafka 0.9):
Creating a topic for Apache Kafka 0.9 Using Java
How create Kafka ZKStringSerializer in Java?
How Can we create a topic in Kafka from the IDE using API
And finally:
Creating a Kafka topic results in no leader
Based on these, I updated my code as follows:
import kafka.admin.AdminUtils
import kafka.utils.ZkUtils
import kafka.utils.ZKStringSerializer$
import org.I0Itec.zkclient.{ZkClient, ZkConnection}
object Topic {
  def CreateKafkaTopic(topic: String, zookeeperHosts: String, partitionSize: Int, replicationCount: Int, connectionTimeoutMs: Int = 10000, sessionTimeoutMs: Int = 10000): Boolean = {
    var zkSerializer: ZKStringSerializer = ZKStringSerializer$.MODULE$
    val zkClient: ZkClient = new ZkClient(zookeeperHosts, connectionTimeoutMs, sessionTimeoutMs, zkSerializer)
    val topicConfig: Properties = new Properties()
    val isSecureKafkaCluster: Boolean = false
    val zkUtils: ZkUtils = new ZkUtils(zkClient, new ZkConnection(zookeeperHosts), isSecureKafkaCluster)
    AdminUtils.createTopic(zkUtils, topic, partitionSize, replicationCount, topicConfig)
    zkClient.close()
  }
}
And then I just get "unable to resolve symbol ZKStringSerializer$" errors.
I tried it with the org.I0Itec.zkclient.serialize.ZkSerializer object as well, and it did not make a difference.
So my question is actually twofold:
1. What is the significance of the '$' character in the import and declaration statements in Scala? I have used it in string interpolation (e.g. s"var value is $var") to reference variables, but this seems different.
2. What is wrong with my code? Is it the way I am importing, declaring, or something else?
I am new to Scala and Play, but I am feeling like quite an idiot at the moment, so any advice / help is appreciated.
~Dave
P.S.
In case it helps relevant bits from project files
build.sbt:
lazy val `api` = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.kafka" % "kafka_2.11" % "0.9.0.1",
  jdbc,
  cache,
  ws,
  specs2 % Test
)
plugins.sbt:
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.4")
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.0.0")
build.properties:
sbt.version=0.13.5
After fighting this issue over the weekend, I gave up on the ZkClient package that had been used previously and simply used Kafka directly, which was actually much cleaner than trying to use the I0Itec ZkClient.
New implementation goes like this:
import java.util.Properties
import kafka.admin.AdminUtils
import kafka.utils.ZkUtils
class Topic {
  def CreateKafkaTopic(topic: String, zookeeperHosts: String, partitionSize: Int, replicationCount: Int, connectionTimeoutMs: Int = 10000, sessionTimeoutMs: Int = 10000): Boolean = {
    if (ListKafkaTopics(zookeeperHosts).contains(topic)) {
      return false
    }
    val zkUtils = ZkUtils.apply(zookeeperHosts, sessionTimeoutMs, connectionTimeoutMs, false)
    AdminUtils.createTopic(zkUtils, topic, partitionSize, replicationCount, new Properties())
    zkUtils.close()
    true
  }
}
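(ListKafkaTopics isn't shown above; a minimal sketch of what it might look like, using ZkUtils.getAllTopics, would be:)
import kafka.utils.ZkUtils

// Hypothetical helper, not part of the original post: lists existing topic
// names via the same ZkUtils entry point used in CreateKafkaTopic.
def ListKafkaTopics(zookeeperHosts: String, sessionTimeoutMs: Int = 10000, connectionTimeoutMs: Int = 10000): Seq[String] = {
  val zkUtils = ZkUtils.apply(zookeeperHosts, sessionTimeoutMs, connectionTimeoutMs, false)
  try zkUtils.getAllTopics()
  finally zkUtils.close()
}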
In the end, I removed a dependency and made the code cleaner, so a double win I suppose.
~Dave
The reason for this problem is that ZKStringSerializer is declared as private; just use ZkUtils.createZkClient instead, as follows:
ZkUtils.createZkClient(zookeeperHosts, sessionTimeoutMs, connectionTimeoutMs)
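For reference, a minimal sketch of wiring that into the original ZkUtils construction (assuming the same Kafka 0.9 API as above):
import kafka.utils.ZkUtils
import org.I0Itec.zkclient.ZkConnection

// createZkClient handles the string serializer internally, so the private
// ZKStringSerializer never needs to be referenced directly.
val zkClient = ZkUtils.createZkClient(zookeeperHosts, sessionTimeoutMs, connectionTimeoutMs)
val zkUtils = new ZkUtils(zkClient, new ZkConnection(zookeeperHosts), false)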

Scala enumeration serialization in jersey/jackson is not working for me

I've read the jackson-module-scala page on enumeration handling (https://github.com/FasterXML/jackson-module-scala/wiki/Enumerations). Still, I'm not getting it to work. The essential code goes like this:
#Path("/v1/admin")
#Produces(Array(MediaType.APPLICATION_JSON + ";charset=utf-8"))
#Consumes(Array(MediaType.APPLICATION_JSON + ";charset=utf-8"))
class RestService {
#POST
#Path("{type}/abort")
def abortUpload(#PathParam("type") typeName: ResourceTypeHolder) {
...
}
}
object ResourceType extends Enumeration {
  type ResourceType = Value
  val ssr, roadsegments, tmc, gab, tne = Value
}

class ResourceTypeType extends TypeReference[ResourceType.type]

case class ResourceTypeHolder(
  @JsonScalaEnumeration(classOf[ResourceTypeType])
  resourceType: ResourceType.ResourceType
)
This is how it's supposed to work, right? Still, I get these errors:
Following issues have been detected:
WARNING: No injection source found for a parameter of type public void no.tull.RestService.abortUpload(no.tull.ResourceTypeHolder) at index 0.
unavailable
org.glassfish.jersey.server.model.ModelValidationException: Validation of the application resource model has failed during application initialization.
[[FATAL] No injection source found for a parameter of type public void no.tull.RestService.abortUpload(no.tull.ResourceTypeHolder) at index 0.; source='ResourceMethod{httpMethod=POST, consumedTypes=[application/json; charset=utf-8], producedTypes=[application/json; charset=utf-8], suspended=false, suspendTimeout=0, suspendTimeoutUnit=MILLISECONDS, invocable=Invocable{handler=ClassBasedMethodHandler{handlerClass=class no.tull.RestService, handlerConstructors=[org.glassfish.jersey.server.model.HandlerConstructor@7ffe609f]}, definitionMethod=public void no.tull.RestService.abortUpload(no.tull.ResourceTypeHolder), parameters=[Parameter [type=class no.tull.ResourceTypeHolder, source=type, defaultValue=null]], responseType=void}, nameBindings=[]}']
at org.glassfish.jersey.server.ApplicationHandler.initialize(ApplicationHandler.java:467)
at org.glassfish.jersey.server.ApplicationHandler.access$500(ApplicationHandler.java:163)
at org.glassfish.jersey.server.ApplicationHandler$3.run(ApplicationHandler.java:323)
at org.glassfish.jersey.internal.Errors$2.call(Errors.java:289)
at org.glassfish.jersey.internal.Errors$2.call(Errors.java:286)
I have also assembled a tiny runnable project (while trying to eliminate any other complications) that demonstrates the problem: project.tgz
Update: Created an sbt build to see if Gradle was producing a strange build. Got the same result, but this is the build.sbt:
name := "project"
version := "1.0"
scalaVersion := "2.10.4"
val jacksonVersion = "2.4.1"
val jerseyVersion = "2.13"
libraryDependencies ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-annotations" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion,
  "com.fasterxml.jackson.jaxrs" % "jackson-jaxrs-json-provider" % jacksonVersion,
  "com.fasterxml.jackson.jaxrs" % "jackson-jaxrs-base" % jacksonVersion,
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.10" % jacksonVersion,
  "org.glassfish.jersey.containers" % "jersey-container-servlet-core" % jerseyVersion
)
seq(webSettings :_*)
libraryDependencies ++= Seq(
  "org.eclipse.jetty" % "jetty-webapp" % "9.1.0.v20131115" % "container",
  "org.eclipse.jetty" % "jetty-plus" % "9.1.0.v20131115" % "container"
)
... and this is the project/plugins.sbt:
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.9.0")
You seem to have a few problems with your tarball.
You need to register Jackson's Scala module to be able to use any Scala functionality. That can be done like this:
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val jsonObjectMapper = new ObjectMapper()
jsonObjectMapper.registerModule(DefaultScalaModule)
val jsonProvider: JacksonJsonProvider = new JacksonJsonProvider(jsonObjectMapper)
This is per this working jersey-jackson example. You also need to inject org.glassfish.jersey.jackson.JacksonFeature into Jersey, which is found in jersey-media-json-jackson. My RestApplication.scala came out like this:
import javax.ws.rs.core.Application
import javax.ws.rs.ext.{ContextResolver, Provider}
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.google.common.collect.ImmutableSet
import org.glassfish.jersey.jackson.JacksonFeature

@Provider
class ObjectMapperProvider extends ContextResolver[ObjectMapper] {
  val defaultObjectMapper = {
    val jsonObjectMapper = new ObjectMapper()
    jsonObjectMapper.registerModule(DefaultScalaModule)
    jsonObjectMapper
  }

  override def getContext(typ: Class[_]): ObjectMapper = {
    defaultObjectMapper
  }
}

class RestApplication extends Application {
  override def getSingletons: java.util.Set[AnyRef] = {
    ImmutableSet.of(
      new RestService,
      new ObjectMapperProvider,
      new JacksonFeature
    )
  }
}
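On the build side this means adding the Jersey Jackson media module; assuming the jerseyVersion from the build.sbt above, something like:
libraryDependencies += "org.glassfish.jersey.media" % "jersey-media-json-jackson" % jerseyVersion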
The real issue, though, is the @PathParam annotation. This code path doesn't invoke Jackson at all. However, what's interesting is that Jersey appears to generically support parsing to any type that has a constructor taking a single string. So if you modify your ResourceTypeHolder, you can get the functionality you want after all.
case class ResourceTypeHolder(@JsonScalaEnumeration(classOf[ResourceTypeType]) resourceType: ResourceType.ResourceType) {
  def this(name: String) = this(ResourceType.withName(name))
}
}
You might be able to add generic support for enum holders to Jersey as an injectable provider. However, that hasn't come up in dropwizard-scala, a project that would suffer the same fate as it uses Jersey too. Thus I imagine it's either impossible, or simply just not common enough for anyone to have done the work. When it comes to enum's, I tend to keep mine in Java.