I'm trying to run a basic app from the following example:
https://github.com/confluentinc/examples/blob/3.3.x/kafka-streams/src/main/scala/io/confluent/examples/streams/MapFunctionScalaExample.scala
However, I'm getting an exception at this line:
// Variant 1: using `mapValues`
val uppercasedWithMapValues: KStream[Array[Byte], String] = textLines.mapValues(_.toUpperCase())
Error:(33, 25) missing parameter type for expanded function ((x$1) =>
x$1.toUpperCase())
textLines.mapValues(_.toUpperCase())
The error I get when hovering the cursor over the code:
Type mismatch, expected: ValueMapper[_ >: String, _ <: NotInferedVR],
actual: (Any) => Any Cannot resolve symbol toUpperCase
Contents of my sbt file:
name := "untitled1"
version := "0.1"
scalaVersion := "2.11.11"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.11
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-streams
libraryDependencies += "org.apache.kafka" % "kafka-streams" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/connect-api
libraryDependencies += "org.apache.kafka" % "connect-api" % "0.11.0.0"
I'm not sure how to proceed, as I'm quite new to Scala. I'd like to know what the issue is and how to fix it.
From http://docs.confluent.io/current/streams/faq.html#scala-compile-error-no-type-parameter-java-defined-trait-is-invariant-in-type-t
The root cause of this problem is Scala-Java interoperability – the Kafka Streams API is implemented in Java, but your application is written in Scala. Notably, this problem is caused by how the type systems of Java and Scala interact. Generic wildcards in Java, for example, often cause such Scala issues.
To fix the problem you would need to declare types explicitly in your Scala application in order for the code to compile. For example, you may need to break a single statement that chains multiple DSL operations into multiple statements, where each statement explicitly declares the respective return types. The StreamToTableJoinScalaIntegrationTest demonstrates how the types of return variables are explicitly declared.
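To see why explicit declarations help, here is a self-contained sketch: `KStreamLike` and the `ValueMapper` trait below are hypothetical stand-ins for the Java-defined Kafka Streams API, mimicking the generic wildcards that defeat Scala's inference. Passing an explicitly typed `ValueMapper` instead of a bare lambda compiles fine:

```scala
// Hypothetical stand-in for Kafka's Java-defined ValueMapper[V, VR] interface.
trait ValueMapper[V, VR] { def apply(value: V): VR }

// Hypothetical stand-in for KStream, reproducing the Java-style generic
// wildcards (`? super V`, `? extends VR`) that trip up Scala's type inference.
class KStreamLike[K, V](val values: List[V]) {
  def mapValues[VR](mapper: ValueMapper[_ >: V, _ <: VR]): KStreamLike[K, VR] =
    new KStreamLike[K, VR](values.map(v => mapper.apply(v)))
}

val textLines = new KStreamLike[Array[Byte], String](List("hello", "world"))

// `textLines.mapValues(_.toUpperCase)` would fail to infer the parameter type.
// Fix: declare the mapper's types explicitly instead of relying on inference.
val uppercased: KStreamLike[Array[Byte], String] =
  textLines.mapValues(new ValueMapper[String, String] {
    override def apply(value: String): String = value.toUpperCase
  })
```

The same pattern (an explicitly typed anonymous `ValueMapper`) applies directly to the real Kafka Streams `KStream.mapValues` call from the question.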
Update
Kafka 2.0 (to be released in June) contains a proper Scala API that avoids these issues. See KIP-270: https://cwiki.apache.org/confluence/display/KAFKA/KIP-270+-+A+Scala+Wrapper+Library+for+Kafka+Streams
From my understanding, I should be able to use the banana-rdf library in my Scala.js code. I followed the instructions on the website and added the following to my build.sbt:
val banana = (name: String) => "org.w3" %% name % "0.8.4" excludeAll (ExclusionRule(organization = "org.scala-stm"))
and added the following to my common settings:
resolvers += "bblfish-snapshots" at "http://bblfish.net/work/repo/releases"
libraryDependencies ++= Seq("banana", "banana-rdf", "banana-sesame").map(banana)
It all compiles fine until it reaches the fast-optimization stage. Then I get the following error:
Referring to non-existent class org.w3.banana.sesame.Sesame$
I tried changing Sesame to Plantain but got the same outcome.
Am I missing something?
I was using the "%%" notation, which is for JVM modules.
Changing it to "%%%" allowed sbt to find the correct library.
NOTE: I had to use Plantain, as it is currently the only backend compiled for Scala.js.
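With that change, the build.sbt fragment might look like this (using banana-plantain as the module name is my assumption, based on the Plantain note above):

```scala
// build.sbt for a Scala.js project: %%% resolves the sjs-suffixed artifact
val banana = (name: String) =>
  "org.w3" %%% name % "0.8.4" excludeAll ExclusionRule(organization = "org.scala-stm")

resolvers += "bblfish-snapshots" at "http://bblfish.net/work/repo/releases"
libraryDependencies ++= Seq("banana", "banana-rdf", "banana-plantain").map(banana)
```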
I am writing some tests, and in many cases I have an FTP/HTTP configuration.
I am working with Scala and the following libraries in my sbt:
"org.scalatest" %% "scalatest" % "3.0.1" % Test,
"org.scalamock" %% "scalamock" % "4.1.0" % Test,
I am building the mocked configuration by hand inside my test, for example:
val someConfig = SomeConfig(
endpoint = "",
user = "",
password = "",
companyName="",
proxy = ProxyConfig("", 2323)
)
But I feel it is not nice to do this for each configuration that I am going to be dealing with...
I would like to create the following:
val someConfig = mock[SomeConfig]
but when my code tries to reach the proxy property, which is a case class, it fails with a NullPointerException.
I would like to know how to mock case classes that contain other case classes and make my code a bit clearer. Is there a way to do this with MockFactory?
You can try to mock it like this:
val someConfig = mock[SomeConfig]
when(someConfig.proxy).thenReturn(ProxyConfig("", 2323))
So it will return ProxyConfig("", 2323) when you try to get someConfig.proxy.
The above code uses Mockito because of a known ScalaMock limitation: the parameters of a case class are translated into val fields, and ScalaMock is not able to mock vals, so I think this is not directly possible with ScalaMock.
Mockito does have this capability.
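Since SomeConfig is plain data, another way to cut the boilerplate without mocking at all is a shared fixture that is built once and adjusted per test with copy. A sketch, with hypothetical SomeConfig/ProxyConfig definitions inferred from the question:

```scala
// Hypothetical case class shapes, inferred from the question.
case class ProxyConfig(host: String, port: Int)
case class SomeConfig(
  endpoint: String,
  user: String,
  password: String,
  companyName: String,
  proxy: ProxyConfig
)

object TestFixtures {
  // One canonical "blank" config; each test overrides only what it cares about.
  val defaultConfig: SomeConfig =
    SomeConfig("", "", "", "", ProxyConfig("", 2323))
}

// Per-test customization via copy, no mocking framework involved.
val withProxy = TestFixtures.defaultConfig.copy(proxy = ProxyConfig("proxy.local", 8080))
```

Because nothing is mocked, accessing nested fields such as `proxy.port` can never throw a NullPointerException.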
I've got a simple code in Scala to try simulacrum lib:
import simulacrum._
@typeclass trait Semigroup[A] {
  @op("|+|") def append(x: A, y: A): A
}
But this doesn't work. The compiler says:
Error:(3, 2) macro implementation not found: macroTransform (the most common reason for that is that you cannot use macro implementations in the same compilation run that defines them)
@typeclass trait Semigroup[A] {
What can cause this error?
I do not create a macro, I just reuse an existing one.
My build.sbt file is simple:
name := "Macr"
version := "0.1"
scalaVersion := "2.12.5"
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
libraryDependencies += "com.github.mpilquist" %% "simulacrum" % "0.12.0"
As noted by Oleg Pyzhcov in the comments, macros don't work with Scala 2.12.4 and 2.12.5 when compiling on Java 9 or 10. However, this has been fixed in Scala 2.12.6, so upgrading should solve the problem.
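Concretely, the fix is the Scala version bump in build.sbt. A sketch of the updated file (bumping paradise to 2.1.1 is an assumption on my part, since 2.1.0 may not be published for the 2.12.6 full cross-version):

```scala
name := "Macr"
version := "0.1"
// 2.12.6 fixes macro expansion when compiling on Java 9/10
scalaVersion := "2.12.6"

addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)
libraryDependencies += "com.github.mpilquist" %% "simulacrum" % "0.12.0"
```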
I am trying to learn Scala. In the Squeryl documentation I came across the sign ++=. What does this sign mean? The code was:
libraryDependencies ++= Seq(
"org.squeryl" %% "squeryl" % "0.9.5-6",
yourDatabaseDependency
)
This is simply a method that appends a Seq of dependencies to the libraryDependencies setting, as compared to +=, which appends a single dependency (as opposed to a Seq).
For more info, you might want to check out the sbt docs
This isn't part of Scala itself; it's a method in SBT.
libraryDependencies is a SettingKey[Seq[ModuleID]], so take a glance at the API doc for SettingKey.
++= is one of the methods on SettingKey. Its return type is Setting.
As a general convention in Scala collections, ++= method takes a collection (right hand side) and puts it into "this" collection (left hand side). SBT uses collections for dependency lists and they are no exception.
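The convention can be seen with an ordinary mutable collection (plain Scala, no sbt required):

```scala
// The same convention on a plain mutable collection: += appends a single
// element, ++= appends a whole collection.
val deps = scala.collection.mutable.ListBuffer("scalatest")

deps += "squeryl"                 // one element
deps ++= Seq("h2", "postgresql")  // a whole Seq at once
```

sbt's ++= on libraryDependencies follows the same shape, except that it builds a new immutable Setting rather than mutating in place.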
Question says it all.
(Yet, the details of how to get access to the shift and reset operations has changed over the years. Old blog entries and Stack Overflow answers may have out of date information.)
See also What are Scala continuations and why use them? which talks about what you might want to do with shift and reset once you have them.
Scala 2.11
The easiest way is to use sbt:
scalaVersion := "2.11.6"
autoCompilerPlugins := true
addCompilerPlugin(
"org.scala-lang.plugins" % "scala-continuations-plugin_2.11.6" % "1.0.2")
libraryDependencies +=
"org.scala-lang.plugins" %% "scala-continuations-library" % "1.0.2"
scalacOptions += "-P:continuations:enable"
In your code (or the REPL), do import scala.util.continuations._
You can now use shift and reset to your heart's content.
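As a minimal illustration (this only compiles with the continuations plugin enabled, as configured above):

```scala
import scala.util.continuations._

// reset delimits the continuation; shift captures "the rest of the reset
// block" as a function k and chooses how (and how often) to call it.
val result: Int = reset {
  val x: Int = shift { (k: Int => Int) => k(k(1)) } // invoke the remainder twice
  x + 1
}
// Here k is (_ + 1), so k(k(1)) == 3 and result == 3.
```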
Historical information for Scala 2.8, 2.9, 2.10
You have to start scala (or scalac) with the -P:continuations:enable flag.
In your code, do import scala.util.continuations._
You can now use shift and reset to your heart's content.
If you're using sbt 0.7, see https://groups.google.com/forum/#!topic/simple-build-tool/Uj-7zl9n3f4
If you're using sbt 0.11+, see https://gist.github.com/1302944
If you're using maven, see http://scala-programming-language.1934581.n4.nabble.com/scala-using-continuations-plugin-with-2-8-0-RC1-and-maven-td2065949.html#a2065949
Non SBT solution:
scala -Xpluginsdir /.../scala/lib/ -P:continuations:enable
This works on Scala 2.11.6, but the plugin/library will no longer be included with Scala 2.12.