I'm trying to use Kafka Streams to process some data from a Kafka topic. The data comes from a topic that was written to by Kafka 0.11.0-something, which doesn't have the embedded timestamp. After some reading on the internet, I came to understand that I can solve this problem by extending TimestampExtractor in a custom class and passing it in the StreamsConfig.
I did so, like this:
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.streams.processor.TimestampExtractor

class MyEventTimestampExtractor extends TimestampExtractor {
  override def extract(record: ConsumerRecord[AnyRef, AnyRef], prev: Long) = {
    record.value() match {
      case w: String => 1000L
      case _         => throw new RuntimeException(s"Called for $record")
    }
  }
}
I based it off of this code on GitHub.
But I get this error when I do sbt run:
[error] /home/someuser/app/blahblah/src/main/scala/main.scala:34: class MyEventTimestampExtractor needs to be abstract, since method extract in trait TimestampExtractor of type (x$1: org.apache.kafka.clients.consumer.ConsumerRecord[Object,Object], x$2: Long)Long is not defined
[error] (Note that Long does not match Long)
[error] class MyEventTimestampExtractor extends TimestampExtractor {
[error] ^
[error] /home/someuser/app/blahblah/src/main/scala/main.scala:35: method extract overrides nothing.
[error] Note: the super classes of class MyEventTimestampExtractor contain the following, non final members named extract:
[error] def extract(x$1: org.apache.kafka.clients.consumer.ConsumerRecord[Object,Object],x$2: Long): Long
[error] override def extract(record: ConsumerRecord[AnyRef, AnyRef], prev: Long): Long = {
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
My build.sbt file is this:
name := "kafka streams experiment"
version := "1.0"
scalaVersion := "2.12.4"
libraryDependencies ++= Seq(
  "org.apache.kafka" % "kafka-streams" % "1.0.0"
)
I don't really understand the error, specifically the part that says Note that Long does not match Long. What could I be doing wrong? Thanks!
You need java.lang.Long, since this API is defined in Java, whereas you are using Scala's Long.
Try this (watch the type of the prev function argument):
override def extract(record: ConsumerRecord[AnyRef, AnyRef], prev: java.lang.Long) = {
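Spelled out, the corrected class plus its registration would look like the sketch below. The 1000L fallback is carried over from the question, StreamsSetup is a made-up name, and the config constant is the one from kafka-streams 1.0.0 (earlier releases named it differently), so double-check it against your version:

import java.util.Properties
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.streams.StreamsConfig
import org.apache.kafka.streams.processor.TimestampExtractor

class MyEventTimestampExtractor extends TimestampExtractor {
  // note java.lang.Long for the second parameter, matching the Java signature
  override def extract(record: ConsumerRecord[AnyRef, AnyRef], prev: java.lang.Long): Long =
    record.value() match {
      case _: String => 1000L // placeholder: derive the real timestamp from the payload here
      case _         => throw new RuntimeException(s"Called for $record")
    }
}

object StreamsSetup {
  val props = new Properties()
  props.put(StreamsConfig.DEFAULT_TIMESTAMP_EXTRACTOR_CLASS_CONFIG,
            classOf[MyEventTimestampExtractor].getName)
  // ... pass props to your StreamsConfig / KafkaStreams as usual
}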
I've defined a case class to be used as a schema for a Dataset in Spark.
I want to be able to refer to individual columns from that schema programmatically (vs. hardcoding their string values somewhere).
For example, for the following case class
final case class MySchema(id: Int, name: String, timestamp: Long)
I would like to auto-generate the following object:
object MySchema {
  val id = "id"
  val name = "name"
  val timestamp = "timestamp"
}
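(For context, a hypothetical usage sketch: with the generated companion, column references for a Dataset[MySchema] live in one place instead of being scattered around as string literals.)

import org.apache.spark.sql.Dataset
import org.apache.spark.sql.functions.col

def recentIds(ds: Dataset[MySchema], cutoff: Long) =
  ds.filter(col(MySchema.timestamp) > cutoff)
    .select(col(MySchema.id), col(MySchema.name))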
The macro approach outlined here appears to be what I want, but it won't compile under Scala 2.12. It gives the following errors, which are completely baffling to me and show up in a total of 2 Google results with 0 fixes.
[error] pattern var qq$macro$2 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$2@_`
[error] case (c @ q"$_ class $tpname[..$_] $_(...$params) extends { ..$_ } with ..$_ { $_ => ..$_ }") :: Nil =>
[error] ^
[error] pattern var qq$macro$19 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$19@_`
[error] case (c @ q"$_ class $_[..$_] $_(...$params) extends { ..$_ } with ..$_ { $_ => ..$_ }") ::
[error] ^
[error] pattern var qq$macro$27 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$27@_`
[error] q"$mods object $tname extends { ..$earlydefns } with ..$parents { $self => ..$body }" :: Nil =>
[error] ^
Suppressing the warning as outlined won't work because the macro numbers change every time I compile.
It's also worth noting that the similar SO answer here runs into the same compiler errors shown above.
IntelliJ also complains about several parts of the macro that the compiler doesn't complain about, but that's not really an issue if I can get it to compile.
Is there a way to fix that macro approach to work in Scala 2.12, or is there a better Scala 2.12 way to do this? (I can't use Scala 2.13 or higher due to compute-environment constraints.)
I just checked that the macro still works in both Scala 2.13.10 and 2.12.17.
Most probably, you didn't set up your project for macro annotations:
build.sbt
//ThisBuild / scalaVersion := "2.13.10"
ThisBuild / scalaVersion := "2.12.17"

lazy val macroAnnotationSettings = Seq(
  scalacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, v)) if v >= 13 => Seq("-Ymacro-annotations") // for Scala 2.13
    case _                       => Nil
  }),
  libraryDependencies ++= (CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, v)) if v <= 12 => // for Scala 2.12
      Seq(compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full))
    case _ => Nil
  })
)

lazy val core = project
  .settings(
    macroAnnotationSettings,
    scalacOptions ++= Seq(
      "-Ymacro-debug-lite", // optional, convenient to see how macros are expanded
    ),
  )
  .dependsOn(macros) // you must split your project into subprojects because macros must be compiled before core

lazy val macros = project
  .settings(
    macroAnnotationSettings,
    libraryDependencies ++= Seq(
      scalaOrganization.value % "scala-reflect" % scalaVersion.value, // necessary for macros
    ),
  )
project structure:
core
  src
    main
      scala
        Main.scala
macros
  src
    main
      scala
        Macros.scala
Then just do sbt clean compile.
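If you want a quick smoke test that the wiring works, independent of the field-name macro itself, a minimal pass-through macro annotation (essentially the identity example from the Scala macro-annotations documentation linked below; passthrough is a made-up name) expands to its annottees unchanged, so if core compiles, macro annotations are enabled:

macros/src/main/scala/Macros.scala

import scala.annotation.{StaticAnnotation, compileTimeOnly}
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

@compileTimeOnly("macro annotations not enabled (missing paradise plugin / -Ymacro-annotations)")
class passthrough extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro PassthroughMacro.impl
}

object PassthroughMacro {
  def impl(c: whitebox.Context)(annottees: c.Tree*): c.Tree = {
    import c.universe._
    q"..$annottees" // return the annotated definitions unchanged
  }
}

core/src/main/scala/Main.scala

// both files sit in the default package here, so no import is needed
@passthrough
final case class MySchema(id: Int, name: String, timestamp: Long)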
The whole project: https://gist.github.com/DmytroMitin/2d9dbd6441ebf167aa127b80fb516afd
sbt documentation: https://www.scala-sbt.org/1.x/docs/Macro-Projects.html
Scala documentation: https://docs.scala-lang.org/overviews/macros/annotations.html
Examples of build.sbt:
https://github.com/typelevel/simulacrum/blob/master/build.sbt
https://github.com/DmytroMitin/AUXify/blob/master/build.sbt
I am reading this tutorial on tagless final.
Based on this, I have defined my dependencies as:
object Dependencies {
  lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.5"
  lazy val cats      = "org.typelevel" %% "cats-core" % "1.2.0"
  lazy val monix     = "io.monix" %% "monix" % "2.3.3"
  lazy val monixCats = "io.monix" %% "monix-cats" % "2.3.3"
}
The following is my code:
// future
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent._
import scala.concurrent.duration._
// cats
import cats.Monad
import cats.implicits._
// monix
import monix.eval.Task
import monix.cats._
import monix.cats.reverse._

// ProductId and Product are domain types defined elsewhere in the application
trait ProductRepository[M[_]] {
  def findProduct(productId: ProductId): M[Option[Product]]
  def saveProduct(product: Product): M[Unit]
  def incrementProductSales(productId: ProductId, quantity: Long): M[Unit]
}

class ProductRepositoryWithFuture extends ProductRepository[Future] {
  def findProduct(productId: ProductId): Future[Option[Product]] = {
    Future.successful(Some(Product(productId, "foo")))
  }
  def saveProduct(product: Product): Future[Unit] = {
    Future.successful(())
  }
  def incrementProductSales(productId: ProductId, quantity: Long): Future[Unit] = {
    Future.successful(())
  }
}

class ProductRepositoryWithTask extends ProductRepository[Task] {
  def findProduct(productId: ProductId): Task[Option[Product]] = {
    Task.now(Some(Product(productId, "foo")))
  }
  def saveProduct(product: Product): Task[Unit] = {
    Task.unit
  }
  def incrementProductSales(productId: ProductId, quantity: Long): Task[Unit] = {
    Task.unit
  }
}
But I get a bunch of errors. It seems that the version of cats which I am using is not compatible with the one Monix uses.
I also tried removing my cats dependency and importing just monix, so that Monix pulls in its own version of cats, but even that doesn't compile.
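For reference, the use site that the errors below point at looks roughly like this (reconstructed from the error messages; updateProduct is just a name I made up):

def updateProduct[M[_]: Monad](repo: ProductRepository[M], id: ProductId, name: String): M[Option[Product]] =
  repo.findProduct(id).flatMap {
    case Some(p) =>
      val newProduct = p.copy(name = name)
      repo.saveProduct(newProduct).map(_ => Some(p))
    case None =>
      Monad[M].pure(None)
  }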
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:54:24: Symbol 'type cats.MonadFilter' is missing from the classpath.
[error] This symbol is required by 'method monix.cats.MonixToCatsCore7.monixToCatsMonadFilter'.
[error] Make sure that type MonadFilter is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'MonixToCatsCore7.class' was compiled against an incompatible version of cats.
[error] repo.findProduct(id).flatMap{
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:54:23: diverging implicit expansion for type monix.types.Comonad[M]
[error] starting with method catsToMonixComonad in trait CatsCoreToMonix5
[error] repo.findProduct(id).flatMap{
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:54:28: value flatMap is not a member of type parameter M[Option[example.Application.Product]]
[error] repo.findProduct(id).flatMap{
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:56:30: value copy is not a member of Any
[error] val newProduct = p.copy(name = name)
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:56:40: reassignment to val
[error] val newProduct = p.copy(name = name)
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:57:27: diverging implicit expansion for type monix.types.MonadError[M,E]
[error] starting with method catsToMonixMonadError in trait CatsCoreToMonix3
[error] repo.saveProduct(newProduct).map(_ => Some(p))
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:57:40: value map is not a member of type parameter M[Unit]
[error] repo.saveProduct(newProduct).map(_ => Some(p))
[error] ^
[error] /Users/foobar/code/tagless/src/main/scala/example/Hello.scala:59:16: diverging implicit expansion for type cats.Comonad[M]
[error] starting with method monixToCatsComonad in trait MonixToCatsCore5
[error] Monad[M].pure(None)
[error] ^
[error] 8 errors found
The errors are caused by incompatibilities between your dependencies.
For example, monix 2.3.3 depends on cats 0.9.0, while you're trying to use 1.2.0, which is binary-incompatible.
You should try either upgrading monix to 3.x or downgrading cats to 0.9.0.
P.S. The transition from cats 0.9.0 to 1.x brought a lot of breaking changes, and you have to make sure that all the libraries you're using are compiled against the same (or at least a binary-compatible) version of cats.
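Applied to the Dependencies object from the question, the downgrade option might look like this (a sketch; verify the exact cats version your Monix release was built against):

object Dependencies {
  lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.5"
  // monix 2.3.3 was built against cats 0.9.0, so pin cats to that version
  lazy val cats      = "org.typelevel" %% "cats-core"  % "0.9.0"
  lazy val monix     = "io.monix"      %% "monix"      % "2.3.3"
  lazy val monixCats = "io.monix"      %% "monix-cats" % "2.3.3"
  // alternatively, upgrade to monix 3.x, which integrates with cats 1.x+
  // directly and makes the monix-cats shim unnecessary
}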
I'm using the Scala driver to do IO operations with MongoDB. My Scala version is 2.11.11 and the MongoDB driver version is 2.2.0.
I took the example from the documentation about ADTs:
sealed class Tree
case class Branch(b1: Tree, b2: Tree, value: Int) extends Tree
case class Leaf(value: Int) extends Tree
val codecRegistry = fromRegistries( fromProviders(classOf[Tree]), DEFAULT_CODEC_REGISTRY )
This code doesn't compile:
No known subclasses of the sealed class
[error] val codecRegistry = fromRegistries( fromProviders(classOf[Tree]), DEFAULT_CODEC_REGISTRY )
[error] ^
[error] knownDirectSubclasses of Tree observed before subclass Branch registered
[error] knownDirectSubclasses of Tree observed before subclass Leaf registered
Did I miss something?
Update
Below is a complete example of what I'm trying to do.
build.sbt
name := "mongodb-driver-test"
version := "1.0"
scalaVersion := "2.11.11"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.2.0"
file Models.scala
import org.mongodb.scala.bson.codecs.{DEFAULT_CODEC_REGISTRY, Macros}
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}

/**
 * Created by alifirat on 02/01/18.
 */
object Models {
  sealed class Tree
  case class Branch(b1: Tree, b2: Tree, value: Int) extends Tree
  case class Leaf(value: Int) extends Tree

  val treeCodec = Macros.createCodecProvider[Tree]()
  val treeCodecRegistry = fromRegistries(fromProviders(treeCodec), DEFAULT_CODEC_REGISTRY)
}
Then just do:
sbt compile
You will get :
[error] val treeCodec = Macros.createCodecProvider[Tree]()
[error] ^
[error] knownDirectSubclasses of Tree observed before subclass Branch registered
[error] knownDirectSubclasses of Tree observed before subclass Leaf registered
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
If I change the Scala version to 2.12.0, I don't get any errors at compile time...
I'm using driver version 2.6.0 and Scala version 2.12.8 and still get the same problem.
My workaround is to remove the keyword sealed in front of that sealed class, compile, put it back, and then compile again. But it's very cumbersome.
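Another workaround that is sometimes reported to help with knownDirectSubclasses issues (it depends on macro expansion order, so it may or may not work in your build) is to register providers for the concrete subclasses explicitly alongside the sealed parent; the implicit classOf conversions come from Macros._:

import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}
import org.mongodb.scala.bson.codecs.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._

val treeCodecRegistry = fromRegistries(
  fromProviders(classOf[Branch], classOf[Leaf], classOf[Tree]),
  DEFAULT_CODEC_REGISTRY
)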
I'm running into problems in my open-source project using macros to generate some code. Everything works fine if I use c.untypecheck, but ideally I'd prefer not to have to do that.
This is the relevant code: https://github.com/outr/reactify/blob/master/shared/src/main/scala/com/outr/reactify/Macros.scala#L46
If I remove the c.untypecheck I get the following compile-time error:
[error] (reactifyJVM/test:compileIncremental) java.lang.AssertionError: assertion failed:
[error] transformCaseApply: name = previousVal tree = previousVal / class scala.reflect.internal.Trees$Ident
[error] while compiling: /home/mhicks/projects/open-source/reactify/shared/src/test/scala/specs/BasicSpec.scala
[error] during phase: refchecks
[error] library version: version 2.12.1
[error] compiler version: version 2.12.1
[error] reconstructed args: -classpath /home/mhicks/projects/open-source/reactify/jvm/target/scala-2.12/test-classes:/home/mhicks/projects/open-source/reactify/jvm/target/scala-2.12/classes:/home/mhicks/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.12.1.jar:/home/mhicks/.ivy2/cache/org.scalatest/scalatest_2.12/bundles/scalatest_2.12-3.0.1.jar:/home/mhicks/.ivy2/cache/org.scalactic/scalactic_2.12/bundles/scalactic_2.12-3.0.1.jar:/home/mhicks/.ivy2/cache/org.scala-lang.modules/scala-xml_2.12/bundles/scala-xml_2.12-1.0.5.jar:/home/mhicks/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.12/bundles/scala-parser-combinators_2.12-1.0.4.jar -bootclasspath /usr/java/jdk1.8.0_92/jre/lib/resources.jar:/usr/java/jdk1.8.0_92/jre/lib/rt.jar:/usr/java/jdk1.8.0_92/jre/lib/sunrsasign.jar:/usr/java/jdk1.8.0_92/jre/lib/jsse.jar:/usr/java/jdk1.8.0_92/jre/lib/jce.jar:/usr/java/jdk1.8.0_92/jre/lib/charsets.jar:/usr/java/jdk1.8.0_92/jre/lib/jfr.jar:/usr/java/jdk1.8.0_92/jre/classes:/home/mhicks/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.1.jar
[error]
[error] last tree to typer: TypeTree(class Position)
[error] tree position: line 148 of /home/mhicks/projects/open-source/reactify/shared/src/test/scala/specs/BasicSpec.scala
[error] tree tpe: org.scalactic.source.Position
[error] symbol: case class Position in package source
[error] symbol definition: case class Position extends Product with Serializable (a ClassSymbol)
[error] symbol package: org.scalactic.source
[error] symbol owners: class Position
[error] call site: <$anon: com.outr.reactify.ChangeListener[Int]> in package specs
[error]
[error] == Source file context for tree position ==
[error]
[error] 145 current should be(15)
[error] 146 }
[error] 147 "observe a complex change" in {
[error] 148 val v1 = Var(5)
[error] 149 val v2 = Var(10)
[error] 150 val v3 = Var(v1 + v2)
[error] 151 var changed = 0
[error] Total time: 1 s, completed Jan 31, 2017 4:43:03 PM
If I add it back, everything compiles and works just fine. In more complex use-cases I've been encountering some compile-time issues (Could not find proxy for ...), and I think this might be the reason.
Any suggestions would be greatly appreciated.
You're introducing an untyped tree into a typed tree.
The incoming tree is typechecked, and then the outgoing tree (that your macro emits) is typechecked again, but the typer does not descend into a tree that is already typechecked (i.e., that has a type already assigned to it).
Because you're introducing new symbols, you can't just use the incoming context to typecheck your reference.
So the simplest solution is what you arrived at: untypecheck the outgoing tree. It's also sufficient to untypecheck just the transformed tree, to allow the typer to descend into your new, untyped tree.
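Concretely, using the names from the setStateChannel code in the edit below, that narrower fix is a one-liner (sketch):

// untypecheck only the subtree the transformer rewrote; the typer will
// re-typecheck exactly that part and leave the rest of the tree alone
val res = q"$channel.update(List(..$observables), ${c.untypecheck(transformed)})"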
I had to reduce the exploding test by commenting out code. It's unfortunate that it's not immediately obvious what source line causes the error. Maybe it's more obvious if you're familiar with the macro involved.
class Sample {
  def sample(): Unit = {
    val v = Var(5)
    v := v + 5
  }
}
The tree in question, from -Xprint:typer -Yshow-trees:
Apply( // def +(x: Int): Int in class Int, tree.tpe=Int
  com.outr.reactify.`package`.state2Value[Int](previousVal)."$plus" // def +(x: Int): Int in class Int, tree.tpe=(x: Int)Int
  5
)
Also worth mentioning that it was easier to write a quick compile script with the "reconstructed args" in the error message, to eliminate sbt incremental compilation, ScalaTest macros and other mysteries.
Edit: the API for setting the symbol and type by hand:
def setStateChannel(value: c.Tree): c.Tree = {
  val observables = retrieveObservables(c)(value)
  val channel = c.prefix.tree
  val selfReference = observables.exists(_.equalsStructure(channel))

  val untyped =
    q"""
      val previousValue = com.outr.reactify.State.internalFunction($channel)
      val previousVal = com.outr.reactify.Val(previousValue())
    """
  val retyped = c.typecheck(untyped)

  val transformed = if (selfReference) {
    val transformer = new Transformer {
      override def transform(tree: c.universe.Tree): c.universe.Tree = if (tree.equalsStructure(channel)) {
        val t = q"previousVal"
        val Block(_ :: v :: Nil, _) = retyped
        c.internal.setSymbol(t, v.symbol)
        c.internal.setType(t, v.tpe)
      } else {
        super.transform(tree)
      }
    }
    transformer.transform(value)
  } else {
    value
  }

  val res = q"$channel.update(List(..$observables), $transformed)"
  q"$retyped ; $res"
}
I'm trying to write a simple custom matcher for NodeSeq with ScalaTest 2.0.M5b.
package test

import org.scalatest.matchers.{MatchResult, Matcher, ShouldMatchers}
import scala.xml.NodeSeq
import org.scalatest.FunSpec

class MySpec extends FunSpec with ShouldMatchers with MyMatcher {
  describe("where is wrong?") {
    it("showOK") {
      val xml = <span>abc</span>
      xml should contn("b")
    }
  }
}

trait MyMatcher {
  class XmlMatcher(str: String) extends Matcher[NodeSeq] {
    def apply(xml: NodeSeq) = {
      val x = xml.toString.contains(str)
      MatchResult(
        x,
        "aaa",
        "bbb"
      )
    }
  }
  def contn(str: String) = new XmlMatcher(str)
}
When I compile it, it reports an error:
[error] /Users/freewind/src/test/scala/test/MyMacher.scala:14: overloaded method value should with alternatives:
[error] (beWord: MySpec.this.BeWord)MySpec.this.ResultOfBeWordForAnyRef[scala.collection.GenSeq[scala.xml.Node]] <and>
[error] (notWord: MySpec.this.NotWord)MySpec.this.ResultOfNotWordForAnyRef[scala.collection.GenSeq[scala.xml.Node]] <and>
[error] (haveWord: MySpec.this.HaveWord)MySpec.this.ResultOfHaveWordForSeq[scala.xml.Node] <and>
[error] (rightMatcher: org.scalatest.matchers.Matcher[scala.collection.GenSeq[scala.xml.Node]])Unit
[error] cannot be applied to (MySpec.this.XmlMatcher)
[error] xml should contn("b")
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
Where is it wrong?
Update:
The build.sbt file I use:
name := "scalatest-test"
scalaVersion := "2.10.1"
version := "1.0"
resolvers ++= Seq(
  "snapshots" at "http://oss.sonatype.org/content/repositories/snapshots",
  "releases" at "http://oss.sonatype.org/content/repositories/releases",
  "googlecode" at "http://sass-java.googlecode.com/svn/repo"
)
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0.M5b" % "test"
And a demo project: https://github.com/freewind/scalatest-test
For the reason why the Scala compiler complains, see this answer.
But it seems that the ScalaTest API has changed quite a bit since then, so both proposed solutions need some modification (tested with ScalaTest 2.0.M5b):
Replace all instances of NodeSeq with GenSeq[Node] so that the types match everywhere, as sketched below.
See the SeqShouldWrapper class of ScalaTest.
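Applied to the matcher from the question, the first option looks roughly like this:

import scala.collection.GenSeq
import scala.xml.Node
import org.scalatest.matchers.{MatchResult, Matcher}

trait MyMatcher {
  // Matcher[GenSeq[Node]] matches the type that the should syntax actually sees
  class XmlMatcher(str: String) extends Matcher[GenSeq[Node]] {
    def apply(xml: GenSeq[Node]) = {
      val x = xml.toString.contains(str)
      MatchResult(x, "aaa", "bbb")
    }
  }
  def contn(str: String) = new XmlMatcher(str)
}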
Alternatively, wrap xml explicitly in the conversion function, i.e. manually set the required type. I don't recommend this, though, because it makes client code ugly:
new AnyRefShouldWrapper(xml).should(contn("b"))
BTW, it is good to have a small but complete project on GitHub for others to tweak. It makes this question much more attractive.