Simulacrum: macro implementation not found - scala

I've got a simple code in Scala to try simulacrum lib:
import simulacrum._
@typeclass trait Semigroup[A] {
  @op("|+|") def append(x: A, y: A): A
}
But this doesn't compile. The compiler says:
Error:(3, 2) macro implementation not found: macroTransform (the most
common reason for that is that you cannot use macro implementations in
the same compilation run that defines them)
@typeclass trait Semigroup[A] {
What can cause this error?
I'm not defining a macro myself; I'm just reusing an existing one.
My build.sbt file is simple:
name := "Macr"
version := "0.1"
scalaVersion := "2.12.5"
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
libraryDependencies += "com.github.mpilquist" %% "simulacrum" % "0.12.0"

As noted by Oleg Pyzhcov in the comments, macros don't work with Scala 2.12.4 and 2.12.5 when compiling on Java 9 or 10. However, this has been fixed in Scala 2.12.6, so upgrading should solve the problem.
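A minimal build.sbt sketch with just the Scala version bumped. Note that the paradise plugin is cross-published per Scala patch release, so the 2.1.1 shown below is an assumption: check Maven Central for a build that matches 2.12.6.
name := "Macr"
version := "0.1"
scalaVersion := "2.12.6"
// 2.1.1 is an assumed plugin version; use whichever paradise build exists for 2.12.6
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)
libraryDependencies += "com.github.mpilquist" %% "simulacrum" % "0.12.0"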

Scala Meta: Confused about the versions

In the tutorial you find two versions for Scalameta.
lazy val MetaVersion = "3.7.2"
lazy val MetaVersion1 = "1.8.0"
I am a bit confused, as they seem to refer to the same project:
lazy val scalameta1 = "org.scalameta" %% "scalameta" % MetaVersion1
lazy val scalameta = "org.scalameta" %% "scalameta" % MetaVersion
Can somebody point out the difference, and when you use which one of these?
The Tutorial only mentions "3.7.2", but with that I got the exception
ERROR: new-style ("inline") macros require scala.meta
explained here: new-style-inline-macros-require-scala-meta
3.7.2 is the current version of scalameta (actually already 3.7.4).
1.8.0 is the last version of scalameta that worked with scalameta macro annotations through the scalameta paradise compiler plugin.
So if you need the latest version of scalameta, use 3.7.4. If you need scalameta macro annotations, use 1.8.0.
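A sketch of the two setups in build.sbt, to be picked one at a time rather than mixed in the same project. The scalameta paradise plugin version below is an assumption (it is cross-published per Scala patch release), so check the scalameta macro-annotation docs for the build matching your Scala version:
// Setup A: latest scalameta, for parsing and analysing Scala source
libraryDependencies += "org.scalameta" %% "scalameta" % "3.7.4"

// Setup B: scalameta macro annotations, i.e. old-style "inline" macros
// (plugin version 3.0.0-M11 is an assumption; adjust to your Scala patch release)
libraryDependencies += "org.scalameta" %% "scalameta" % "1.8.0"
addCompilerPlugin("org.scalameta" % "paradise" % "3.0.0-M11" cross CrossVersion.full)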

Why does adding `import cats.instances.future._` result in a compilation error for the implicit Functor[Future]?

The following Scala code uses cats and compiles fine:
import cats.implicits._
import cats.Functor
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future
object Hello extends App {
Functor[Future].map(Future("hello"))(_ + "!")
}
But if I add this import:
import cats.instances.future._
The compiler reports this error:
Error:(18, 10) could not find implicit value for parameter instance: cats.Functor[scala.concurrent.Future]
Functor[Future].map(Future("hello"))(_ + "!")
Why does this happen, and how can I debug it to find the reason? I've tried everything I know, but can't find anything.
The build.sbt file is:
name := "Cats Implicit Functor of Future Compliation Error Demo"
version := "0.1"
organization := "org.my"
scalaVersion := "2.12.4"
sbtVersion := "1.0.4"
libraryDependencies ++= Seq(
"org.typelevel" %% "cats-core" % "1.0.1"
)
The object cats.implicits has the FutureInstances trait as a linear supertype. The FutureInstances has an implicit catsStdInstancesForFuture method, which produces a Monad[Future], which in turn is a Functor[Future].
On the other hand, the object cats.instances.future also mixes in FutureInstances, so it again provides an implicit method catsStdInstancesForFuture, but through another pathway.
Now the compiler has two possibilities to generate a Functor[Future]:
by invoking cats.instances.future.catsStdInstancesForFuture
by invoking cats.implicits.catsStdInstancesForFuture
Since it cannot decide which one to take, it exits with an error message.
To avoid that, don't use cats.implicits._ together with cats.instances.future._. Either omit one of the imports, or use the selective form
`import packagename.objectname.{member1name, member2name}`
to pull in only the implicits that you need.
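For example, a minimal sketch of the fixed program that keeps only the instance import (keeping only cats.implicits._ instead would work just as well):
import cats.Functor
import cats.instances.future._ // now the single source of Functor[Future] in scope
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future

object Hello extends App {
  Functor[Future].map(Future("hello"))(_ + "!")
}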
Adding "-print" to scalacOptions could help when debugging implicits:
scalacOptions ++= Seq(
...
"-print",
...
)
It will print out the desugared code, with the cats.implicits and cats.instances pieces inserted explicitly everywhere. Unfortunately, it tends to produce quite a lot of noise.
The more fundamental reason why this happens is that there is no way to define higher-dimensional-cells (kind-of "homotopies") between the two (equivalent) pathways that lead to a Functor[Future]. If we had a possibility to tell the compiler that it doesn't matter which path to take, then everything would be much nicer. Since we can't do it, we have to make sure that there is always only one way to generate an implicit Functor[Future].
The problem is that the instances are imported twice, so scalac cannot disambiguate between them: it doesn't know which one to use and fails.
So either you use the implicits._ import or you import specific instances with instances.<datatype>._, but never both!
You can find a more in-depth look at cats imports here: https://typelevel.org/cats/typeclasses/imports.html

Issue with Kafka stream filtering

I'm trying to run a basic app from the following example:
https://github.com/confluentinc/examples/blob/3.3.x/kafka-streams/src/main/scala/io/confluent/examples/streams/MapFunctionScalaExample.scala
However I'm getting an exception at this line:
// Variant 1: using `mapValues`
val uppercasedWithMapValues: KStream[Array[Byte], String] = textLines.mapValues(_.toUpperCase())
Error:(33, 25) missing parameter type for expanded function ((x$1) =>
x$1.toUpperCase())
textLines.mapValues(_.toUpperCase())
The error I get when hovering the cursor over the code:
Type mismatch, expected: ValueMapper[_ >: String, _ <: NotInferedVR],
actual: (Any) => Any Cannot resolve symbol toUpperCase
Contents of my sbt file:
name := "untitled1"
version := "0.1"
scalaVersion := "2.11.11"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.11
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-streams
libraryDependencies += "org.apache.kafka" % "kafka-streams" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/connect-api
libraryDependencies += "org.apache.kafka" % "connect-api" % "0.11.0.0"
I'm really not sure how to proceed, as I'm quite new to Scala. I'd like to know what the issue is and how to fix it.
From http://docs.confluent.io/current/streams/faq.html#scala-compile-error-no-type-parameter-java-defined-trait-is-invariant-in-type-t
The root cause of this problem is Scala-Java interoperability: the Kafka Streams API is implemented in Java, but your application is written in Scala. Notably, this problem is caused by how the type systems of Java and Scala interact. Generic wildcards in Java, for example, often cause such Scala issues.
To fix the problem you would need to declare types explicitly in your Scala application in order for the code to compile. For example, you may need to break a single statement that chains multiple DSL operations into multiple statements, where each statement explicitly declares the respective return types. The StreamToTableJoinScalaIntegrationTest demonstrates how the types of return variables are explicitly declared.
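A minimal sketch of that workaround for the failing line. Here textLines is assumed to be the KStream[Array[Byte], String] from the linked example; spelling out an explicit ValueMapper leaves the compiler nothing to infer against the Java wildcard signature:
import org.apache.kafka.streams.kstream.{KStream, ValueMapper}

// explicit ValueMapper instead of a Scala lambda, so all types are declared up front
val uppercasedWithMapValues: KStream[Array[Byte], String] =
  textLines.mapValues(new ValueMapper[String, String] {
    override def apply(value: String): String = value.toUpperCase()
  })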
Update
Kafka 2.0 (to be released in June) contains a proper Scala API that avoids those issues. See https://cwiki.apache.org/confluence/display/KAFKA/KIP-270+-+A+Scala+Wrapper+Library+for+Kafka+Streams

How to delete useless registers generated by Chisel verilog backend?

When I synthesize a Verilog module generated by Chisel, I get a lot of warnings of this type:
Warning (10036): Verilog HDL or VHDL warning at Polynomial.v(26): object "T98" assigned a value but never read
Is there an option to remove this kind of "useless" signal when generating the Verilog code?
I generate the Verilog with this option in my Scala code:
object PolynomialMain {
def main(args: Array[String]): Unit = {
chiselMain(Array("--backend", "v"), () => Module(new Polynomial()))
}
}
And here is my build.sbt:
libraryDependencies += "edu.berkeley.cs" %% "chisel" % "2.3-SNAPSHOT"
scalaVersion := "2.11.6"
scalacOptions ++= Seq("-deprecation",
"-feature",
"-unchecked",
"-language:reflectiveCalls")
I don't think so (not yet, that is).
Chisel punts a lot of the heavy lifting onto the lower-level tools. For example, your synthesis tools will happily ignore extra, unused registers, and it makes Chisel simpler to not do the analysis itself.
However, this is definitely something that should be addressed in the future, as it's bad form to generate code with so many warnings!

Reproducing Shapeless examples of HList-style operations on standard tuples

I'm very new to Scala, and have been looking at the shapeless package to provide HList-like operations for Scala's tuples.
I'm running scala 2.10.5, and I've successfully installed the package (version 2.2.0-RC6) as well as all dependencies.
When I try to run the following example (from the shapeless feature overview) in the REPL,
scala> import shapeless._; import syntax.std.tuple._
scala> (23, "foo", true).head
I get the following error message:
<console>:17: error: could not find implicit value for parameter c: shapeless.ops.tuple.IsComposite[(Int, String, Boolean)]
(23, "foo", true).head
I'll bet this is a silly error on my part, and I've been digging through a lot of forums on this.
What am I missing?
Thanks in advance for your help.
You're likely missing the macro paradise dependency. Without it I get the same error you see; with it, the example compiles.
Your build.sbt should include something like this:
libraryDependencies ++= Seq(
"com.chuusai" %% "shapeless" % "2.2.0-RC6",
compilerPlugin("org.scalamacros" % "paradise" % "2.0.1" cross CrossVersion.full)
)
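With the plugin in place, the feature-overview example should resolve the IsComposite instance, and head simply returns the tuple's first element, roughly:
scala> import shapeless._; import syntax.std.tuple._
scala> (23, "foo", true).head
res0: Int = 23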