I have two cases where DSL-specific code needs to be compiled
into Scala and/or Java code. Is there a ready
reference example that demonstrates
how to specify the dependency relation
and execute the relevant command to compile the DSL?
One case is that of an ONCRPC IDL compiler. A ".x"
IDL specification file is compiled into Java. The other case
is compiling state map DSL (SMC) code into Scala.
Learning Scala and functional programming while at the
same time applying them to SBT seems steep, and I need help.
I would like to cross-build some of my Bazel targets to Scala 2.12 and 2.13. As a further point of complexity, I need to be able to express cross-target dependencies (e.g. some 2.13 target may have a Bazel dependency on a 2.12 target).
Note: this isn't a regular library dependency (e.g. with the dependency's 2.12-built JAR showing up on the class path when compiling the 2.13 JAR), as that would almost surely result in issues due to having two incompatible versions of the Scala standard library on the class path. Rather, this is just a case where I need the dependency JAR built so I can use it in some integration tests in the 2.13 target.
What I've found online so far...
This issue from rules_scala suggests that it doesn't support baking the Scala version into the target; instead you have to pick the Scala version globally.
This Databricks post has a cross-building section that is exactly what I think I would like (e.g. one target generated per library per supported Scala version), but the snippets in that post don't seem to be backed by any runnable Bazel code.
This later post by Databricks also hints at a cross_scala_lib rule, but also doesn't have any accompanying code.
https://docs.scala-lang.org/overviews/compiler-options/index.html says
Scala compiler scalac offers various compiler options, also referred to as compiler flags, to change how to compile your program.
Nowadays, most people are not running scalac from the command line. Instead, they use sbt, an IDE, and other tools as their interface to the compiler. Therefore they may not even have scalac installed, and won’t think to do man scalac.
Does "the compiler" refer to scalac?
If yes, is "they use sbt, an IDE, and other tools as their interface to the compiler" contrary to "therefore they may not even have scalac installed"?
Does sbt rely on scalac?
Thanks.
The Scala compiler can be accessed programmatically via an API packaged in the scala-compiler.jar dependency, hence tools such as IDEs and sbt can implement their own client frontends over this API to drive compiler functionality. scalac itself is just a shell script that executes the scala.tools.nsc.Main class from scala-compiler.jar.
Does sbt rely on scalac?
No, sbt uses the compiler API directly. One of the key concepts to understand regarding sbt is that the build definition is itself Scala code, either vanilla or DSL, but Scala nevertheless. The version of Scala used to compile the build definition is separate from the version of Scala used to compile the project proper. The build definition source code in build.sbt and project/*.scala is compiled using the Scala version specified indirectly via the sbt.version=1.2.8 setting in project/build.properties, whilst the project source code proper in src/main/scala/* is compiled using the Scala version specified directly via the scalaVersion := "2.13.1" setting in build.sbt. Note how they can indeed differ. Think of the build definition as simply another Scala app which uses the sbt API for its implementation.
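Concretely, the two settings the answer refers to live in different files:

# project/build.properties — fixes the sbt release, and with it the Scala
# version used to compile build.sbt and project/*.scala
sbt.version=1.2.8

// build.sbt — fixes the Scala version used to compile src/main/scala/*
scalaVersion := "2.13.1"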
Suppose I have a Scala compile-time macro that I find useful and would like to share it (I do). How do I create a JAR file that when loaded into another project would execute the macro when compiling the new project?
Specifically, I've made a StaticAnnotation that rewrites the AST of the class that it wraps before compile time. This works in my Maven build (macro defined in the main directory, runs on test cases in the test directory) because I have
<compilerPlugins>
  <compilerPlugin>
    <groupId>org.scalamacros</groupId>
    <artifactId>paradise_2.10.5</artifactId>
    <version>2.1.0-M5</version>
  </compilerPlugin>
</compilerPlugins>
in my scala-maven-plugin. (I'm starting with a Scala 2.10 project and if it works, will provide both 2.10 and 2.11.)
But if I put the resulting JAR on a Scala console classpath, in a Scala script, or into another Maven project (without special compiler plugins), it simply ignores the macro: the AST does not get overwritten and my compile-time println statements don't execute. If I use the @compileTimeOnly annotation on my macro (new in Scala 2.11), then it complains with the @compileTimeOnly error message.
Do I really need to tell my users to add compiler plugins in their pom.xml files, as well as alternate instructions for SBT and other build tools? Other packages containing macros (MacWire, Log4s) don't come with complicated build instructions: they just say, "point to this dependency in Maven Central." I couldn't find the magic in their build process that makes this work. What am I missing?
If you're relying on a macro-paradise-only feature then yes, you do need to tell your users to add compiler plugins. See http://docs.scala-lang.org/overviews/macros/annotations.html . The projects you mention are only using the scala compiler's built-in (non-paradise) macro features, not macro annotations.
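For sbt users, the equivalent of the Maven compiler-plugin block above is the well-known addCompilerPlugin incantation (the version shown matches the question's 2.10.x setup):

// build.sbt of every project that *uses* the macro annotations
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0-M5" cross CrossVersion.full)

This has to go in each consuming build, which is exactly why the instructions must be passed on to users.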
I've created Java annotations (since I need run-time retention) under $PROJECT/src/main/java, and my Scala code which uses these Java annotations is under $PROJECT/src/main/scala. The Java annotation thus created also makes use of a Java enum as its value.
If I compile the project, sbt doesn't seem to compile the Java annotations first and errors out on each usage of the enum in annotations. If I comment out all usages of the Java enum in annotations in the Scala code and compile, then uncomment the enum usages and compile again, it all works fine.
How do I ensure that sbt compiles my java annotations and enum (i.e. $PROJECT/src/main/java) before attempting to compile scala code when doing a clean build?
EDIT: I have a bare-bones build.sbt and am using sbt 0.11.2.
Some good news: This is a known issue and has been resolved.
Some bad news: It's resolved in 2.10 and the fix may not be backported to 2.9.3 (quoting Paul Phillips in the issue thread):
I've tagged this for backporting, which is not a guarantee; I don't have time to do it right now but I expect to in the near future.
Some good news: If you're stuck on pre-2.10 and your Java sources don't depend on your Scala sources, you can just add the following to your build.sbt and all is well:
compileOrder := CompileOrder.JavaThenScala
Some bad news: If you're stuck on pre-2.10 and your Java sources do depend on your Scala sources, I'm pretty sure you're out of luck, and the comment-compile-uncomment trick is probably your best bet.
I'll bet you're facing SI-2764. This has been fixed in Scala 2.10.
In the meantime, create a separate sub-project for your Java annotations, and depend on it from the project containing the Scala code. Then the Scala compiler will only process the .class files, rather than the .java files.
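A minimal sketch of that layout in modern multi-project build.sbt syntax (the project names are hypothetical; on sbt 0.11.x the same shape is written in project/Build.scala instead):

lazy val annotations = project.in(file("annotations")) // Java annotations and enums only

lazy val core = project
  .in(file("core"))
  .dependsOn(annotations) // the Scala code now compiles against pre-built .class files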
Is it possible to compile and execute Scala code given as a string at runtime, either from Scala or from Java?
My idea is to build a DSL using Scala then let Java programmers use the DSL inside Java.
I heard that the class scala.tools.nsc.Interpreter can do something like that, but when I imported it inside my Scala file, I got "object tools is not a member of package scala."
So could anybody give me a hint?
In 2.10.0 we exposed the Scala reflection API, which among other things includes a runtime compilation facility. More details can be found here: Generating a class from string and instantiating it in Scala 2.10.
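A minimal sketch of that facility via a ToolBox (requires scala-compiler.jar on the classpath):

import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

val toolbox = currentMirror.mkToolBox()
// parse the source string into a tree, then compile and evaluate it
val tree   = toolbox.parse("""println("compiled at runtime"); 21 * 2""")
val result = toolbox.eval(tree) // result: Any = 42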
I recommend twitter-util's Eval.
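A sketch of what that looks like, assuming the util-eval artifact ("com.twitter" %% "util-eval") is on the classpath; note that the module has since been deprecated in newer twitter-util releases:

import com.twitter.util.Eval

val eval = new Eval              // compiles snippets in-process
val answer = eval[Int]("21 * 2") // answer == 42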
For Scala 3 this can now be achieved with Dotty:
https://index.scala-lang.org/lampepfl/dotty
https://github.com/lampepfl/dotty
https://dotty.epfl.ch/docs/internals/overall-structure.html
The sbt dependency is e.g. "org.scala-lang" %% "scala3-compiler" % "3.1.3"
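A rough sketch of driving that dependency directly, not an official recipe: write the string to a file, invoke the compiler's entry point, and load the result (the flags, names, and class-loading details here are assumptions):

import java.nio.file.Files

val code = "object Snippet { def run(): Int = 21 * 2 }"
val src  = Files.createTempFile("Snippet", ".scala")
Files.write(src, code.getBytes("UTF-8"))

val out = Files.createTempDirectory("snippet-out")
val reporter = dotty.tools.dotc.Main.process(
  Array("-usejavacp", "-d", out.toString, src.toString))
if (reporter.hasErrors) sys.error("runtime compilation failed")

// load the freshly compiled class from the output directory
val loader = new java.net.URLClassLoader(Array(out.toUri.toURL), getClass.getClassLoader)
val cls = loader.loadClass("Snippet")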