Auto-Generate Companion Object for Case Class in Scala

I've defined a case class to be used as a schema for a Dataset in Spark.
I want to be able to refer to individual columns from that schema programmatically (vs. hardcoding their string values somewhere).
For example, for the following case class
final case class MySchema(id: Int, name: String, timestamp: Long)
I would like to auto-generate the following object
object MySchema {
val id = "id"
val name = "name"
val timestamp = "timestamp"
}
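With that object in place, call sites could look something like this (a hedged illustration, not from the question; the helper name and the use of Spark's col are my own):
import org.apache.spark.sql.{DataFrame, Dataset}
import org.apache.spark.sql.functions.col

// hypothetical usage: the generated vals keep column references in sync with the schema
def selectIdAndTimestamp(ds: Dataset[MySchema]): DataFrame =
  ds.select(col(MySchema.id), col(MySchema.timestamp))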
The macro approach outlined here appears to be what I want, but it won't compile under Scala 2.12. It gives the following errors, which are completely baffling to me and show up in a total of 2 Google results with 0 fixes.
[error] pattern var qq$macro$2 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$2@_`
[error] case (c@q"$_ class $tpname[..$_] $_(...$params) extends { ..$_ } with ..$_ { $_ => ..$_ }") :: Nil =>
[error] ^
[error] pattern var qq$macro$19 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$19@_`
[error] case (c@q"$_ class $_[..$_] $_(...$params) extends { ..$_ } with ..$_ { $_ => ..$_ }") ::
[error] ^
[error] pattern var qq$macro$27 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$27@_`
[error] q"$mods object $tname extends { ..$earlydefns } with ..$parents { $self => ..$body }" :: Nil =>
[error] ^
Suppressing the warning as outlined won't work because the macro numbers change every time I compile.
It's also worth noting that the similar SO answer here runs into the same compiler errors as shown above.
IntelliJ also complains about several parts of the macro that the compiler doesn't complain about, but that's not really an issue if I can get it to compile.
Is there a way to fix that Macro approach to work in Scala 2.12 or is there a better Scala 2.12 way to do this? (I can't use Scala 2.13 or higher due to compute environment constraints)

Just checked that the macro is still working both in Scala 2.13.10 and 2.12.17.
Most probably, you didn't set up your project for macro annotations.
build.sbt
//ThisBuild / scalaVersion := "2.13.10"
ThisBuild / scalaVersion := "2.12.17"

lazy val macroAnnotationSettings = Seq(
  scalacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, v)) if v >= 13 => Seq("-Ymacro-annotations") // for Scala 2.13
    case _ => Nil
  }),
  libraryDependencies ++= (CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, v)) if v <= 12 => // for Scala 2.12
      Seq(compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full))
    case _ => Nil
  })
)

lazy val core = project
  .settings(
    macroAnnotationSettings,
    scalacOptions ++= Seq(
      "-Ymacro-debug-lite", // optional, convenient to see how macros are expanded
    ),
  )
  .dependsOn(macros) // you must split your project into subprojects because macros must be compiled before core

lazy val macros = project
  .settings(
    macroAnnotationSettings,
    libraryDependencies ++= Seq(
      scalaOrganization.value % "scala-reflect" % scalaVersion.value, // necessary for macros
    ),
  )
project structure:
core
  src
    main
      scala
        Main.scala
macros
  src
    main
      scala
        Macros.scala
Then just do sbt clean compile.
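For reference, here is a minimal sketch of what the macro annotation in Macros.scala could look like (my own illustration under the setup above; the annotation name, the bare-class-only matching, and the error handling are assumptions, and the gist below contains the actual project):
import scala.annotation.{StaticAnnotation, compileTimeOnly}
import scala.language.experimental.macros
import scala.reflect.macros.blackbox

@compileTimeOnly("enable macro paradise (2.12) or -Ymacro-annotations (2.13)")
class columnNames extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro ColumnNamesMacro.impl
}

object ColumnNamesMacro {
  def impl(c: blackbox.Context)(annottees: c.Tree*): c.Tree = {
    import c.universe._
    annottees match {
      // only the case-class-without-companion case is handled here; note the
      // wildcards instead of named pattern vars, which avoids the
      // "pattern var ... is never used" warnings from the question
      case (cls @ q"$_ class $tpname[..$_] $_(...$paramss) extends { ..$_ } with ..$_ { $_ => ..$_ }") :: Nil =>
        val vals = paramss.flatten.map { case q"$_ val $name: $_ = $_" =>
          q"val $name: String = ${name.decodedName.toString}"
        }
        q"""
          $cls
          object ${tpname.toTermName} { ..$vals }
        """
      case _ => c.abort(c.enclosingPosition, "@columnNames must annotate a case class")
    }
  }
}
Annotating the case class with @columnNames should then expand MySchema.id to "id", MySchema.name to "name", and so on.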
The whole project: https://gist.github.com/DmytroMitin/2d9dbd6441ebf167aa127b80fb516afd
sbt documentation:
https://www.scala-sbt.org/1.x/docs/Macro-Projects.html
Scala documentation: https://docs.scala-lang.org/overviews/macros/annotations.html
Examples of build.sbt:
https://github.com/typelevel/simulacrum/blob/master/build.sbt
https://github.com/DmytroMitin/AUXify/blob/master/build.sbt

Related

Scala conditional compilation

I'm writing a Scala program and I want it to work with two version of a big library.
This big library's version 2 changes the API very slightly (only one class constructor signature has an extra parameter).
// Lib v1
class APIClass(a: String, b: Integer) {
  ...
}

// Lib v2
class APIClass(a: String, b: Integer, c: String) {
  ...
}

// And my code extends APIClass... and I have no #IFDEF
class MyClass() extends APIClass("x", 1) { // <-- would be APIClass("x", 1, "y") in library v2
  ...
}
I really don't want to branch my code, because then I'd need to maintain two branches, and tomorrow 3, 4, ... branches for tiny API changes :(
Ideally we'd have a simple preprocessor in Scala, but the idea was rejected long ago by the Scala community.
A thing I couldn't really grasp is: can Scalameta help simulate a preprocessor in this case? I.e. parsing two source files conditionally on, say, an environment variable known at compile time?
If not, how would you approach this real-life problem?
1. The C preprocessor can be used with Java/Scala if you run cpp before javac or scalac (there is also Manifold).
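For illustration, a hypothetical layout (the .scala.in file name and the cpp flags are my own) would keep both branches in a template that cpp resolves before scalac runs:
// MyClass.scala.in, preprocessed with: cpp -P -DLIB_VERSION=2 MyClass.scala.in > MyClass.scala
#if LIB_VERSION == 2
class MyClass extends APIClass("x", 1, "y")
#else
class MyClass extends APIClass("x", 1)
#endif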
2. If you really want to have conditional compilation in Scala, you can use a macro annotation (expanding at compile time):
macros/src/main/scala/extendsAPIClass.scala
import scala.annotation.{StaticAnnotation, compileTimeOnly}
import scala.language.experimental.macros
import scala.reflect.macros.blackbox

@compileTimeOnly("enable macro paradise")
class extendsAPIClass extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro ExtendsAPIClassMacro.impl
}

object ExtendsAPIClassMacro {
  def impl(c: blackbox.Context)(annottees: c.Tree*): c.Tree = {
    import c.universe._
    annottees match {
      case q"$mods class $tpname[..$tparams] $ctorMods(...$paramss) extends { ..$earlydefns } with ..$parents { $self => ..$stats }" :: tail =>
        def updateParents(parents: Seq[Tree], args: Seq[Tree]) =
          q"""${tq"APIClass"}(..$args)""" +: parents.filter { case tq"scala.AnyRef" => false; case _ => true }

        val parents1 = sys.env.get("LIB_VERSION") match {
          case Some("1") => updateParents(parents, Seq(q""" "x" """, q"1"))
          case Some("2") => updateParents(parents, Seq(q""" "x" """, q"1", q""" "y" """))
          case None => parents
        }

        q"""
          $mods class $tpname[..$tparams] $ctorMods(...$paramss) extends { ..$earlydefns } with ..$parents1 { $self => ..$stats }
          ..$tail
        """
    }
  }
}
core/src/main/scala/MyClass.scala (if LIB_VERSION=2)
@extendsAPIClass
class MyClass

//Warning:scalac: {
//  class MyClass extends APIClass("x", 1, "y") {
//    def <init>() = {
//      super.<init>();
//      ()
//    }
//  };
//  ()
//}
build.sbt
ThisBuild / name := "macrosdemo"
lazy val commonSettings = Seq(
scalaVersion := "2.13.2",
organization := "com.example",
version := "1.0.0",
scalacOptions ++= Seq(
"-Ymacro-debug-lite",
"-Ymacro-annotations",
),
)
lazy val macros: Project = (project in file("macros")).settings(
commonSettings,
libraryDependencies ++= Seq(
scalaOrganization.value % "scala-reflect" % scalaVersion.value,
)
)
lazy val core: Project = (project in file("core")).aggregate(macros).dependsOn(macros).settings(
commonSettings,
)
)
3. Alternatively, you can use Scalameta for code generation (at build time, before compilation):
build.sbt
ThisBuild / name := "scalametacodegendemo"
lazy val commonSettings = Seq(
scalaVersion := "2.13.2",
organization := "com.example",
version := "1.0.0",
)
lazy val common = project
.settings(
commonSettings,
)
lazy val in = project
.dependsOn(common)
.settings(
commonSettings,
)
lazy val out = project
.dependsOn(common)
.settings(
sourceGenerators in Compile += Def.task {
Generator.gen(
inputDir = sourceDirectory.in(in, Compile).value,
outputDir = sourceManaged.in(Compile).value
)
}.taskValue,
commonSettings,
)
project/build.sbt
libraryDependencies += "org.scalameta" %% "scalameta" % "4.3.10"
project/Generator.scala
import sbt._

object Generator {
  def gen(inputDir: File, outputDir: File): Seq[File] = {
    val finder: PathFinder = inputDir ** "*.scala"
    for (inputFile <- finder.get) yield {
      val inputStr = IO.read(inputFile)
      val outputFile = outputDir / inputFile.toURI.toString.stripPrefix(inputDir.toURI.toString)
      val outputStr = Transformer.transform(inputStr)
      IO.write(outputFile, outputStr)
      outputFile
    }
  }
}
project/Transformer.scala
import scala.meta._

object Transformer {
  def transform(input: String): String = {
    val (v1on, v2on) = sys.env.get("LIB_VERSION") match {
      case Some("1") => (true, false)
      case Some("2") => (false, true)
      case None      => (false, false)
    }
    var v1 = false
    var v2 = false
    input.tokenize.get.filter(_.text match {
      case "// Lib v1" =>
        v1 = true
        false
      case "// End Lib v1" =>
        v1 = false
        false
      case "// Lib v2" =>
        v2 = true
        false
      case "// End Lib v2" =>
        v2 = false
        false
      case _ => (v1on && v1) || (v2on && v2) || (!v1 && !v2)
    }).mkString("")
  }
}
common/src/main/scala/com/api/APIClass.scala
package com.api
class APIClass(a: String, b: Integer, c: String)
in/src/main/scala/com/example/MyClass.scala
package com.example
import com.api.APIClass
// Lib v1
class MyClass extends APIClass("x", 1)
// End Lib v1
// Lib v2
class MyClass extends APIClass("x", 1, "y")
// End Lib v2
out/target/scala-2.13/src_managed/main/scala/com/example/MyClass.scala
(after sbt out/compile if LIB_VERSION=2)
package com.example
import com.api.APIClass
class MyClass extends APIClass("x", 1, "y")
See also: Macro annotation to override toString of Scala function, How to merge multiple imports in scala?
I see some options, but none of them is "conditional compilation":
you can create 2 modules in your build - they would have a shared source directory, and each of them would have a source directory for code specific to it. Then you would publish 2 versions of your whole library (see the sketch after this list)
create 3 modules - one with your library and an abstract class/trait that it would talk to/through, and 2 others with version-specific implementations of the trait
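A hedged build.sbt sketch of the first option (the module and directory names are my own assumptions):
// two publishable modules, each compiling a shared source directory
// plus its own version-specific sources
lazy val sharedSources = Seq(
  Compile / unmanagedSourceDirectories += (ThisBuild / baseDirectory).value / "shared" / "src" / "main" / "scala"
)

lazy val libV1 = (project in file("lib-v1")).settings(sharedSources)
lazy val libV2 = (project in file("lib-v2")).settings(sharedSources)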
The problem is: what if you build the code against v1 and the user provides v2? Or the opposite? You emitted the bytecode, but the JVM expects something else and it all crashes.
Virtually every time you have such compatibility-breaking changes, a library either refuses to update or forks. Not because you wouldn't be able to generate 2 versions - you would. The problem is downstream - how would your users deal with this situation? If you are writing an application, you can commit to one of these. If you are writing a library and you don't want to lock users into your choices... you have to publish a separate version for each choice.
Theoretically you could create one project with 2 modules, which share the same code and use different branches like #ifdef macros in C++, using Scala macros or Scalameta - but that is a disaster if you want to use an IDE or publish source code that your users can read in an IDE. No source to look at. No way to jump to the definition's source. Disassembled bytecode at best.
So the solution where you simply have separate source directories for mismatching versions is much easier to read, write, and maintain in the long run.

How To Execute Basic Json feeder using Scala Jackson Library in Idea IntelliJ Editor

All I need is to execute basic Jackson library code using Scala in the IntelliJ IDEA editor.
I already have Scala installed:
C:\Users\tt>scala -version
Scala code runner version 2.12.7 -- Copyright 2002-2018, LAMP/EPFL and Lightbend, Inc.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
import scala.collection.mutable

// read the whole file as a String so it can be fed to readValue
val input = scala.io.Source.fromFile("data.json").mkString
val mapper = new ObjectMapper() with ScalaObjectMapper
mapper.registerModule(DefaultScalaModule)
val obj = mapper.readValue[Map[String, Any]](input)

val data_collection = mutable.HashMap.empty[Int, String]
data_collection.put(
  obj.get("id").fold(0)(_.toString.toInt),
  obj.get("text").fold("")(_.toString)
)
println(data_collection) // Map(1 -> Hello How are you)
I expect the IntelliJ editor to automatically suggest why ScalaObjectMapper and DefaultScalaModule show as "Cannot resolve symbol" despite using the correct imports.
I am getting errors like the ones below:
Error:(4, 1) expected class or object definition
name := "jackson-module-scala"
Error:(6, 1) expected class or object definition
organization := "com.fasterxml.jackson.module"
Error:(8, 1) expected class or object definition
scalaVersion := "2.12.8"
Error:(10, 1) expected class or object definition
crossScalaVersions := Seq("2.10.7", "2.11.12", "2.12.8", "2.13.0-M5")
Error:(12, 1) expected class or object definition
val scalaMajorVersion = SettingKey[Int]("scalaMajorVersion")
Error:(13, 1) expected class or object definition
scalaMajorVersion := {
Error:(20, 1) expected class or object definition
scalacOptions ++= Seq("-deprecation", "-unchecked", "-feature")
Error:(27, 6) classes cannot be lazy
lazy val java7Home =
Error:(33, 1) expected class or object definition
javacOptions ++= {
Error:(41, 1) expected class or object definition
scalacOptions ++= {
Error:(45, 1) expected class or object definition
unmanagedSourceDirectories in Compile += {
Error:(49, 1) expected class or object definition
val jacksonVersion = "2.9.8"
Error:(51, 1) expected class or object definition
libraryDependencies ++= Seq(
Error:(65, 1) expected class or object definition
resourceGenerators in Compile += Def.task {
Error:(73, 1) expected class or object definition
site.settings
Error:(75, 1) expected class or object definition
site.includeScaladoc()
Error:(77, 1) expected class or object definition
ghpages.settings
Error:(79, 1) expected class or object definition
git.remoteRepo := "git@github.com:FasterXML/jackson-module-scala.git"

Codec for ADT do not compile

I'm using the Scala driver to do IO operations with MongoDB. My Scala version is 2.11.11 and the MongoDB driver version is 2.2.0.
I took the example from the documentation about ADTs:
sealed class Tree
case class Branch(b1: Tree, b2: Tree, value: Int) extends Tree
case class Leaf(value: Int) extends Tree
val codecRegistry = fromRegistries( fromProviders(classOf[Tree]), DEFAULT_CODEC_REGISTRY )
This code doesn't compile, failing with:
No known subclasses of the sealed class
[error] val codecRegistry = fromRegistries( fromProviders(classOf[Tree]), DEFAULT_CODEC_REGISTRY )
[error] ^
[error] knownDirectSubclasses of Tree observed before subclass Branch registered
[error] knownDirectSubclasses of Tree observed before subclass Leaf registered
Did I miss something?
Update
Below is a complete example of what I'm trying to do.
build.sbt
name := "mongodb-driver-test"
version := "1.0"
scalaVersion := "2.11.11"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.2.0"
file Models.scala
import org.mongodb.scala.bson.codecs.{DEFAULT_CODEC_REGISTRY, Macros}
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}

/**
 * Created by alifirat on 02/01/18.
 */
object Models {
  sealed class Tree
  case class Branch(b1: Tree, b2: Tree, value: Int) extends Tree
  case class Leaf(value: Int) extends Tree

  val treeCodec = Macros.createCodecProvider[Tree]()
  val treeCodecRegistry = fromRegistries(fromProviders(treeCodec), DEFAULT_CODEC_REGISTRY)
}
Then, just run:
sbt compile
You will get:
[error] val treeCodec = Macros.createCodecProvider[Tree]()
[error] ^
[error] knownDirectSubclasses of Tree observed before subclass Branch registered
[error] knownDirectSubclasses of Tree observed before subclass Leaf registered
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
If I change the Scala version to 2.12.0, I don't get any errors at compile time...
I'm using driver version 2.6.0 and Scala version 2.12.8 and still get the same problem.
My workaround is to remove the keyword sealed in front of that sealed class, compile, put it back, and then compile again. But it's very cumbersome.
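This is the long-standing knownDirectSubclasses ordering issue (SI-7046). One workaround that is sometimes suggested, shown here as a hedged sketch rather than a guaranteed fix, is to derive a provider per concrete case class so the macro never has to enumerate the subclasses of the sealed parent:
import org.mongodb.scala.bson.codecs.{DEFAULT_CODEC_REGISTRY, Macros}
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}

object Codecs {
  sealed class Tree
  case class Branch(b1: Tree, b2: Tree, value: Int) extends Tree
  case class Leaf(value: Int) extends Tree

  // providers for the concrete classes only; neither derivation asks for
  // knownDirectSubclasses of Tree, so the ordering problem is avoided
  val branchCodec = Macros.createCodecProvider[Branch]()
  val leafCodec = Macros.createCodecProvider[Leaf]()

  // note: without a provider for Tree itself, fields statically typed as Tree
  // may still need extra handling at runtime
  val registry = fromRegistries(fromProviders(branchCodec, leafCodec), DEFAULT_CODEC_REGISTRY)
}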

Using unboundid ldap in scala ... strange compile error

I am trying to use LDAP via unboundid in scala but the compiler keeps crashing.
I just created an object that looks like this:
package utils

import com.unboundid.ldap.sdk._

object LdapHelper {
  val ldap = LDAPConnection("ldap.example.com", 389)
}
I added this: "com.unboundid" % "unboundid-ldapsdk" % "2.3.1" to my appDependencies in Build.scala. I use Play 2.1 and Scala version 2.10.1.
I get a very strange error message (see below). The error message is so strange that I really don't know where to begin to look for hints.
I am not sure if the problem is in unboundid, Play, Scala, or sbt.
How can I successfully integrate unboundid in my Scala project?
Thanks
Error in Scala compiler: assertion failed: while compiling: C:\play\todolist\app\utils\LdapHelper.scala during phase: global=typer, atPhase=parser library version: version 2.10.2 compiler version: version 2.10.2 reconstructed args: -classpath C:\play\todolist.target;C:\eclipse\scala-SDK-3.0.1-vfinal-2.10-win32.win32.x86_64\configuration\org.eclipse.
...
last tree to typer: Ident(LDAPConnection)
symbol: (flags: )
symbol definition:
symbol owners:
context owners: value ldap -> object LdapHelper -> package utils
== Enclosing template or block ==
Template( // val : in object LdapHelper
  "java.lang.Object" // parents
  ValDef(
    private
    "_"
  )
  // 3 statements
  DefDef( // def <init>: in object LdapHelper
    "<init>"
    []
    List(Nil)
    Block(
      Apply(
        super."<init>"
        Nil
      )
      ()
    )
  )
  DefDef( // def x: in object LdapHelper
    "x"
    []
    Nil
    ()
  )
  ValDef( // private[this] val ldap: in object LdapHelper
    private
    "ldap"
    Apply(
      "LDAPConnection"
      // 2 arguments
      "ldap.example.com"
      389
    )
  )
)
There was a warning that turned into an assert in Scala 2.10.2 causing this.
There is a bug open here:
https://issues.scala-lang.org/browse/SI-7014
And a fix staged for 2.10.4:
https://github.com/scala/scala/pull/2829
You can ask Play to use Scala 2.10.4-SNAPSHOT by using the following Build.scala:
import sbt._
import Keys._
import play.Project._

object ApplicationBuild extends Build {
  val appName = "AppName"
  val appVersion = "1.0-SNAPSHOT"

  val mainDeps = Seq(
    jdbc,
    anorm,
    cache
  )

  lazy val main = play.Project(appName, appVersion, mainDeps).settings(
    scalaVersion := "2.10.4-SNAPSHOT"
  )
}
If you are using build.sbt the file would look like:
import play.Project._
playScalaSettings
name := "AppName"
version := "1.0-SNAPSHOT"
scalaVersion := "2.10.4-SNAPSHOT"
libraryDependencies ++= Seq(jdbc, anorm, cache)
Note: if building from sbt (instead of play) you may have to add a repository resolver under the scalaVersion line such as:
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/repo/"
The answer from @jeckhart works.
First I used Scala 2.10.4-RC1 to build the Play 2.3 SNAPSHOT. Then I used the output to compile with UnboundID.
Finally everything compiles with no assertion or error.
To build Play 2.3 SNAPSHOT using Scala 2.10.4-RC1, I modified the file framework/project/Build.scala.
I changed these two settings from
val buildScalaVersion = propOr("scala.version", "2.10.3")
val buildScalaVersionForSbt = propOr("play.sbt.scala.version", "2.10.3")
to
val buildScalaVersion = propOr("scala.version", "2.10.4-RC1")
val buildScalaVersionForSbt = propOr("play.sbt.scala.version", "2.10.4-RC1")

Writing a custom matcher for NodeSeq

I'm trying to write a simple custom matcher for NodeSeq with ScalaTest v2.0.M5b.
package test

import org.scalatest.matchers.{MatchResult, Matcher, ShouldMatchers}
import scala.xml.NodeSeq
import org.scalatest.FunSpec

class MySpec extends FunSpec with ShouldMatchers with MyMatcher {
  describe("where is wrong?") {
    it("showOK") {
      val xml = <span>abc</span>
      xml should contn("b")
    }
  }
}

trait MyMatcher {
  class XmlMatcher(str: String) extends Matcher[NodeSeq] {
    def apply(xml: NodeSeq) = {
      val x = xml.toString.contains(str)
      MatchResult(
        x,
        "aaa",
        "bbb"
      )
    }
  }

  def contn(str: String) = new XmlMatcher(str)
}
When I compile it, it reports an error:
[error] /Users/freewind/src/test/scala/test/MyMacher.scala:14: overloaded method value should with alternatives:
[error] (beWord: MySpec.this.BeWord)MySpec.this.ResultOfBeWordForAnyRef[scala.collection.GenSeq[scala.xml.Node]] <and>
[error] (notWord: MySpec.this.NotWord)MySpec.this.ResultOfNotWordForAnyRef[scala.collection.GenSeq[scala.xml.Node]] <and>
[error] (haveWord: MySpec.this.HaveWord)MySpec.this.ResultOfHaveWordForSeq[scala.xml.Node] <and>
[error] (rightMatcher: org.scalatest.matchers.Matcher[scala.collection.GenSeq[scala.xml.Node]])Unit
[error] cannot be applied to (MySpec.this.XmlMatcher)
[error] xml should contn("b")
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
What is wrong?
Update:
The build.sbt file I use:
name := "scalatest-test"
scalaVersion := "2.10.1"
version := "1.0"
resolvers ++= Seq(
  "snapshots" at "http://oss.sonatype.org/content/repositories/snapshots",
  "releases" at "http://oss.sonatype.org/content/repositories/releases",
  "googlecode" at "http://sass-java.googlecode.com/svn/repo"
)
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0.M5b" % "test"
And a demo project: https://github.com/freewind/scalatest-test
For the reason why the Scala compiler complains, see this answer.
But it seems that the ScalaTest API has changed quite a bit since then, so the two proposed solutions both need some modification (tested with ScalaTest 2.0.M5b):
Replace all instances of NodeSeq with GenSeq[Node] so that the types match everywhere.
See SeqShouldWrapper class of ScalaTest.
Alternatively, wrap xml explicitly with the conversion function, i.e. manually set the required type, but I don't recommend this because it makes client code ugly:
new AnyRefShouldWrapper(xml).should(contn("b"))
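Putting the first fix together, the matcher from the question would become (a sketch: only the types change, and the message strings stay as placeholders):
import org.scalatest.matchers.{MatchResult, Matcher}
import scala.collection.GenSeq
import scala.xml.Node

trait MyMatcher {
  // Matcher[GenSeq[Node]] is the type the should overloads in the error expect
  class XmlMatcher(str: String) extends Matcher[GenSeq[Node]] {
    def apply(xml: GenSeq[Node]) = {
      val x = xml.toString.contains(str)
      MatchResult(x, "aaa", "bbb")
    }
  }

  def contn(str: String) = new XmlMatcher(str)
}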
BTW, it is good to have a small but complete project on GitHub for others to tweak. It makes this question much more attractive.