SBT [1.1.1] Different libraryDependencies for different Scala versions

I have tried the solution from SBT cross building - choosing a different library version for different scala version; however, this results in:
build.sbt:27: error: No implicit for Append.Value[Seq[sbt.librarymanagement.ModuleID], sbt.Def.Initialize[sbt.librarymanagement.ModuleID]] found,
so sbt.Def.Initialize[sbt.librarymanagement.ModuleID] cannot be appended to Seq[sbt.librarymanagement.ModuleID]
libraryDependencies += scalaVersion(jsonDependency(_)),
^
[error] sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
What is the correct way of forcing library dependencies for different Scala versions in sbt 1.1.1?
build.sbt:
libraryDependencies += scalaVersion(jsonDependency(_))
def jsonDependency(scalaVersion: String) = scalaVersion match {
  case "2.11.7" => "com.typesafe.play" %% "play-json" % "2.4.2"
  case "2.12.4" => "com.typesafe.play" %% "play-json" % "2.6.9"
}

The first line should be:
libraryDependencies += jsonDependency(scalaVersion.value)
As for the rest, it's unnecessarily sensitive to exact Scala version numbers. Consider using CrossVersion.partialVersion to be sensitive to the Scala major version only, as follows:
def jsonDependency(scalaVersion: String) =
  "com.typesafe.play" %% "play-json" %
    (CrossVersion.partialVersion(scalaVersion) match {
      case Some((2, 11)) => "2.4.2"
      case _ => "2.6.9"
    })
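Equivalently, the match can live directly in the setting body, since scalaVersion.value is available there. A sketch of the same fix without the helper method (version numbers taken from the question):
libraryDependencies += {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => "com.typesafe.play" %% "play-json" % "2.4.2"
    case _ => "com.typesafe.play" %% "play-json" % "2.6.9"
  }
}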

Related

AssertionError on Scala 2.13.10 project when not using any assert statement

I'm getting the following error when running sbt clean assembly on a Scala 2.13.10 project using JVM 1.8 and IntelliJ IDEA 2022.3.1 (Community Edition). I don't think it's caused by any assert in my code.
java.lang.AssertionError: assertion failed:
List(method apply$mcI$sp, method apply$mcI$sp)
while compiling: D:\repo\git\external\project9\src\main\scala\es\package\graph\KNNGraphBuilder.scala
during phase: globalPhase=specialize, enteringPhase=explicitouter
library version: version 2.13.10
compiler version: version 2.13.10
reconstructed args:
last tree to typer: Ident(x1)
tree position: line 270 of D:\repo\git\external\project9\src\main\scala\es\package\graph\KNNGraphBuilder.scala
tree tpe: x1.type
symbol: case value x1
symbol definition: case val x1: (Int, es.package.graph.NeighborsForElement) (a TermSymbol)
symbol package: es.package.graph
symbol owners: value x1 -> method $anonfun$addElements -> method addElements -> class GroupedNeighborsForElement
call site: method neighborsWithComparisonCountOfGroup in class GroupedNeighborsForElementWithComparisonCount in package graph
== Source file context for tree position ==
267
268 def addElements(n:GroupedNeighborsForElement):Unit=
269 {
270 for ((k,v) <- n.neighbors)
271 addElementsOfGroup(k,v)
272 }
273
n.neighbors is returning Map[Int,NeighborsForElement] here. Other relevant code can be found below:
def addElementsOfGroup(groupId: Int, n: NeighborsForElement): Unit =
{
  getOrCreateGroup(groupId).addElements(n)
}
private def getOrCreateGroup(groupId: Int): NeighborsForElement =
{
  val g = neighbors.get(groupId)
  if (g.isDefined) return g.get
  val newGroup = new NeighborsForElement(numNeighbors)
  neighbors(groupId) = newGroup
  return newGroup
}
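(As an aside, unrelated to the crash: getOrCreateGroup can be collapsed onto the mutable Map API. A sketch, assuming neighbors is a scala.collection.mutable.Map:)
private def getOrCreateGroup(groupId: Int): NeighborsForElement =
  neighbors.getOrElseUpdate(groupId, new NeighborsForElement(numNeighbors))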
This is the build.sbt file:
name := "project9"
version := "0.1"
organization := "es.package"
scalaVersion := "2.13.10"
val sparkVersion = "3.3.1"
resolvers ++= Seq(
  "apache-snapshots" at "https://repository.apache.org/snapshots/"
)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "org.scalatest" %% "scalatest" % "3.2.0"
)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
Any ideas what's wrong with this code? Thanks
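This looks like a scalac crash in the specialize phase rather than a bug in the user code. Assuming the trigger is the specialized pattern-matching function that for ((k,v) <- n.neighbors) desugars to (an assumption; a newer 2.13 patch release may also make it go away), a rewrite of addElements that avoids the destructuring lambda is worth trying:
// Hypothetical workaround: sidestep the `case (k, v)` destructuring
// that the for-comprehension expands into.
def addElements(n: GroupedNeighborsForElement): Unit =
  n.neighbors.foreach(kv => addElementsOfGroup(kv._1, kv._2))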

How to optionally add a compiler plugin in SBT

I have a project which depends on https://github.com/typelevel/kind-projector and is currently cross-compiled against scala 2.12 and 2.13, and I want to add support for scala 3.0. However, kind-projector is not available on scala 3.0 since the syntax it enables is part of the native scala 3 syntax.
Before, I used this setting to add the compiler plugin:
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.11.3" cross CrossVersion.full)
Now, I'm trying to disable that setting if the scalaVersion is 3.0.0.
The closest I've got is
Def.setting {
  scalaVersion.value match {
    case "3.0.0" => new Def.SettingList(Nil)
    case _ => Seq(
      addCompilerPlugin("org.typelevel" % "kind-projector" % "0.11.3" cross CrossVersion.full)
    )
  }
}
but the types don't work out (that's an Initialize but it needs to be a Setting).
How can I conditionally disable the compiler plugin based on the scala version?
addCompilerPlugin is a shortcut for libraryDependencies += compilerPlugin(dependency)
Thus, it should work with something like this:
libraryDependencies ++= {
  scalaVersion.value match {
    case "3.0.0" =>
      Nil
    case _ =>
      List(compilerPlugin("org.typelevel" % "kind-projector" % "0.11.3" cross CrossVersion.full))
  }
}
There might be a better way to do it though.
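One refinement: matching the literal "3.0.0" misses 3.0.1 and every later Scala 3 release. A sketch that keys off the major version instead, using sbt's CrossVersion.partialVersion:
libraryDependencies ++= {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((3, _)) =>
      Nil
    case _ =>
      List(compilerPlugin("org.typelevel" % "kind-projector" % "0.11.3" cross CrossVersion.full))
  }
}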
Original answer which doesn't work because scalaVersion.value is not available in this context:
scalaVersion.value match {
  case "3.0.0" =>
    new Def.SettingList(Nil)
  case _ =>
    addCompilerPlugin("org.typelevel" % "kind-projector" % "0.11.3" cross CrossVersion.full)
}

Conflicting files in uber-jar creation in SBT using sbt-assembly

I am trying to compile and package a fat jar using SBT and I keep running into the following error. I have tried everything from library-dependency excludes to merge strategies.
[trace] Stack trace suppressed: run last *:assembly for the full output.
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /Users/me/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar:META-INF/maven/org.slf4j/slf4j-api/pom.properties
[error] /Users/me/.ivy2/cache/com.twitter/parquet-format/jars/parquet-format-2.2.0-rc1.jar:META-INF/maven/org.slf4j/slf4j-api/pom.properties
[error] Total time: 113 s, completed Jul 10, 2015 1:57:21 AM
The current incarnation of my build.sbt file is below:
import AssemblyKeys._
assemblySettings
name := "ldaApp"
version := "0.1"
scalaVersion := "2.10.4"
mainClass := Some("myApp")
libraryDependencies +="org.scalanlp" %% "breeze" % "0.11.2"
libraryDependencies +="org.scalanlp" %% "breeze-natives" % "0.11.2"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1"
libraryDependencies +="org.ini4j" % "ini4j" % "0.5.4"
jarName in assembly := "myApp"
net.virtualvoid.sbt.graph.Plugin.graphSettings
libraryDependencies += "org.slf4j" %% "slf4j-api"" % "1.7.10" % "provided"
I realize I am doing something wrong...I just have no idea what.
Here is how you can handle these merge issues.
import sbtassembly.Plugin._
lazy val assemblySettings = sbtassembly.Plugin.assemblySettings ++ Seq(
  publishArtifact in packageScala := false, // Remove scala from the uber jar
  mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
    {
      case PathList("META-INF", "CHANGES.txt") => MergeStrategy.first
      // ...
      case PathList(ps @ _*) if ps.last endsWith "pom.properties" => MergeStrategy.first
      case x => old(x)
    }
  }
)
Then add these settings to your project.
lazy val projectToJar = Project(id = "MyApp", base = file(".")).settings(assemblySettings: _*)
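For reference, on current sbt-assembly (1.x) the same merge handling is written with the slash syntax; a sketch, delegating everything else to the previous strategy as the plugin's README does:
assembly / assemblyMergeStrategy := {
  case PathList(ps @ _*) if ps.last.endsWith("pom.properties") => MergeStrategy.first
  case x =>
    // fall back to the previously defined strategy for everything else
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(x)
}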
I got your assembly build running by removing spark from the fat jar (mllib is already included in spark).
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1" % "provided"
Like vitalii said in a comment, this solution was already here. I understand that spending hours on a problem without finding the fix can be frustrating but please be nice.

Why is sbt using incorrect version number for declared dependencies?

I have an sbt build file that uses one plugin and three dependencies:
scalaVersion := "2.10.4"
val reflect = Def.setting { "org.scala-lang" % "scala-reflect" % "2.10.4" }
val compiler = Def.setting { "org.scala-lang" % "scala-compiler" % "2.10.4" }
lazy val macrosSettings = Project.defaultSettings ++ Seq(
  addCompilerPlugin("org.scala-lang.plugins" % "macro-paradise_2.10.4-SNAPSHOT" % "2.0.0-SNAPSHOT"),
  libraryDependencies ++= {
    import Dependencies._
    Seq(play_json, specs2, reflect.value)
  }
)
lazy val Macros = Project(id="IScala-Macros", base=file("macros"), settings=macrosSettings)
However, the compiler gave me the following error when compiling IScala-Macros:
[warn] :: org.scala-lang#scala-compiler;2.10.4-SNAPSHOT: not found
[warn] :: org.scala-lang#scala-library;2.10.4-SNAPSHOT: not found
[warn] :: org.scala-lang#scala-reflect;2.10.4-SNAPSHOT: not found
This seems like a bug, as I don't want them to resolve to 2.10.4-SNAPSHOT but only to 2.10.4. Is this an sbt bug? If not, where does the SNAPSHOT come from?
There are a couple of issues in this build.sbt build definition, so I highly recommend reading the Macro Paradise documentation, where you can find a link to a project that serves as an end-to-end example; in a nutshell, working with macro paradise is as easy as adding a couple of lines to your build (granted you've already set up SBT to use macros).
As for the issues in this build: I don't see a reason to use Def.setting with hard-coded versions for the dependencies reflect and compiler, and moreover I'm unsure about the dependency in addCompilerPlugin; its _2.10.4-SNAPSHOT suffix is most likely what drags the 2.10.4-SNAPSHOT Scala artifacts into resolution. Use the version below, where Def.setting is used to refer to the value of the scalaVersion setting. I still think addCompilerPlugin should follow the sample project above.
import Dependencies._

scalaVersion := "2.10.4"

val reflect = Def.setting {
  "org.scala-lang" % "scala-reflect" % scalaVersion.value
}
val compiler = Def.setting {
  "org.scala-lang" % "scala-compiler" % scalaVersion.value
}

lazy val macrosSettings = Project.defaultSettings ++ Seq(
  addCompilerPlugin("org.scala-lang.plugins" % "macro-paradise_2.10.4-SNAPSHOT" % "2.0.0-SNAPSHOT"),
  libraryDependencies ++= Seq(
    play_json,
    specs2,
    reflect.value
  )
)

lazy val Macros = Project(id="IScala-Macros", base=file("macros"), settings=macrosSettings)
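For reference, macro paradise was later published under org.scalamacros with one artifact per full Scala version, so the suffix never has to be spelled by hand. A sketch (the exact version number here is an assumption, not taken from the question):
// hypothetical coordinates; cross CrossVersion.full picks the artifact
// matching the exact scalaVersion in use
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)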

How do I exclude/include specific packages using xsbt-proguard-plugin?

I'm using xsbt-proguard-plugin, which is an SBT plugin for working with Proguard.
I'm trying to come up with a Proguard configuration for a Hive Deserializer I've written, which has the following dependencies:
// project/Dependencies.scala
val hadoop = "org.apache.hadoop" % "hadoop-core" % V.hadoop
val hive = "org.apache.hive" % "hive-common" % V.hive
val serde = "org.apache.hive" % "hive-serde" % V.hive
val httpClient = "org.apache.httpcomponents" % "httpclient" % V.http
val logging = "commons-logging" % "commons-logging" % V.logging
val specs2 = "org.specs2" %% "specs2" % V.specs2 % "test"
Plus an unmanaged dependency:
// lib/UserAgentUtils-1.6.jar
Because most of these are either for local unit testing or are available within a Hadoop/Hive environment anyway, I want my minified jarfile to only include:
The Java classes SnowPlowEventDeserializer.class and SnowPlowEventStruct.class
org.apache.httpcomponents.httpclient
commons-logging
lib/UserAgentUtils-1.6.jar
But I'm really struggling to get the syntax right. Should I start from a whitelist of classes I want to keep, or explicitly filter out the Hadoop/Hive/Serde/Specs2 libraries? I'm aware of this SO question but it doesn't seem to apply here.
If I initially try the whitelist approach:
// Should be equivalent to sbt> package
import ProguardPlugin._
lazy val proguard = proguardSettings ++ Seq(
  proguardLibraryJars := Nil,
  proguardOptions := Seq(
    "-keepattributes *Annotation*,EnclosingMethod",
    "-dontskipnonpubliclibraryclassmembers",
    "-dontoptimize",
    "-dontshrink",
    "-keep class com.snowplowanalytics.snowplow.hadoop.hive.SnowPlowEventDeserializer",
    "-keep class com.snowplowanalytics.snowplow.hadoop.hive.SnowPlowEventStruct"
  )
)
Then I get a Hadoop processing error, so clearly Proguard is still trying to bundle Hadoop:
proguard: java.lang.IllegalArgumentException: Can't find common super class of [[Lorg/apache/hadoop/fs/FileStatus;] and [[Lorg/apache/hadoop/fs/s3/Block;]
Meanwhile if I try Proguard's filtering syntax to build up the blacklist of libraries I don't want to include:
import ProguardPlugin._
lazy val proguard = proguardSettings ++ Seq(
  proguardLibraryJars := Nil,
  proguardOptions := Seq(
    "-keepattributes *Annotation*,EnclosingMethod",
    "-dontskipnonpubliclibraryclassmembers",
    "-dontoptimize",
    "-dontshrink",
    "-injars !*hadoop*.jar"
  )
)
Then this doesn't seem to work either:
proguard: java.io.IOException: Can't read [/home/dev/snowplow-log-deserializers/!*hadoop*.jar] (No such file or directory)
Any help greatly appreciated!
The whitelist is the proper approach: ProGuard should get a complete context, so it can properly shake out classes, fields, and methods that are not needed.
The error "Can't find common super class" suggests that some library is still missing from the input. ProGuard probably warned about it, but the configuration appears to contain the option -ignorewarnings or -dontwarn (which should be avoided). You should add the library with -injars or -libraryjars.
If ProGuard then includes some classes that you weren't expecting in the output, you can get an explanation with "-whyareyoukeeping class somepackage.SomeUnexpectedClass".
Starting from a working configuration, you can still try to filter out classes or entire jars from the input. Filters are added to items in a class path though, not on their own, e.g. "-injars some.jar(!somepackage/**.class)" -- cf. the manual. This can be useful if the input contains test classes that drag in other unwanted classes.
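To make that concrete in the plugin's terms, a sketch (the library-jar path is a placeholder; proguardLibraryJars is the key already used in the question):
lazy val proguard = proguardSettings ++ Seq(
  // Hadoop goes in as a library jar: ProGuard resolves its classes for
  // analysis but does not copy them into the output.
  proguardLibraryJars := Seq(file("lib/hadoop-core-0.20.2.jar")), // placeholder path
  proguardOptions := Seq(
    "-keepattributes *Annotation*,EnclosingMethod",
    "-dontoptimize",
    "-dontshrink",
    "-keep class com.snowplowanalytics.snowplow.hadoop.hive.SnowPlowEventDeserializer",
    "-keep class com.snowplowanalytics.snowplow.hadoop.hive.SnowPlowEventStruct",
    // ask ProGuard to explain an unexpected class in the output
    "-whyareyoukeeping class somepackage.SomeUnexpectedClass"
  )
)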
In the end, I couldn't get past duplicate class errors using Proguard, let alone figure out how to filter out the relevant jars, so I finally switched to a much cleaner sbt-assembly approach:
1. Added the sbt-assembly plugin to my project as per the README
2. Updated the appropriate project dependencies with a "provided" flag to stop them being added into my fat jar:
val hadoop = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided"
val hive = "org.apache.hive" % "hive-common" % V.hive % "provided"
val serde = "org.apache.hive" % "hive-serde" % V.hive % "provided"
val httpClient = "org.apache.httpcomponents" % "httpclient" % V.http
val httpCore = "org.apache.httpcomponents" % "httpcore" % V.http
val logging = "commons-logging" % "commons-logging" % V.logging % "provided"
val specs2 = "org.specs2" %% "specs2" % V.specs2 % "test"
3. Added an sbt-assembly configuration like so:
import sbtassembly.Plugin._
import AssemblyKeys._
lazy val sbtAssemblySettings = assemblySettings ++ Seq(
  assembleArtifact in packageScala := false,
  jarName in assembly <<= (name, version) { (name, version) => name + "-" + version + ".jar" },
  mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
    {
      case "META-INF/NOTICE.txt" => MergeStrategy.discard
      case "META-INF/LICENSE.txt" => MergeStrategy.discard
      case x => old(x)
    }
  }
)
Then typing assembly produced a "fat jar" with just the packages I needed in it, including the unmanaged dependency and excluding Hadoop/Hive etc.