Currently my build fails because the mergeStrategy isn't correct.
How can I fix this?
object MyAppBuild extends Build {
  import Resolvers._
  import Dependency._
  import BuildSettings._

  lazy val myApp = Project(
    id = "myApp",
    base = file("."),
    settings = buildSettings ++ Seq(
      resolvers := allResolvers,
      exportJars := true,
      libraryDependencies ++= Dependencies.catalogParserDependencies,
      parallelExecution in Test := false,
      //mergeStrategy in assembly := {
      //  ....
      //}
    )
  )
}
If I had my settings in the build.sbt file it works like this:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
I want to move this logic to my Build.scala file now.
Please migrate to build.sbt style. http://www.scala-sbt.org/0.13/docs/Basic-Def.html
lazy val myApp = Project(
  id = "myApp",
  base = file("."),
  settings = buildSettings ++ ... // this is likely the problem
The *.scala style has been discouraged in the docs, and sbt 0.13.13 officially deprecates it. One of the reasons is that Project(...)'s settings parameter is not compatible with the auto-plugin initialization order. If you migrate to the build.sbt style, the problem should be resolved.
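As a sketch, the project above could be expressed in build.sbt style roughly like this (assuming the Resolvers/Dependencies/BuildSettings helper objects are moved under project/ so they remain visible, and that the sbt-assembly plugin is enabled):

```scala
// build.sbt -- a rough sketch of the migrated definition; allResolvers,
// Dependencies.catalogParserDependencies and buildSettings are assumed to
// come from helper objects defined under project/.
lazy val myApp = (project in file("."))
  .settings(buildSettings: _*)
  .settings(
    resolvers := allResolvers,
    exportJars := true,
    libraryDependencies ++= Dependencies.catalogParserDependencies,
    parallelExecution in Test := false,
    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case x => MergeStrategy.first
    }
  )
```

In build.sbt the assembly keys are auto-imported by the plugin, so the merge strategy can live right next to the other settings.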
Related
I have a Scala project that contains multiple main methods. I want to generate a fat jar so that I can run the code for one of those main methods.
build.sbt
lazy val commonSettings = Seq(
  version := "0.1-SNAPSHOT",
  organization := "my.company",
  scalaVersion := "2.11.12",
  test in assembly := {}
)

lazy val app = (project in file("app"))
  .settings(commonSettings: _*)
  .settings(
    mainClass in assembly := Some("my.test.Category")
  )

assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}
plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")
With this, the manifest file is generated successfully in my resource folder.
Next I run sbt assembly and an executable jar is generated successfully.
When I run java -jar category-assembly-0.1.jar I get the following error:
no main manifest attribute, in category-assembly-0.1.jar
I have tried many of the steps suggested on the internet, but I keep getting this error.
UPDATE
Currently, the following is included in my build.sbt:
lazy val spark: Project = project
  .in(file("./spark"))
  .settings(
    mainClass in assembly := Some("my.test.Category"),
    mainClass in (Compile, packageBin) := Some("my.test.Category"),
    mainClass in (Compile, run) := Some("my.test.Category")
  )

assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}
After building the artifacts, running sbt assembly, and trying the generated jar, I am still getting the same error:
no main manifest attribute, in category-assembly-0.1.jar
I had the same issue when I was deploying on EMR.
Add the following to your settings in build.sbt and it will be alright.
lazy val spark: Project = project
.in(file("./spark"))
.settings(
...
// set the main class for packaging the main jar
mainClass in (Compile, packageBin) := Some("com.orgname.name.ClassName"),
mainClass in (Compile, run) := Some("com.orgname.name.ClassName"),
...
)
This essentially sets the named class as your project's default mainClass. I set it for both packageBin and run, so you should be alright.
Just don't forget to replace com.orgname.name.ClassName with your own class name.
(This is just to refresh your memory.)
A fully qualified class name consists of <PackageName>.<ClassName>.
For example:
package com.orgname.name
object ClassName {}
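Putting the pieces together for the spark subproject above, a minimal sketch might look like the following (my.test.Category is the assumed main object; note that mainClass in assembly is the setting sbt-assembly writes into the fat jar's manifest):

```scala
// Sketch only: assumes sbt-assembly 0.15.x and a main object my.test.Category.
lazy val spark: Project = project
  .in(file("spark"))
  .settings(
    // Written into the fat jar's MANIFEST.MF by sbt-assembly:
    mainClass in assembly := Some("my.test.Category"),
    // Also set for the regular packaged jar and for sbt run:
    mainClass in (Compile, packageBin) := Some("my.test.Category"),
    mainClass in (Compile, run) := Some("my.test.Category"),
    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case _ => MergeStrategy.first
    }
  )
```

One thing to watch: settings scoped to the subproject only apply when assembly is run on that subproject, e.g. sbt spark/assembly, rather than a bare sbt assembly at the root.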
I have a project that consists of multiple smaller projects, some with dependencies upon each other; for example, there is a utilities project that depends upon a commons project.
Other projects may or may not depend upon utilities or commons or neither of them.
In the build.sbt I have the assembly merge strategy at the end of the file, along with the tests in assembly being {}.
My question is: is this correct? Should each project have its own merge strategy, and if so, will the projects that depend on it inherit that strategy? Having the merge strategy repeated in every project definition seems clunky and would mean a lot of duplicated code.
The same question applies to the tests: should each project have the line controlling whether tests are run during assembly, or will that also be inherited?
Thanks in advance. If anyone knows of a link to a sensible (relatively complex) example that'd also be great.
In my day job I currently work on a large multi-project. Unfortunately it's closed source, so I can't share specifics, but I can share some guidance.
Create a rootSettings used only by the root/container project, since it usually isn't part of an assembly or publish step. It would contain something like:
lazy val rootSettings = Seq(
  publishArtifact := false,
  publishArtifact in Test := false
)
Create a commonSettings shared by all the subprojects. Place the base/shared assembly settings here:
lazy val commonSettings = Seq(
  // We use a common directory for all of the artifacts
  assemblyOutputPath in assembly := baseDirectory.value /
    "assembly" / (name.value + "-" + version.value + ".jar"),

  // This is really a one-time, global setting if all projects
  // use the same folder, but should be here if modified for
  // per-project paths.
  cleanFiles <+= baseDirectory { base => base / "assembly" },

  test in assembly := {},

  assemblyMergeStrategy in assembly := {
    case "BUILD" => MergeStrategy.discard
    case "logback.xml" => MergeStrategy.first
    case other: Any => MergeStrategy.defaultMergeStrategy(other)
  },

  assemblyExcludedJars in assembly := {
    val cp = (fullClasspath in assembly).value
    cp filter { _.data.getName.matches(".*finatra-scalap-compiler-deps.*") }
  }
)
Each subproject uses commonSettings, and applies project-specific overrides:
lazy val fubar = project.in(file("fubar-folder-name"))
  .settings(commonSettings: _*)
  .settings(
    // Project-specific settings here.
    assemblyMergeStrategy in assembly := {
      // The fubar-specific strategy
      case "fubar.txt" => MergeStrategy.discard
      case other: Any =>
        // Apply inherited "common" strategy
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(other)
    }
  )
  .dependsOn(
    yourCoreProject,
    // ...
  )
And BTW, if you're using IntelliJ, don't name your root project variable root, as that is what appears as the project name in the recent-projects menu.
lazy val myProjectRoot = project.in(file("."))
  .settings(rootSettings: _*)
  .settings(
    // ...
  )
  .dependsOn(
    // ...
  )
  .aggregate(
    fubar,
    // ...
  )
You may also need to add a custom strategy for combining reference.conf files (for the Typesafe Config library):
val reverseConcat: sbtassembly.MergeStrategy = new sbtassembly.MergeStrategy {
  val name = "reverseConcat"
  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    MergeStrategy.concat(tempDir, path, files.reverse)
}

assemblyMergeStrategy in assembly := {
  case "reference.conf" => reverseConcat
  case other => MergeStrategy.defaultMergeStrategy(other)
}
I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands, each of which compiles only some of the files in this project. The testScalalib command added above, for instance, is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(_ filter (f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations like compile and test, and have the appropriate settings values that would suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands. I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
Here is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or in an older sbt version with slightly different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile, everything gets compiled, but when you call api:compile, only the classes matching the filter predicate are compiled.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter.
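As a sketch of that alternative, the predicate over absolute paths could be replaced with file-name filters inside the same inConfig block (custom2 is a hypothetical name, and the patterns are illustrative only):

```scala
// Sketch: filter unmanaged sources by file-name pattern rather than
// a hand-written predicate. Name and patterns are examples only.
lazy val custom2: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
  unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
  // Keep all .scala files except Typers.scala in this configuration:
  includeFilter in unmanagedSources := "*.scala",
  excludeFilter in unmanagedSources := "Typers.scala"
))
```

Note that name-based filters cannot express the "scalalibrary/" directory check from the original predicate; for path-based exclusion, trimming unmanagedSourceDirectories is the cleaner tool.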
I have the following build.sbt file:
version := "0.0.1"
version in Test := "0.0.1-DEBUG"
name <<= (version) apply { v:String => "demo-%s".format(v) }
and while the version seems to be right in the "test" configuration,
> show test:version
[info] 0.0.1-DEBUG
the name doesn't seem to look at the more-specific setting.
> show name
[info] demo-0.0.1
> show test:name
[info] demo-0.0.1
This is obviously a greatly simplified example of what I'm really trying to do, but I think it illustrates the problem/misunderstanding.
EDIT (2013-07-04): What I'm really trying to do is change javaOptions in the IntegrationTest configuration (because we spin up a service and then run testing code against it, and I'd like the service being tested to run in a slightly sandboxed mode). Setting javaOptions in IntegrationTest is easy enough (and show it:java-options confirms it), but it doesn't actually get used by runner unless I go to the trouble of explicitly defining it:runner to use it:java-options. I would have expected *:runner to prefer the most specific dependent vars.
Here's your Build.scala translated to use inConfig as suggested by @MarkHarrah:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  val mySettings = Seq(
    name <<= version { v => "demo-%s".format(v) }
  )

  lazy val demo = Project(
    id = "demo",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      organization := "com.demo",
      scalaVersion := "2.10.0",
      version := "0.0.1",
      version in Test <<= version { v => "%s-DEBUG".format(v) }
    ) ++ mySettings
      ++ inConfig(Test)(mySettings)
  )
}
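As for the EDIT about javaOptions in IntegrationTest: a common way to make those options take effect without rewiring it:runner is to fork the integration tests, since a forked test JVM picks up javaOptions from its own scope. A sketch in sbt 0.13 build.sbt syntax (the option values are examples, not from the original post):

```scala
// Sketch: sandboxed JVM options applied only to it:test via forking.
lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(
    // A forked test JVM reads javaOptions from its own scope,
    // so these apply only to integration tests:
    fork in IntegrationTest := true,
    javaOptions in IntegrationTest ++= Seq(
      "-Dservice.mode=sandbox", // example flag
      "-Xmx512m"
    )
  )
```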
I tried this in sbt 0.11 and 0.12.1 and it worked:
version := "0.0.1"
version in Test := "0.0.1-DEBUG"
name <<= (version) apply { v:String => "demo-%s".format(v) }
name in Test <<= (version in Test) apply { v:String => "demo-%s".format(v) }
UPDATE
If you're using a Build.scala file you can generalize this task across projects. Here's an example:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val demo = Project(
    id = "demo",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      organization := "com.demo",
      scalaVersion := "2.10.0"
    ) ++ addNameAndVersion("0.0.1", "demo")
  )

  def addNameAndVersion(projectVersion: String, projectName: String): Seq[sbt.Project.Setting[_]] = {
    Seq(
      version := projectVersion,
      version in Test := projectVersion + "-DEBUG",
      name <<= version.apply(s => "%s-%s".format(projectName, s)),
      name in Test <<= (version in Test).apply(s => "%s-%s".format(projectName, s))
    )
  }
}
I'm developing an sbt-launched application with a custom command-line interface.
The problem is that every time I want to test it, I have to remove the previously published boot directory, recompile and publish the artifacts locally, and then finally run the app and test it manually. Part of this is accomplished by running external shell scripts.
How can I make sbt do the job for me? I've already made a skeleton command for it:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked"),
    commands ++= Seq(launchApp))
)
val launchApp = Command.command("launch") { state =>
  state.log.info("Re-launching app")
  state
}
Create a launcher configuration file, e.g. fqb.build.properties, in the project's main directory.
Create a script that launches the application
#!/usr/bin/env bash
java -jar /path/to/sbt-launch.jar "$@"
Define task and command:
lazy val launcherTask = TaskKey[Unit]("launch", "Starts the application from the locally published JAR")

lazy val launchApp: Seq[Setting[_]] = Seq(
  commands += Command.command("publish-launch") { state =>
    state.log.info("Re-launching app")
    val modulesProj = modules.id
    s"$modulesProj/publishLocal" ::
      "publishLocal" ::
      launcherTask.key.label ::
      state
  },
  launcherTask := {
    "launch @fqb.build.properties" !<
  }
)
Add it as a setting to a project:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked")
  ) ++ launchApp
)
Remember to delete the old ~/.<app_name> directory when re-deploying, so the changes can take effect.
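That cleanup step could itself become a small sbt task, so it runs as part of the same workflow (a sketch; the .myapp directory name is an assumed example, substitute your app's boot directory):

```scala
// Sketch: remove the launcher's per-user boot directory before re-deploying.
// ".myapp" below is an assumed example name.
lazy val cleanBoot = taskKey[Unit]("Deletes the app's local boot directory")

cleanBoot := {
  val bootDir = new java.io.File(sys.props("user.home"), ".myapp")
  sbt.IO.delete(bootDir) // recursive delete; a no-op if the directory is absent
  streams.value.log.info(s"Deleted $bootDir")
}
```

You could then prepend cleanBoot's label to the command's state transitions the same way publishLocal is chained above.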