I wanted to execute a task as part of the SBT compile step. I tried runMain in compile, but it is not executing the main class that I am providing. Below is how the task looks in build.sbt:
lazy val scalaGeneratorPlugin = Project("scala-generator", file("scala-generator"))
  .settings(
    libraryDependencies += "org.freemarker" % "freemarker" % "2.3.23",
    runMain in compile := Some("com.my.MyMainClass")
  )
I am running the following command:
sbt scala-generator/compile
Although it gives me a success message, it does not execute my main class.
I am copying laughedelic's comment here as an answer:
I think you should use source generation in sbt for that, i.e. there should be different compilation stages.
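For illustration, here is a minimal sketch of such a source generator hooked into the compile step (untested; the generated file name and stub contents are made up, and your real generation logic, e.g. the FreeMarker call, would go inside the task):

sourceGenerators in Compile += Def.task {
  val out = (sourceManaged in Compile).value / "Generated.scala"
  // Replace this stub with the actual generation logic:
  IO.write(out, """object Generated { val createdBy = "scala-generator" }""")
  Seq(out) // files returned here are compiled along with the regular sources
}.taskValue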
Related
I have a project that has the following build.sbt:
addCommandAlias("package", "dist")
lazy val actual = (project in file("."))
  .enablePlugins(UniversalPlugin, JavaServerAppPackaging)
  .settings(
    name := "DeployerPod",
    mainClass := Some("com.myself.executable.Runner"),
    Compile / mainClass := Some("com.myself.executable.Runner"),
    Compile / run / mainClass := Some("com.myself.utils.Pipeline"),
    Universal / mainClass := Some("com.myself.executable.Runner"),
    Universal / compile / mainClass := Some("com.myself.executable.Runner"),
  )
We have a CI/CD pipeline which runs a Dockerfile.
There I have sbt run as one of the steps, which executes the com.myself.utils.Pipeline class to do the prerequisites for the pipeline.
As one of the last sbt-based steps, I'm also running sbt package, which eventually runs an sbt dist command. At this point, inside the extracted ZIP's bin folder, I see two BAT files corresponding to the two main classes. Unfortunately I only want the BAT for the Runner class, not the Pipeline one.
For this I tried running sbt package -main com.myself.executable.Runner but that failed saying Not a valid command: -
Is there a way I can specify the mainClass only for this Universal plugin somehow? Because the way I've tried in my build.sbt doesn't seem to work.
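One approach that might be worth trying (an assumption based on sbt-native-packager's script generation, not verified against this build): the packager emits one start script per entry in discoveredMainClasses, so clearing that task should leave only the script for the configured mainClass:

Compile / mainClass := Some("com.myself.executable.Runner")
Compile / discoveredMainClasses := Seq.empty // suppress the extra Pipeline script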
I split my unit and integration tests with a filter:
lazy val FunTest = config("it") extend Test
def funTestFilter(name: String): Boolean = name endsWith "Spec"
def unitTestFilter(name: String): Boolean = name endsWith "Test"
...
testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
testOptions in FunTest := Seq(Tests.Filter(funTestFilter)),
...
So I can do something like this:
sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport
Sadly that kills all my coverage; only the generated BuildInfo has any coverage.
Using only sbt clean coverage test coverageReport or sbt clean coverage it:test coverageReport works as expected.
The whole project can be found here: https://github.com/pme123/play-binding-form
scoverage Version: 1.5.1
SBT supports incremental compilation, but Scoverage does not. Scoverage clears its instrumentation information before compilation starts and redoes the instrumentation from scratch every time. Compiling only a subset of all classes with Scoverage enabled will therefore produce wrong coverage reports.
In this case the sbt-buildinfo plugin is enabled in the server module. It registers a source generator, which is executed before every compilation and generates the server/target/scala-2.12/src_managed/main/sbt-buildinfo/BuildInfo.scala file.
The sbt-buildinfo plugin is smart enough to regenerate this file only when its content changes, but since BuildInfoOption.BuildTime is included in the buildInfoOptions setting, this file is regenerated before every compilation.
When it comes to the compilation process, the compiler finds one modified file (BuildInfo.scala) every time and starts an incremental compilation of this single file. Scoverage clears its previous instrumentation information and saves only the information about the BuildInfo.scala file.
In an execution like sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport, the first compilation process is part of the test task and the second one of the it:test task. That's why there is no problem when they are used separately.
Docker has nothing to do with our problem.
To fix the problem you have to prevent the BuildInfo.scala file from being regenerated on every compilation, at least when coverage is enabled.
I did it by modifying the project/Settings.scala file in this way:
private lazy val buildInfoSettings = Seq(
  buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
  buildInfoOptions ++= { if (coverageEnabled.value) Seq() else Seq(BuildInfoOption.BuildTime) }, // <-- this line was changed
  buildInfoOptions += BuildInfoOption.ToJson,
  buildInfoPackage := "pme123.adapters.version"
)
buildInfoOptions does not include the BuildTime option when coverage is turned on.
It doesn't look elegant, but it works. You can probably find a better way.
Instead of having different BuildInfo objects depending on the phase, which could lead to compilation errors, you can use your own build time:
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] = SettingKey[String]("buildTime", "time of build")

ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString

buildInfoKeys :=
  Seq[BuildInfoKey](
    name,
    version,
    scalaVersion,
    sbtVersion,
    buildTime
  )
This should resolve this issue.
I have this configuration in a project of mine because I wanted better control over the way the date is formatted, and I don't have the same issue.
My build.sbt:
libraryDependencies += "org" %% "A" % "0.0.1"
So I run sbt on this file:
> sbt
I know that there is a main class in 'A', let's say in "mainRun.scala", but I don't know how to run it from my project.
How should I run it from SBT?
By default, sbt does not look into your dependencies for the auto-detection of a main class. You can, however, force it to use a specific class, either on the command line with
> runMain pack.MainClass
or via the sbt setting
mainClass := Some("pack.MainClass")
I'm experiencing a problem when running the latest Play Framework 2.3.
It compiles just fine, but when I do activator run this error happens:
java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
Full error log
I explicitly set scalaVersion in every build.sbt file and it is the same everywhere.
I tried several things like activator clean and a full removal of the sbt caches and the local sbt repository, as well as updating dependencies to their latest versions, but with no success.
My current dependencies are:
I tried with both %% and forcing _2.11 in the name of the dependency.
Dependency List
Other important files
build.sbt
Common.scala
Dependencies.scala
When I fully clean the caches it downloads Scala 2.10.4, seemingly for no reason:
in this download log it says that sbt needs the old Scala.
Any idea why this is?
Firstly, it does not matter which version of Scala is used to run your build. The build system can use version 2.10.4, and that does not prevent your code from using version 2.11.1.
To get at the issue with your Scala version, you should consider that settings added directly in build.sbt are added to the root project, but not to other projects. You can see this with a minimal project such as:
$ cat build.sbt
scalaVersion := "2.11.1"
lazy val subproj = project.in(file("subproj"))
Then the output of sbt looks like this:
> show scalaVersion
[info] subproj/*:scalaVersion
[info] 2.10.4
[info] sbttest/*:scalaVersion
[info] 2.11.1
So, how can this be fixed?
We can create a lazy val containing a Seq[Setting[_]], and add it to the sub-project definition with the settings() method.
Here's an example build.sbt which adds the same settings to all projects:
$ cat build.sbt
lazy val root = project.in(file("."))
.aggregate(subproj)
.dependsOn(subproj)
.settings(commonSettings: _*)
lazy val subproj = project.in(file("subproj"))
.settings(commonSettings: _*)
lazy val commonSettings = Seq(
scalaVersion := "2.11.1"
)
The _* is required because settings() is defined as requiring Setting[_]* rather than Seq[Setting[_]], so _* converts between the two types. See also: What does param: _* mean in Scala?
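As a quick generic illustration of what : _* does (plain Scala, unrelated to sbt):

def printAll(xs: String*): Unit = xs.foreach(println)

val names = Seq("a", "b", "c")
printAll(names: _*) // expands the Seq into the varargs parameter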
And the output from sbt is:
> show scalaVersion
[info] subproj/*:scalaVersion
[info] 2.11.1
[info] root/*:scalaVersion
[info] 2.11.1
This approach can easily be applied to your build.
You're defining your dependencies in the wrong way.
When using SBT, don't use:
"org.mydep" % "mydep_2.11" % "1.0.0"
Instead use:
"org.mydep" %% "mydep" % "1.0.0"
The %% operator automatically appends the Scala binary version of the current build to the artifact name. If you use dependencies built against other Scala versions, you're going to get weird errors like this one.
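Concretely, with scalaVersion := "2.11.1" the following two declarations are equivalent:

libraryDependencies += "org.mydep" %% "mydep" % "1.0.0"
// ...is shorthand for:
libraryDependencies += "org.mydep" % "mydep_2.11" % "1.0.0"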
Also make sure you explicitly specify your Scala version somewhere:
scalaVersion := "2.11.1"
The problem was related to some manually added jars that I discovered in the lib folder, which were causing conflicts with the dependencies of the project defined in build.sbt.
The only way to find out was to generate an activator dist and look for similar dependencies with different versions, since the dependency graph showed no conflicts.
As part of a CI setup, the very first step is to create a package/jar using sbt dist. The following steps run unit, integration, and functional tests against the dist-created jar.
Is it possible to do that with SBT?
Normally I use sbt testOnly "Unit.*", but that works in the context of a project. I can't find any documentation showing how to do this when there is already a jar.
I'm using ScalaTest, and I know there is a runner for it I could use (http://www.scalatest.org/user_guide/using_the_runner), but using SBT would be simpler, if that is possible.
As an example, something like this is what I am looking for:
sbt testOnly "Unit.* -jar myjar.jar"
Will my tests even be included in the jar when I use the following command?
sbt dist
EDIT
I created a new folder
I added build.sbt with the following content:
name := "abc"
version := "1.0-SNAPSHOT"
scalaVersion := "2.10.0"
I added the lib folder and copied my test jar into it
I ran sbt testOnly Unit.*
It could not find any tests
EDIT 2
I tried with the following "proper" SBT file:
name := "ihs2tests"
version := "1.0-SNAPSHOT"
scalaVersion := "2.10.0"
unmanagedBase in Test := new java.io.File(".")
and moved test.jar into the root of the project. Again, no tests found.
Normally I use sbt testOnly "Unit.*", but that works in the context of a project. I can't find any documentation showing how to do this when there is already a jar.
The test-family tasks in SBT (testOnly among them) work with the compile task, which returns the list of compiled files through an sbt.inc.Analysis instance. I couldn't figure out how to amend it and inject the changed Analysis instance into testOnly so it knows the tests I want to run exist.
I'm proposing another solution - a trick.
Package the test classes into a jar with the test:package task as follows:
[test-lib]> test:package
[info] Updating {file:/Users/jacek/sandbox/so/test-lib/}test-lib...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/jacek/sandbox/so/test-lib/target/scala-2.10/test-classes...
[info] Packaging /Users/jacek/sandbox/so/test-lib/target/scala-2.10/test-lib_2.10-0.1-SNAPSHOT-tests.jar ...
[info] Done packaging.
[success] Total time: 9 s, completed Mar 4, 2014 11:34:13 PM
When you have the test jar, you can execute the test framework on the command line without SBT (I assume you use ScalaTest given the scalatest tag, but I'll use Specs2). Read the documentation of the test framework on how to do it; for Specs2 it's specs2.run, as described in Console output.
Executing tests from the test jar requires defining a proper CLASSPATH that may or may not be easy to get right. That's where SBT can be of great help - to manage dependencies and hence the CLASSPATH.
Create another SBT project and save the test jar in the lib subdirectory to have it on the CLASSPATH (as described in Unmanaged dependencies):
Dependencies in lib go on all the classpaths (for compile, test, run, and console).
Add a sample build.sbt where you define your test framework as a dependency of the project. For Specs2 it's as follows (I used the default configuration as described on the Specs2 home page):
libraryDependencies += "org.specs2" %% "specs2" % "2.3.8" % "test"
scalacOptions in Test ++= Seq("-Yrangepos")
The trick is to execute the main class of the test framework, e.g. specs2.run for Specs2, as if the class were executed on the command line. SBT helps with test:runMain.
[my-another-project]> test:runMain specs2.run HelloWorldSpec
[info] Running specs2.run HelloWorldSpec
HelloWorldSpec
The 'Hello world' string should
+ contain 11 characters
+ start with 'Hello'
+ end with 'world'
Total for specification HelloWorldSpec
Finished in 62 ms
3 examples, 0 failure, 0 error
Exception: sbt.TrapExitSecurityException thrown from the UncaughtExceptionHandler in thread "run-main-0"
[success] Total time: 5 s, completed Mar 4, 2014 11:15:14 PM
Don't worry about this Exception: it comes from SBT, which traps the exit from Specs2 (after test execution) so that SBT itself remains up.
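If the exception bothers you, forking the JVM for the run should avoid sbt's exit-trapping SecurityManager altogether (a sketch, assuming test:runMain picks up the test-scoped run settings):

fork in (Test, run) := true // run test:runMain in a separate JVM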
It seems that SBT can't read a jar file without additional/manual setup - I might be wrong, but I didn't find anything in the docs. So I tried something like this to make the task simpler:
unzip some-test.jar
java -jar sbt-launch.jar \
'set scalaSource in Test := new java.io.File(".")' \
'set fullClasspath in Test += Attributed.blank(file("."))' \
'test'
This runs without errors but does not find tests.
If I add 'set includeFilter in (Test, unmanagedSources) := "*Suite*.class"' to force it to find tests, it obviously fails because it expects *.scala files, not the compiled *.class files.
I'm not an SBT expert, but I think this must be close to a solution. There must be a way to read all the files from a jar programmatically and then tell the test framework to use the *.class files.
At this point it seems more reasonable to run the tests using a ScalaTest test runner, or from sbt using the project.
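For completeness, the runMain trick from the answer above should also work with ScalaTest's own runner (an untested sketch; the lib/test.jar path and the Unit package prefix are illustrative, and the flags follow the runner documentation linked in the question):

libraryDependencies += "org.scalatest" %% "scalatest" % "2.1.0" % "test"

and then, with the test jar in lib:

> test:runMain org.scalatest.tools.Runner -R lib/test.jar -o -w Unit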
If you wish to dig deeper take a look at SBT source code and the default sbt shell script that does lots of setup before it runs the sbt launcher jar.