The structure of my Scala project is pretty simple:
/someApp
  /scala
    /project
      Dependencies.scala
      ...
    /main
      ...
    /test
      MyTest.scala
    /target
      ...
    build.sbt
Now, let's consider:
sbt> testOnly *MyTest
It recompiles MyTest.scala and executes it, as I expect. However, when I introduce changes to build.sbt or project/Dependencies.scala, it ignores those changes.
Could someone explain to me why this happens? sbt seems to be one huge mystery...
To include changes made to .sbt files or .scala files under the project folder, you'll need to run the reload command within the sbt shell.
You can also force sbt to reload each time it detects a change in these files by adding this line to your build.sbt:
Global / onChangedBuildSource := ReloadOnSourceChanges
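For example, after editing build.sbt or project/Dependencies.scala, a session with the test from the question would then look something like this:

sbt> reload
sbt> testOnly *MyTest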
Related
In sbt, show dependencyClasspath triggers compilation. Actually, it is internalDependencyClasspath which does that. Is there a way to get the classpath of the inter-project dependencies for both the Test and Compile scopes without triggering compilation?
Here's a dirty little hack: temporarily remove all source files from the project structure so that there's nothing to actually compile, while the target directories and project dependencies remain the same:
set every sources := Seq.empty
show dependencyClasspath
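Note that set only changes the current session. Once you have the classpath, running reload (or session clear) should discard the temporary setting and restore the original sources, so the full sequence would look something like:

set every sources := Seq.empty
show dependencyClasspath
reload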
How can I run a Scala script inside an sbt project so that it can access all the classes of the sbt project, as well as the Typesafe config? Basically, I want the script to run in a similar way to the sbt console.
One option is to assemble a jar using sbt-assembly.
You would need to add the following to a .sbt file in your project directory:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
and add a minimum of two lines to your build file.
assemblyJarName in assembly := "something.jar"
mainClass in assembly := Some("com.example.Main")
Then you can run the assembly task from sbt; this will build an executable "fat" jar with all your dependencies and configuration.
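Since the question also asks about Typesafe config, here is a minimal, hypothetical sketch of what com.example.Main could look like; it simply loads the configuration from the classpath (e.g. an application.conf bundled into the fat jar) and reads one value, assuming a setting named some.setting exists:

package com.example

import com.typesafe.config.ConfigFactory

object Main {
  def main(args: Array[String]): Unit = {
    // application.conf on the classpath (inside the fat jar) is picked up here
    val config = ConfigFactory.load()
    println(config.getString("some.setting"))
  }
}

Once the assembly task has produced something.jar (under target/scala-<version>/ by default), you can run it with java -jar something.jar.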
You can use the launcher and the command system to implement an interactive application with autocomplete, etc.
Here is a tutorial:
http://www.scala-sbt.org/0.13/docs/Command-Line-Applications.html
You have to invoke the application separately, though; I don't think it is possible to run it directly from the sbt prompt within the application directory.
I'm using two files for my build: build.sbt and assembly.sbt (for building fat jars using sbt-assembly plugin). I have some vals defined in build.sbt. Let's just say I'm doing some custom tasks that depend on them. However, I noticed that vals defined in build.sbt are not visible in assembly.sbt. So I end up duplicating code in those two files. How do I configure it such that assembly.sbt can see the vals in build.sbt?
Thanks!
Currently, vals in *.sbt files are meant to be namespaced separately. We've debated the merits of having a global namespace or not, but in the end keeping them separate makes things a lot more consistent.
The "sbt" way to share vals and settings between .sbt files is to either:
Create a plugin which does so.
Create a "library" in the project/ directory which does so.
For option #2, you can do the following:
project/lib.scala
package mylib
object MyStuff {
  val foo = "hi"
}
build.sbt
import mylib.MyStuff
// Just reference .scala code from the project/ directory.
name := MyStuff.foo
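The same import works from any other .sbt file in the build, so assembly.sbt can reference the shared val too. A hypothetical sketch (assuming the sbt-assembly plugin and its assemblyJarName key are on the build):

assembly.sbt

import mylib.MyStuff

assemblyJarName in assembly := MyStuff.foo + ".jar"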
Hope that helps!
This is for Scala 2.11.1 and sbt 0.13.5.
Say I have a Scala/sbt project with the following directory structure:
root/
  build.sbt
  src/ ..
  project/
    plugins.sbt
    build.properties
    LolUtils.scala
and I want to use some external library in LolUtils.scala. How is this generally accomplished in sbt?
If I simply add the libs I need into build.sbt via libraryDependencies += .. then it doesn't find them and fails on the import line with not found: object ...
If I add a separate project/build.sbt, for some reason it starts failing to resolve my plugins, plus I need to manually specify the Scala version in the nested project/build.sbt, which is unnecessary duplication.
What's the best way to accomplish this?
sbt is recursive, which means that it uses itself to compile the build definition, i.e. the *.sbt files and the *.scala files under the project directory. To add extra dependencies for use in the build definition, you have to declare them in project/build.sbt.
There is one caveat to that. You can set any scalaVersion for your project, that is, in build.sbt, but you should not modify scalaVersion in project/build.sbt, as it might conflict with the version sbt itself uses (which may or may not lead to binary incompatibility for the plugins).
sbt 0.13.5 uses Scala 2.10.4, and the library you're going to use must be compatible with that particular version of Scala.
> about
[info] This is sbt 0.13.5
...
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.4
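As a minimal sketch, declaring an extra dependency for the build definition itself could then look like this (joda-time is just a placeholder; use whatever Scala 2.10-compatible artifact LolUtils.scala actually needs):

project/build.sbt

libraryDependencies += "joda-time" % "joda-time" % "2.3"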
I'm making a little web project with Scala, Scalate, and Jade templates. The problem is that when I change .scala or .conf files, sbt automatically recompiles them and reloads the project, but it doesn't do so when I change my .jade files. All my templates lie under the pages/ folder in src/main, which is added to the project classpath, and I've also added this line to the Build.scala file in the project/ folder:
unmanagedSourceDirectories <+= (sourceDirectory)(_ / "pages")
I just had a similar problem. You should replace that line in your Scala build file with this one:
unmanagedResourceDirectories in Compile <+= (baseDirectory)(_ / "src" / "main" / "pages")
This should solve your problem.
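For context, here is a minimal sketch of where that line could sit in a project/Build.scala for sbt 0.13 (the project id and object name are just placeholders):

import sbt._
import Keys._

object MyBuild extends Build {
  lazy val root = Project(id = "root", base = file("."))
    .settings(
      // expose the .jade templates as resources so they end up on the classpath
      unmanagedResourceDirectories in Compile <+= (baseDirectory)(_ / "src" / "main" / "pages")
    )
}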