How to get the sub-project path in an sbt multi-project build - scala

I am trying to get the location of a sub-project in a multi-project build in sbt, but I am only able to get the root project directory.
lazy val copyToResources = taskKey[Unit]("copies the assembly jar.")
private val rootLocation: File = file(".").getAbsoluteFile
private val subProjectLocation: File = file("sub_project").getAbsoluteFile.getParentFile
lazy val settings = Seq(copyToResources := {
  val absPath = subProjectLocation.getAbsolutePath
  println(s"rootLocation:$subProjectLocation $absPath, sub-proj-location: ${rootLocation.getAbsolutePath}")
})
Output:
rootLocation:/home/user/projects/workarea/repo /home/vdinakaran/projects/workarea/repo, sub-proj-location: /home/vdinakaran/projects/workarea/repo
rootLocation:/home/user/projects/workarea/repo /home/vdinakaran/projects/workarea/repo, sub-proj-location: /home/vdinakaran/projects/workarea/repo
directory structure:
repo
|-- sub_project
As a workaround, I have added the sub_project folder using the rootLocation. But why is file("sub_project") not returning the path?

If you define your subproject like this
lazy val subProject = project in file("sub_project") // ...
then you can get its path using the scoped baseDirectory setting:
(outdated syntax, pre sbt 1)
baseDirectory.in(subProject).value.getAbsolutePath
(new unified syntax)
(subProject / baseDirectory).value.getAbsolutePath
And in the sbt console:
> show subProject/baseDirectory
The problem with your code (besides the fact that you mixed up root and sub-project in the output) is the use of relative paths. The sbt documentation on Paths explicitly says:
Relative files should only be used when defining the base directory of a Project, where they will be resolved properly.
Elsewhere, files should be absolute or be built up from an absolute base File. The baseDirectory setting defines the base directory of the build or project depending on the scope.
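For illustration, here is a minimal sketch of the original copyToResources task rewritten against the scoped setting (the project and key names come from the question; the rest is assumed):
lazy val copyToResources = taskKey[Unit]("copies the assembly jar.")
lazy val subProject = project in file("sub_project")
lazy val root = (project in file("."))
  .settings(
    copyToResources := {
      // baseDirectory scoped to each project resolves to an absolute path
      val rootDir = baseDirectory.value
      val subDir  = (subProject / baseDirectory).value
      println(s"rootLocation: $rootDir, sub-proj-location: $subDir")
    }
  )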

Related

How do I create an sbt task to generate code, then include these generated managed sources in my root project?

I would like to have an sbt task that I can run to generate some code. I don't want to generate this with each run, just manually run this task once in a while. I created a skeleton project to explain (https://github.com/jinyk/sbtmanagedsrc).
build.sbt:
lazy val root = (project in file("."))
.settings(scalaVersion := "2.11.8")
.settings(gensomecode := genSomeCodeTask.value)
/////////////////////////////////////////////////////////////
// fugly way to get managed sources compiled along with main
.settings(unmanagedSourceDirectories in Compile += baseDirectory.value / "target/scala-2.11/src_managed/")
/////////////////////////////////////////////////////////////
lazy val gensomecode = taskKey[Seq[File]]("gen-code")
lazy val genSomeCodeTask = Def.task {
  val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
    | def doSomething() {
    | println("Hi!")
    | }
    |}""".stripMargin)
  Seq(file)
}
So with the build.sbt above I can run sbt gensomecode, which creates target/scala-2.11/src_managed/main/SomeGenCode.scala, the default place where sbt puts "managed sources."
I would like to make this SomeGenCode available to the root project.
src/main/scala/Main.scala:
object Main extends App {
SomeGenCode.doSomething()
}
The only thing I can figure out to do is to include the default sourceManaged directory in the root project's unmanagedSourceDirectories (see build.sbt:line 4 aka the line below the fugly way... comment). This is ugly as hell and doesn't seem like it's how managed sources are supposed to be handled.
I'm probably not understanding something basic about sbt's managed sources concept or how to handle the situation of creating an sbt task to generate sources.
What am I missing?
There are three options that I am familiar with:
1. Generate into the unmanaged source directories.
2. Generate on every run, by adding sourceGenerators in Compile <+= gensomecode.
3. Similar to (2), but use caching so it doesn't regenerate the file on every compile. Full example below.
In this example, the cache is based on the content of build.sbt, so whenever that file is changed it will regenerate the file.
lazy val root = (project in file("."))
.settings(scalaVersion := "2.11.8")
.settings(gensomecode <<= genSomeCodeTask)
sourceGenerators in Compile <+= genSomeCodeTask
lazy val gensomecode = taskKey[Seq[File]]("gen-code")
def generateFile(sourceManaged: java.io.File) = {
val file = sourceManaged / "main" / "SomeGenCode.scala"
println("file: " + file)
IO.write(file, """object SomeGenCode {
| def doSomething() {
| println("Hi!")
| }
|}""".stripMargin)
Set(file)
}
def genSomeCodeTask = (sourceManaged in Compile, streams).map {
(sourceManaged, streams) =>
val cachedCompile = FileFunction.cached(
streams.cacheDirectory / "mything",
inStyle = FilesInfo.lastModified,
outStyle = FilesInfo.exists) {
(in: Set[java.io.File]) =>
generateFile(sourceManaged)
}
cachedCompile(Set(file("build.sbt"))).toSeq
}
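For reference, here is a rough sketch of the same cached generator in the sbt 1.x unified syntax, reusing the generateFile helper above (the cache is still keyed on build.sbt; names are assumptions to adjust for your build):
lazy val gensomecode = taskKey[Seq[File]]("gen-code")

lazy val root = (project in file("."))
  .settings(
    scalaVersion := "2.11.8",
    gensomecode := genSomeCodeTask.value,
    Compile / sourceGenerators += genSomeCodeTask.taskValue
  )

def genSomeCodeTask = Def.task {
  val managedDir = (Compile / sourceManaged).value
  val cacheDir = streams.value.cacheDirectory / "mything"
  val cachedGen = FileFunction.cached(
    cacheDir,
    inStyle = FilesInfo.lastModified,
    outStyle = FilesInfo.exists) { (_: Set[java.io.File]) =>
    generateFile(managedDir)
  }
  cachedGen(Set(file("build.sbt"))).toSeq
}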
I hope I'm not too late with this answer, but let's look at the documentation section on Unmanaged vs Managed files:
Classpaths, sources, and resources are separated into two main categories: unmanaged and managed. Unmanaged files are manually created files that are outside of the control of the build. They are the inputs to the build. Managed files are under the control of the build. These include generated sources and resources as well as resolved and retrieved dependencies and compiled classes.
It seems that the key difference between "unmanaged vs managed" is "manually vs automatically". Now, if we look at the documentation on generating files, we will notice immediately that it means generating files automatically, since the generation happens during sbt compile.
Compile / sourceGenerators += <task of type Seq[File]>.taskValue
That makes sense, since anything produced during sbt compile should be removed during sbt clean.
Now, from your code below, it seems that you were trying to generate an unmanaged source file (you were not using sourceGenerators, were you?) into the managed source directory. The most obvious problem with this is that your source file will be removed every time you call sbt clean, so you have to run the task again to get the file back (worse, you have to run it manually, as opposed to having sbt compile do it for you), which defeats your purpose of doing it manually once in a while.
val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
To fix this, you have to generate the files manually into an unmanaged source directory, which is basically your source code directory (it depends; mine is "/app"). You should, however, mark these files somehow as generated. My solution is something like:
val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
Hope this helps!
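To make that concrete, here is a minimal sketch wiring the suggestion into a task you run by hand (key name reused from the question; the "generated" subdirectory is just a convention, not something sbt requires):
lazy val gensomecode = taskKey[Seq[File]]("gen-code")

gensomecode := {
  // Writes into src/main/scala/generated, which is unmanaged and survives sbt clean.
  val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
  IO.write(file, """object SomeGenCode {
    |  def doSomething(): Unit = println("Hi!")
    |}""".stripMargin)
  Seq(file)
}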

Using DependsOn between two ScalaJS SBT projects

(Long question ahead. Simplified tl;dr at the bottom).
I have two ScalaJS projects built with SBT - "myapp" and "mylib", in the following directory structure
root/build.sbt
root/myapp/build.sbt
root/myapp/jvm/
root/myapp/js/
root/myapp/shared/
root/mylib/build.sbt
root/mylib/jvm
root/mylib/js
root/mylib/shared
mylib exports an artifact named "com.example:mylib:0.1", which is used as a libraryDependency for myapp.
myapp and mylib are in separate repositories, contain their own build files, and should be able to be built completely separately (i.e. they must contain their own individual build config).
In production, they will be built separately with mylib being first published as a maven artifact before building myapp separately.
In development however, I want to be able to merge these into a parent SBT project so that both can be developed in parallel without needing to use publishLocal after each change.
In a traditional (not scalajs) project this would be quite easy
$ROOT/build.sbt:
lazy val mylib = project
lazy val myapp = project.dependsOn(mylib)
However in ScalaJS, we actually have two projects inside each module - appJVM, appJS, libJVM and libJS. As such, the above configuration only finds the aggregate root project and does not correctly apply the dependsOn configuration to the actual JVM and JS projects.
(i.e. myapp and mylib build.sbt each contains two projects, and an aggregate root project)
Ideally I'd like to be able to do something like the following
lazy val mylibJVM = project
lazy val myappJVM = project.dependsOn(mylibJVM)
lazy val mylibJS = project
lazy val myappJS = project.dependsOn(mylibJS)
Unfortunately this just creates new projects within the root instead of importing the subprojects themselves.
I've also tried various combinations of paths (such as)
lazy val mylibJVM = project.in(file("mylib/jvm"))
But this doesn't pick up the configuration in the build.sbt file in mylib.
Ultimately I keep running up against the same problem - when importing an existing multi-project SBT project into a parent sbt file, it imports the root project, but does not seem to provide a way to import a subproject from an existing multimodule SBT file in a way that lets me add dependsOn configuration to it.
tl;dr
If I have
root/mylib/build.sbt with multiple projects defined and
root/myapp/build.sbt with multiple projects defined
Is it possible to import individual subprojects into root/build.sbt instead of the root project from the submodule?
i.e. can I have two layers of multi-project builds?
After spending a lot of time digging through SBT source code, I managed to figure out a solution. This isn't clean, but it works. (For bonus points, it imports correctly into IntelliJ).
// Add this function to your root build.sbt file.
// It can be used to define a dependency between any
// `ProjectRef` without needing a full project definition.
def addDep(from: String, to: String) = {
  buildDependencies in Global <<= (
    buildDependencies in Global,
    thisProjectRef in from,
    thisProjectRef in to) {
      (deps, fromref, toref) =>
        deps.addClasspath(fromref, ResolvedClasspathDependency(toref, None))
    }
}
// `project` will import the `build.sbt` file
// in the subdirectory of the same name as the `lazy val`
// (performed by an SBT macro). i.e. `./mylib/build.sbt`
//
// This won't reference the actual subprojects directly,
// but will import them into the namespace such that they
// can be referenced as "ProjectRef"s, which are implicitly
// converted to from strings.
//
// We then aggregate the JVM and JS ScalaJS projects
// into the new root project we've defined. (Which unfortunately
// won't inherit anything from the child build.sbt)
lazy val mylib = project.aggregate("mylibJVM","mylibJS")
lazy val myapp = project.aggregate("myappJVM","myappJS")
// Define a root project to aggregate everything
lazy val root = project.in(file(".")).aggregate(mylib,myapp)
// We now call our custom function to define a ClassPath dependency
// between `myapp` -> `mylib` for both JVM and JS subprojects.
// In particular, this will correctly find exported artifacts
// so that `myapp` can refer to `mylib` in libraryDependencies
// without needing to use `publishLocal`.
addDep("myappJVM", "mylibJVM")
addDep("myappJS","mylibJS")

Prevent looping recompilation in SBT

I am using SBT to build my Scala project. After the compilation of a submodule which uses fastOptJS, I need to push the compiled files to another module within the same project, so I designed a custom command fastOptCopy to do so.
lazy val copyjs = TaskKey[Unit]("copyjs", "Copy javascript files to public directory")
copyjs := {
  val outDir = baseDirectory.value / "public/js"
  val inDir = baseDirectory.value / "js/target/scala-2.11"
  val files = Seq("js-fastopt.js", "js-fastopt.js.map", "js-jsdeps.js") map { p => (inDir / p, outDir / p) }
  IO.copy(files, true)
}
addCommandAlias("fastOptCopy", ";fastOptJS;copyjs")
However, when I enter into the sbt console and type
~fastOptCopy
it keeps compiling, copying, compiling, copying, ... in an infinite loop. I guess that because I am copying the files, it thinks that the sources have changed and retriggers compilation.
How can I prevent this?
You can exclude specific files from watchSources in the sbt configuration:
http://www.scala-sbt.org/0.13/docs/Triggered-Execution.html
watchSources defines the files for a single project that are monitored
for changes. By default, a project watches resources and Scala and
Java sources.
Here is a similar question:
How to not watch a file for changes in Play Framework
watchSources := watchSources.value.filter { _.getName != "BuildInfo.scala" }
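Applied to this build, a sketch along the same lines would drop the copied output from the watched files so that ~fastOptCopy no longer retriggers itself (this assumes sbt 0.13, where watchSources holds plain Files, and that public/js sits under a watched directory):
watchSources := watchSources.value.filterNot { f =>
  f.getAbsolutePath.startsWith((baseDirectory.value / "public/js").getAbsolutePath)
}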

Sbt Package Command Does Not Copy Resources

I am using sbt for a simple, small GUI project that loads icons from src/main/scala/resources. At first, everything works fine and I can compile, package, and run. The generated jar and class files all have the resource folder in them. Then I run the clean command. I re-run compile and package, and suddenly the application crashes. I check the generated jars and classes, and find out that the resources folder is not copied this time.
Running the application now gives me a NullPointerException pointing to the line where I load the resource (icon).
I didn't change the sbt build files or anything in the project, just ran clean and re-ran compile and package. I don't know where to start looking for the problem. Where should I start looking? What am I doing wrong?
EDIT (the minimal example)
The project is a standard Scala template from Typesafe's g8 (https://github.com/typesafehub/scala-sbt.g8). Here's my Build.scala:
import sbt._
import sbt.Keys._

object ObdscanScalaBuild extends Build {
  val scalaVer = "2.9.2"

  lazy val obdscanScala = Project(
    id = "obdscan-scala",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      name := "project name",
      organization := "thesis.bert",
      version := "0.1-SNAPSHOT",
      scalaVersion := scalaVer,
      // add other settings here
      // resolvers
      // dependencies
      libraryDependencies ++= Seq(
        "org.scala-lang" % "scala-swing" % scalaVer,
        "org.rxtx" % "rxtx" % "2.1.7"
      )
    )
  )
}
It built the code fine previously. Here's the project code directory structure:
It worked fine and output this directory inside the jar at first:
And suddenly, when I do a clean and compile via the sbt console, it no longer copies the resource directory into the jar or into the class directory (inside target). I can't do anything to get the resource directory copied to target now, except by restoring the previous version and compiling it one more time. I restored the previous version via Windows' file history backup.
Is it clear enough? Anything I need to add?
EDIT:
After moving the files to src/main/resources, the compiled files now contain the resources. But now, I can't run it in Eclipse. Here's my code:
object ControlPanelContent {
  val IconPath = "/icons/"
  val DefaultIcon = getClass.getResource(getIconPath("icon"))

  def getImage(name: String) = {
    getClass.getResource(getIconPath(name))
  }

  def getIconPath(name: String) = {
    IconPath + name + ".png"
  }
}

case class ControlPanelContent(title: String, iconName: String) extends FlowPanel {
  name = title
  val icon: ImageIcon = createIcon(iconName, 64)
  val pageTitle = new Label(title)

  protected def createIcon(name: String, size: Int): ImageIcon = {
    val path: Option[URL] = Option(ControlPanelContent.getImage(name))
    val img: java.awt.Image = path match {
      case Some(exists) => new ImageIcon(exists).getImage
      case _ => new ImageIcon(ControlPanelContent.DefaultIcon).getImage
    }
    val resizedImg = img.getScaledInstance(size, size, Image.SCALE_SMOOTH)
    new ImageIcon(resizedImg)
  }
}
The TLDR version is this, I guess:
getClass.getResource("/icons/icon.png")
which works if I call it from the sbt console. Here's the result when I call the code from the sbt console:
scala> getClass.getResource("/icons/icon.png")
res0: java.net.URL = file:/project/path/target/scala-2.9.2/classes/icons/icon.png
which, when run, gives the following exception:
Caused by: java.lang.NullPointerException
at javax.swing.ImageIcon.<init>(Unknown Source)
at thesis.bert.gui.ControlPanelContent.createIcon(ControlPanel.scala:54)
at thesis.bert.gui.ControlPanelContent.<init>(ControlPanel.scala:33)
at thesis.bert.gui.controls.DTC$.<init>(Diagnostics.scala:283)
at thesis.bert.gui.controls.DTC$.<clinit>(Diagnostics.scala)
... 60 more
EDIT 2: It works now. I just deleted the project from Eclipse, re-ran sbt eclipse, and it magically works. Not sure why (maybe caching?).
The SBT convention for resources is to put them in src/main/resources/, not src/main/scala/resources/. Try moving your resources folder up one level. Its content should then be included, meaning that you will get icons and indicator folders inside the generated jar file (directly at the root level, not inside a resources folder).
If you put the resources under the scala source directory, I think sbt copies only the files that are compiled (i.e. the .class files resulting from Scala compilation).
If it doesn't solve your problem, can you post the lines of code you use to load the resource?
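As an aside, if moving the folder is not an option, a sketch of an alternative is to declare the existing location as an extra resource directory (shown in sbt 0.13+ syntax, so it would need translating to the <+= form for the 0.12-era Build.scala above):
unmanagedResourceDirectories in Compile += baseDirectory.value / "src" / "main" / "scala" / "resources"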

scala sbt-launch.jar - multiple projects in the same directory?

I'm sure this is simple, but I haven't figured it out yet...
I've installed sbt-launch.jar and a shell script to execute it (named sbt).
How do I put multiple projects in the same directory?
When I run sbt, the directories project and target get created and populated, and the current project is default-XXXXX. The compile command picks up source files in the top-level directory and jar files in the top-level 'lib' directory.
How do I add another project under the same directory? Every time I run sbt in an empty directory it creates a 20+ MB project directory.
Note 1: when I run sbt I am not asked "Create new project?" or any other questions.
Note 2: I am using sbt-launch.jar from this url: http://typesafe.artifactoryonline.com/typesafe/ivy-releases/org.scala-tools.sbt/sbt-launch/0.10.1/sbt-launch.jar
and I'm following the instructions at: http://code.google.com/p/simple-build-tool/wiki/Setup
Found the answer (for sbt 0.10.1):
Create the file project/Build.scala that looks like this:
import sbt._

object MyBuild extends Build {
  lazy val root = Project("root", file("."))
  lazy val sub1: Project = Project("proj1", file("dir1"))
  lazy val sub2 = Project("proj2", file("dir2"))
}
This creates three projects: 'root' (in the top-level directory), 'proj1' (in the sub-directory 'dir1'), and 'proj2' (in the sub-directory 'dir2').
For more info, see https://github.com/harrah/xsbt/wiki/Full-Configuration
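For comparison, a sketch of the same three-project layout in a present-day build.sbt (no Build.scala needed; here the project IDs are taken from the val names rather than being set explicitly):
lazy val root = (project in file(".")).aggregate(sub1, sub2)
lazy val sub1 = project in file("dir1")
lazy val sub2 = project in file("dir2")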