Sbt Package Command Does Not Copy Resources - scala

I am using sbt for a simple, small GUI project that loads icons from src/main/scala/resources. At first, everything works fine and I can compile, package, and run. The generated jar and class files all have the resources folder in them. Then I run the clean command. I re-run compile and package, and suddenly the application crashes. I check the generated jars and classes and find that the resources folder was not copied this time.
Running the application now gives me a NullPointerException pointing to the line where I load the resource (icon).
I didn't change the sbt build files or anything else in the project; I just ran clean and then re-ran compile and package. I don't know where to start looking for the problem. Where should I start looking? What am I doing wrong?
EDIT (the minimal example)
The project is a standard Scala template from Typesafe's g8 (https://github.com/typesafehub/scala-sbt.g8). Here's my Build.scala:
import sbt._
import sbt.Keys._

object ObdscanScalaBuild extends Build {
  val scalaVer = "2.9.2"

  lazy val obdscanScala = Project(
    id = "obdscan-scala",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      name := "project name",
      organization := "thesis.bert",
      version := "0.1-SNAPSHOT",
      scalaVersion := scalaVer,
      // add other settings here
      // resolvers
      // dependencies
      libraryDependencies ++= Seq(
        "org.scala-lang" % "scala-swing" % scalaVer,
        "org.rxtx" % "rxtx" % "2.1.7"
      )
    )
  )
}
It built the code fine previously. Here's the project code directory structure:
It worked fine and produced this directory inside the jar at first:
And suddenly, when I run the clean and compile commands via the sbt console, it no longer copies the resource directory into the jar or into the class directory (inside target). I can't do anything to get the resource directory copied to target now, except restore the previous version (via Windows' file history backup) and compile it one more time.
Is it clear enough? Anything I need to add?
EDIT:
After moving the files to src/main/resources, the compiled output now contains the resources. But now I can't run it in Eclipse. Here's my code:
import java.net.URL
import java.awt.Image
import javax.swing.ImageIcon
import scala.swing.{FlowPanel, Label}

object ControlPanelContent {
  val IconPath = "/icons/"
  val DefaultIcon = getClass.getResource(getIconPath("icon"))

  def getImage(name: String) = {
    getClass.getResource(getIconPath(name))
  }

  def getIconPath(name: String) = {
    IconPath + name + ".png"
  }
}

case class ControlPanelContent(title: String, iconName: String) extends FlowPanel {
  name = title
  val icon: ImageIcon = createIcon(iconName, 64)
  val pageTitle = new Label(title)

  protected def createIcon(name: String, size: Int): ImageIcon = {
    val path: Option[URL] = Option(ControlPanelContent.getImage(name))
    val img: java.awt.Image = path match {
      case Some(exists) => new ImageIcon(exists).getImage
      case _ => new ImageIcon(ControlPanelContent.DefaultIcon).getImage
    }
    val resizedImg = img.getScaledInstance(size, size, Image.SCALE_SMOOTH)
    new ImageIcon(resizedImg)
  }
}
The TLDR version is this, I guess:
getClass.getResource("/icons/icon.png")
which works if I call it from the sbt console. Here's the result when I call the code from the sbt console:
scala> getClass.getResource("/icons/icon.png")
res0: java.net.URL = file:/project/path/target/scala-2.9.2/classes/icons/icon.png
which, when run, gives the following exception:
Caused by: java.lang.NullPointerException
at javax.swing.ImageIcon.<init>(Unknown Source)
at thesis.bert.gui.ControlPanelContent.createIcon(ControlPanel.scala:54)
at thesis.bert.gui.ControlPanelContent.<init>(ControlPanel.scala:33)
at thesis.bert.gui.controls.DTC$.<init>(Diagnostics.scala:283)
at thesis.bert.gui.controls.DTC$.<clinit>(Diagnostics.scala)
... 60 more
EDIT 2: It works now. I just deleted the project from Eclipse, re-ran sbt eclipse, and it magically works. Not sure why (maybe caching?).

The SBT convention for resources is to put them in src/main/resources/, not src/main/scala/resources/. Try moving your resources folder up one level. Its content should then be included, meaning that you will get icons and indicator folders inside the generated jar file (directly at the root level, not inside a resources folder).
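For reference, a minimal sketch of that layout and the matching load call, assuming an icon has been moved to src/main/resources/icons/icon.png (path and names are illustrative, not from the original post):

// src/main/resources/icons/icon.png ends up at /icons/icon.png on the classpath,
// so it can be loaded with a class-loader-absolute path (leading slash):
val iconUrl: java.net.URL = getClass.getResource("/icons/icon.png")
val icon = new javax.swing.ImageIcon(iconUrl)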
If you put the resources under src/main/scala, I think sbt copies only the files produced by compilation (i.e. the .class files resulting from Scala compilation).
If it doesn't solve your problem, can you post the lines of code you use to load the resource?

Related

How do I create an sbt task to generate code, then include these generated managed sources in my root project?

I would like to have an sbt task that I can run to generate some code. I don't want to generate this with each run, just manually run this task once in a while. I created a skeleton project to explain (https://github.com/jinyk/sbtmanagedsrc).
build.sbt:
lazy val root = (project in file("."))
  .settings(scalaVersion := "2.11.8")
  .settings(gensomecode := genSomeCodeTask.value)
  /////////////////////////////////////////////////////////////
  // fugly way to get managed sources compiled along with main
  .settings(unmanagedSourceDirectories in Compile += baseDirectory.value / "target/scala-2.11/src_managed/")
  /////////////////////////////////////////////////////////////

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

lazy val genSomeCodeTask = Def.task {
  val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
                   | def doSomething() {
                   | println("Hi!")
                   | }
                   |}""".stripMargin)
  Seq(file)
}
So with the build.sbt above I can run sbt gensomecode, which creates target/scala-2.11/src_managed/main/SomeGenCode.scala, the default place where sbt puts "managed sources."
I would like to make this SomeGenCode available to the root project.
src/main/scala/Main.scala:
object Main extends App {
  SomeGenCode.doSomething()
}
The only thing I can figure out to do is to include the default sourceManaged directory in the root project's unmanagedSourceDirectories (see build.sbt:line 4 aka the line below the fugly way... comment). This is ugly as hell and doesn't seem like it's how managed sources are supposed to be handled.
I'm probably not understanding something basic about sbt's managed sources concept or how to handle the situation of creating an sbt task to generate sources.
What am I missing?
There are three options that I am familiar with:
1. Generate into the unmanaged source directories.
2. Generate on every run, by adding sourceGenerators in Compile <+= gensomecode.
3. Similar to (2), but use caching so it doesn't generate the file on every compile. Full example below.
In this example, the cache is based on the content of build.sbt, so whenever that file is changed it will regenerate the file.
lazy val root = (project in file("."))
  .settings(scalaVersion := "2.11.8")
  .settings(gensomecode <<= genSomeCodeTask)

sourceGenerators in Compile <+= genSomeCodeTask

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

def generateFile(sourceManaged: java.io.File) = {
  val file = sourceManaged / "main" / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
                   | def doSomething() {
                   | println("Hi!")
                   | }
                   |}""".stripMargin)
  Set(file)
}

def genSomeCodeTask = (sourceManaged in Compile, streams).map {
  (sourceManaged, streams) =>
    val cachedCompile = FileFunction.cached(
      streams.cacheDirectory / "mything",
      inStyle = FilesInfo.lastModified,
      outStyle = FilesInfo.exists) {
      (in: Set[java.io.File]) =>
        generateFile(sourceManaged)
    }
    cachedCompile(Set(file("build.sbt"))).toSeq
}
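For comparison, a minimal sketch of option (2) without the cache, using the newer += / taskValue syntax and the gensomecode / genSomeCodeTask definitions from the question (so the generator simply runs on every compile):

lazy val root = (project in file("."))
  .settings(
    scalaVersion := "2.11.8",
    gensomecode := genSomeCodeTask.value,
    // hook the generator into compile so it runs automatically on every build
    sourceGenerators in Compile += genSomeCodeTask.taskValue
  )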
I hope I'm not too late with an answer, but let's look at this section about Unmanaged vs. Managed files:
Classpaths, sources, and resources are separated into two main categories: unmanaged and managed. Unmanaged files are manually created files that are outside of the control of the build. They are the inputs to the build. Managed files are under the control of the build. These include generated sources and resources as well as resolved and retrieved dependencies and compiled classes.
It seems that the key difference between "Unmanaged vs Managed" is "Manually vs Automatically". Now, if we look at the documentation for "generating files", we will notice immediately that it means "generating files automatically", since the generation happens during sbt compile.
Compile / sourceGenerators += <task of type Seq[File]>.taskValue
That makes sense, since anything produced during sbt compile should be removed by sbt clean.
Now, from your code below, it seems that you were trying to generate an unmanaged source file (you were not using sourceGenerators, were you?) into the managed source directory. The most obvious problem with this is that your source file will be removed every time you call sbt clean, so you have to run the task again to get the file back (worse, you have to run the task manually, as opposed to having sbt compile do it for you), which defeats your purpose of doing it manually once in a while.
val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
To fix this, you have to manually generate the files into unmanaged source, which is basically your source code directory (it depends -- mine is "/app"). You should still mark somehow that these files are generated. My solution is something like:
val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
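A minimal sketch of such a manually-run task, reusing the keys from the question (the "generated" subfolder is just a convention to mark the files, not something sbt requires):

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

gensomecode := {
  // write into the regular (unmanaged) source tree, so `sbt clean` leaves it alone
  val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
  IO.write(file, """object SomeGenCode {
                   |  def doSomething() {
                   |    println("Hi!")
                   |  }
                   |}""".stripMargin)
  Seq(file)
}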
Hope this helps!

Prevent looping recompilation in SBT

I am using SBT to build my Scala project. After the compilation of a submodule which uses fastOptJS, I need to push the compiled files to another module within the same project, so I designed a custom command fastOptCopy to do so.
lazy val copyjs = TaskKey[Unit]("copyjs", "Copy javascript files to public directory")

copyjs := {
  val outDir = baseDirectory.value / "public/js"
  val inDir = baseDirectory.value / "js/target/scala-2.11"
  val files = Seq("js-fastopt.js", "js-fastopt.js.map", "js-jsdeps.js") map { p => (inDir / p, outDir / p) }
  IO.copy(files, true)
}
addCommandAlias("fastOptCopy", ";fastOptJS;copyjs")
However, when I enter into the sbt console and type
~fastOptCopy
it keeps compiling, copying, compiling, copying, ... in an infinite loop. I guess that because I am copying the files, it thinks that the sources have changed and retriggers compilation.
How can I prevent this?
You can exclude specific files from watchSources in your sbt configuration:
http://www.scala-sbt.org/0.13/docs/Triggered-Execution.html
watchSources defines the files for a single project that are monitored
for changes. By default, a project watches resources and Scala and
Java sources.
Here is a similar question:
How to not watch a file for changes in Play Framework
watchSources := watchSources.value.filter { _.getName != "BuildInfo.scala" }
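Adapted to the copyjs task above, a sketch along the same lines (assuming sbt 0.13.x, where watchSources is a sequence of files, and the public/js output directory from the question) would drop the copied files from the watched sources:

// keep ~fastOptCopy from retriggering on the files copyjs just copied
watchSources := {
  val copiedDir = baseDirectory.value / "public" / "js"
  watchSources.value.filterNot(f => f.getAbsolutePath.startsWith(copiedDir.getAbsolutePath))
}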

How to avoid re-compiling generated source code

Generating boilerplate source code with sbt works fine:
sourceGenerators in Compile <+= sourceManaged in Compile map { srcDir =>
  DslBoilerplate.generate(srcDir, Seq(
    "path/to/a/definition/file"
  ))
}
When I run sbt compile this also compiles the generated source code files, thus producing some class files. I just don't want the generated source code to be re-compiled every time I re-compile the project during development.
So, from the class files I made a jar file and used this instead of the generated source/class files (I deleted those). This worked fine, and I now have access to the generated code through the jar file. Is there a way, though, to let sbt do the following 4 steps (if needed?) in the initial project build?
1. generate source code files
2. compile those files
3. create a jar from the produced class files
4. delete the source and class files
(In this question they use the sbt.IO.jar method to create a jar but there they already have existing files...)
Or is there another better approach than making a jar to avoid re-compiling generated source code?
Update 1 (see update 2 below)
Thanks, Seth, for your answer! It worked great to avoid generating the source files with each project compilation since the cache now remembers that they have been created. I'll certainly use this feature, thanks!
But this was actually not what I had in mind with my original question. Sorry for not being clear enough. It might be clearer if we think of this as 2 transformations happening:
Input file ---1---> Source file (*.scala) ---2---> Target file (*.class)
where the transformations are
generation of source code (from some information in an input file) and
compilation of the generated source code
This all works fine when I compile the project with sbt compile.
But then if I "rebuild the project" (in IntelliJ), the generated source code (from the sbt compilation) will compile again, and that's what I want to avoid - but at the same time have access to that code. Is there any other way to avoid compilation than placing this code in a jar and then delete the source and target files?
So I tried to continue along that line of thought, wrestling with sbt to make it create a source jar and a target jar - I still can't make the target jar. This is what I came up with so far (with help from this):
sourceGenerators in Compile += Def.task[Seq[File]] {
  val srcDir = (sourceManaged in Compile).value
  val targetDir = (classDirectory in Compile).value

  // Picking up inputs for source generation
  val inputDirs = Seq("examples/src/main/scala/molecule/examples/seattle")

  // generate source files
  val srcFiles = DslBoilerplate.generate(srcDir, inputDirs)

  // prepare data to create jars
  val srcFilesData = files2TupleRec("", srcDir)
  val targetFilesData = files2TupleRec("", targetDir)

  // debug
  println("### srcDir: " + srcDir)
  println("### srcFilesData: \n" + srcFilesData.mkString("\n"))
  println("### targetDir: " + targetDir)
  println("### targetFilesData: \n" + targetFilesData.mkString("\n"))

  // Create jar from generated source files - works fine
  val srcJar = new File("lib/srcFiles.jar/")
  println("### sourceJar: " + srcJar)
  sbt.IO.jar(srcFilesData, srcJar, new java.util.jar.Manifest)

  // Create jar from target files compiled from generated source files
  // Oops - those haven't been created yet, so this jar becomes empty... :-(
  // Could we use dependsOn to have the source files compiled first?
  val targetJar = new File("lib/targetFiles.jar/")
  println("### targetJar: " + targetJar)
  sbt.IO.jar(targetFilesData, targetJar, new java.util.jar.Manifest)

  val cache = FileFunction.cached(
    streams.value.cacheDirectory / "filesCache",
    inStyle = FilesInfo.hash,
    outStyle = FilesInfo.hash
  ) {
    in: Set[File] => srcFiles.toSet
  }
  cache(srcFiles.toSet).toSeq
}.taskValue

def files2TupleRec(pathPrefix: String, dir: File): Seq[Tuple2[File, String]] = {
  sbt.IO.listFiles(dir) flatMap { f =>
    if (f.isFile && f.name.endsWith(".scala")) Seq((f, s"${pathPrefix}${f.getName}"))
    else files2TupleRec(s"${pathPrefix}${f.getName}/", f)
  }
}
Maybe I still don't need to create jars? Maybe they shouldn't be created in the source generation task? I need help...
Update 2
Silly me!!! No wonder I can't make a jar with class files if I filter them with f.name.endsWith(".scala"), dohh
Since my initial question was not that clear, and Seth's answer is addressing an obvious interpretation, I'll accept his answer (after investigating more, I see that I should probably ask another question).
You want to use FileFunction.cached so that the source files aren't regenerated every time.
Here's an example from my own build:
Compile / sourceGenerators += Def.task[Seq[File]] {
  val src = (Compile / sourceManaged).value
  val base = baseDirectory.value
  val s = streams.value
  val cache =
    FileFunction.cached(s.cacheDirectory / "lexers", inStyle = FilesInfo.hash, outStyle = FilesInfo.hash) {
      in: Set[File] =>
        Set(flex(s.log.info(_), base, src, "ImportLexer"),
            flex(s.log.info(_), base, src, "TokenLexer"))
    }
  cache(Set(base / "project" / "flex" / "warning.txt",
            base / "project" / "flex" / "ImportLexer.flex",
            base / "project" / "flex" / "TokenLexer.flex")).toSeq
}.taskValue
Here the .txt and .flex files are input files to the generator. The actual work of generating the source files is farmed out to my flex method, which returns a java.io.File:
def flex(log: String => Unit, base: File, dir: File, kind: String): File =
...
You should be able to adapt this technique to your build.
FileFunction.cached is described in the API doc and in the sbt FAQ under "How can a task avoid redoing work if the input files are unchanged?" (http://www.scala-sbt.org/0.13/docs/Faq.html). (It would be nice if the material on caching was referenced from http://www.scala-sbt.org/0.13/docs/Howto-Generating-Files.html as well; currently it isn't.)

Where to acquire jar that provides scala.tools.nsc.MainGenericRunner

In my Lift project I have a file called LiftConsole.scala. It was generated by the project creation script and contains the following:
import _root_.bootstrap.liftweb.Boot
import _root_.scala.tools.nsc.MainGenericRunner

object LiftConsole {
  def main(args: Array[String]) {
    // Instantiate your project's Boot file
    val b = new Boot()
    // Boot your project
    b.boot
    // Now run the MainGenericRunner to get your repl
    MainGenericRunner.main(args)
    // After the repl exits, then exit the scala script
    exit(0)
  }
}
It seems that the purpose of this file is to let the user interact with a console from within the project. I'd like that, but I've never been able to do it because I cannot find a jar that provides MainGenericRunner. Does anyone know where to get it?
My goal is to be able to initialize the console with all project settings so I can execute project-specific code.
It is part of scala-compiler.jar. You can find it with the rest of the Scala distribution. Add this to your project:
val scalaCompiler = "org.scala-lang" % "scala-compiler" % "2.8.1"
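In a newer sbt build (build.sbt syntax), the equivalent dependency would look like this, pinned to the project's Scala version (a sketch, not taken from the original answer):

// scala-compiler.jar provides scala.tools.nsc.MainGenericRunner
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value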

Making stand-alone jar with Simple Build Tool

Is there a way to tell sbt to package all needed libraries (scala-library.jar) into the main package, so it is stand-alone? (static?)
Edit 2011:
Since then, retronym (who posted an answer on this page back in 2010) made this sbt plugin, "sbt-onejar", now at its new address on GitHub, with docs updated for SBT 0.12.
Packages your project using One-JAR™
onejar-sbt is a simple-build-tool plugin for building a single executable JAR containing all your code and dependencies as nested JARs.
Currently One-JAR version 0.9.7 is used. This is included with the plugin, and need not be separately downloaded.
Original answer:
Directly, this is not possible without extending sbt (a custom action after the model of the "package" sbt action).
GitHub mentions an assembly task, custom made for jetty deployment. You could adapt it for your need though.
The code is pretty generic (from this post, and user Rio):
project / build / AssemblyProject.scala
import sbt._

trait AssemblyProject extends BasicScalaProject {
  def assemblyExclude(base: PathFinder) = base / "META-INF" ** "*"
  def assemblyOutputPath = outputPath / assemblyJarName
  def assemblyJarName = artifactID + "-assembly-" + version + ".jar"
  def assemblyTemporaryPath = outputPath / "assembly-libs"
  def assemblyClasspath = runClasspath
  def assemblyExtraJars = mainDependencies.scalaJars

  def assemblyPaths(tempDir: Path, classpath: PathFinder, extraJars: PathFinder, exclude: PathFinder => PathFinder) = {
    val (libs, directories) = classpath.get.toList.partition(ClasspathUtilities.isArchive)
    for (jar <- extraJars.get ++ libs) FileUtilities.unzip(jar, tempDir, log).left.foreach(error)
    val base = (Path.lazyPathFinder(tempDir :: directories) ##)
    (descendents(base, "*") --- exclude(base)).get
  }

  lazy val assembly = assemblyTask(assemblyTemporaryPath, assemblyClasspath, assemblyExtraJars, assemblyExclude) dependsOn (compile)

  def assemblyTask(tempDir: Path, classpath: PathFinder, extraJars: PathFinder, exclude: PathFinder => PathFinder) =
    packageTask(Path.lazyPathFinder(assemblyPaths(tempDir, classpath, extraJars, exclude)), assemblyOutputPath, packageOptions)
}
It takes a bit of work, but you can also use Proguard from within SBT to create a standalone JAR.
I did this recently in the SBT build for Scalala.
Working off of what @retronym offered above, I built a simple example that builds a stand-alone jar which includes the Scala libraries (i.e. scala-library.jar) using Proguard with sbt. Thanks, retronym.
The simplest example using sbt-assembly
Create a directory project in your home project dir with a file assembly.sbt containing
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
In file build.sbt
import AssemblyKeys._ // put this at the top of the file

assemblySettings

jarName in assembly := "Mmyjarnameall.jar"

libraryDependencies ++= Seq("exmpleofmydependency" % "mydep" % "0.1")

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case s if s.endsWith(".class") => MergeStrategy.last
    case s if s.endsWith("pom.xml") => MergeStrategy.last
    case s if s.endsWith("pom.properties") => MergeStrategy.last
    case x => old(x)
  }
}
The simplest method is just to create the jar from the command line. If you don't know how to do this, I would highly recommend that you learn how. Automation is useful, but it's much better if you know what the automation is doing.
The easiest way to automate the creation of a runnable jar is to use a bash script, or a batch script on Windows.
The simplest way in sbt is just to add the Scala libraries you need to the resource directories:
unmanagedResourceDirectories in Compile := Seq(file("/sdat/bins/ScalaCurrent/lib/scalaClasses"))
So in my environment, ScalaCurrent is a link to the current Scala library (2.11.4 at the time of writing). The key point is that I extract the Scala library but place it inside a scalaClasses directory. Each extracted library needs to go into its own top-level directory.