How to configure build.sbt so that xsbt-web-plugin creates a war file without compression? - scala

I am using Scala 2.10.1 with sbt to package my webapp as a war file.
For the purpose of efficient rsync deltas, I'd like to have the war packaged as a .war file, but without zip compression. I just need to know how to configure my build for this.
UPDATE:
All these plugin docs assume a lot of knowledge of how the syntax works and how to combine tasks into a new task, etc. I can't even tell how to create a new task that runs package and then a shell command. None of the answers so far have said specifically, "here's what you do.."
Just to be clear, this is all I'm asking for:
I need a Task "packnozip" that does this:
1) run "package"
2) run shell commands:
$ mkdir ./Whatever
$ pushd ./Whatever
$ jar xvf ../Whatever.war
$ popd
$ mv ./Whatever.war ./Whatever.war.orig
$ jar cvM0f ./Whatever.war -C ./Whatever .
So what I'm saying is I want to type "packnozip" into the sbt console and have it do #1 then #2.
For now I'm just manually doing #2, which seems silly if it can be automated.
Also, watching a 30MB file get completely resent by rsync b/c it is not diffable seems quite silly when a 34MB uncompressed file is only 13% more data and takes a fraction of a second to send b/c of efficient diffs, not to mention "-z" will compress the transfer anyways.

If you have your war file unzipped in a directory you can:
zip -r -0 project.war project/
That should be zero compression. In case you don't see those options, this is my setup:
[node#hip1 dev]$ zip -v
Copyright (c) 1990-2008 Info-ZIP - Type 'zip "-L"' for software license.
This is Zip 3.0 (July 5th 2008), by Info-ZIP.
You could execute this as a run task, I believe, after the war is packaged.
UPDATE 1
I believe this is the best way to achieve your needs:
http://www.scala-sbt.org/release/docs/Detailed-Topics/Process
val exitcode = "zip -r -0 project.war project/"!
However, if you need to work from a specific directory (Please see Update 2 below):
I modified this to execute within the directory but place the .war above it. The second (path) argument should point at the directory, so that the zip is performed inside of it:
Process("zip" :: "-r" :: "-0" :: "../project.war" :: "." :: Nil, "/path/to/project/") !
Here's another SO question on the ProcessBuilder that may help as well:
How does the “scala.sys.process” from Scala 2.9 work?
(Note: you don't need to import scala.sys.process._)
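Putting the pieces together, here is a rough, untested sketch of the packnozip task the OP asked for, written for an sbt 0.13-style build.sbt (older 0.12 builds would need the key defined in a .scala build file). The key name, the "_exploded" directory and the use of sbt's IO helpers are my own choices, and it assumes the zip CLI is on the PATH:
val packnozip = TaskKey[File]("packnozip", "package the war, then re-pack it without compression")

packnozip <<= (packageWar in Compile) map { war =>
  // explode the freshly packaged war next to itself (like `jar xvf`)
  val exploded = new File(war.getAbsolutePath + "_exploded")
  IO.delete(exploded)
  IO.unzip(war, exploded)
  // remove the original and re-zip with zero compression from inside the exploded dir
  IO.delete(war)
  Process("zip" :: "-r" :: "-0" :: war.getAbsolutePath :: "." :: Nil, exploded).!
  war
}
Typing packnozip at the sbt prompt should then run package first and re-pack the result uncompressed.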
UPDATE 2
For readers in the future, please note that zipping the project directory itself will not work; one needs to perform the zip from inside the directory (e.g. using pushd), putting the resulting war outside of the directory, as mentioned by the OP in the comments below this answer. As Orange80 mentioned:
pushd ./project && zip -r -0 ../project.war ./ && popd
UPDATE 3
Check this out; it may do exactly what you need, with a 0 option to specify no compression:
https://github.com/sbt/sbt-onejar
It's a plugin that lets you create a single executable jar which, with options (for example "0", as in a command like "jar 0f blah.jar blah/"), can, I think, be made to create the jar file without compression, as you mentioned in the comments below.
For usage I found this on SO:
SBT one-jar plugin
And if it needs to be modified, it's a pretty reasonable example of a plugin as well; if you drop it into ~/.sbt/plugins in your home directory it becomes global and can be used in your build in the fashion noted in the SO answer above. I hope that helps at least a little bit.

There is no way to do this directly via sbt configuration, since sbt assumes that any files within zip and jar artifacts should be compressed.
One workaround is to unzip and re-zip (without compression) the war file. You can do this by adding the following setting to your project (e.g. in build.sbt):
packageWar in Compile <<= packageWar in Compile map { file =>
  println("(Re)packaging with zero compression...")
  import java.io.{File, FileInputStream, FileOutputStream, ByteArrayOutputStream}
  import java.util.zip.{CRC32, ZipEntry, ZipInputStream, ZipOutputStream}
  val zis = new ZipInputStream(new FileInputStream(file))
  val tmp = new File(file.getAbsolutePath + "_decompressed")
  val zos = new ZipOutputStream(new FileOutputStream(tmp))
  // STORED entries are written uncompressed, but size, compressed size and
  // CRC must then be set on each entry before it is written.
  zos.setMethod(ZipOutputStream.STORED)
  Iterator.continually(zis.getNextEntry).
    takeWhile(ze => ze != null).
    foreach { ze =>
      // read the (possibly compressed) entry fully into memory
      val baos = new ByteArrayOutputStream
      Iterator.continually(zis.read()).
        takeWhile(-1 !=).
        foreach(baos.write)
      val bytes = baos.toByteArray
      ze.setMethod(ZipEntry.STORED)
      ze.setSize(bytes.length)
      ze.setCompressedSize(bytes.length)
      val crc = new CRC32
      crc.update(bytes)
      ze.setCrc(crc.getValue)
      // write the entry back out uncompressed
      zos.putNextEntry(ze)
      zos.write(bytes)
      zos.closeEntry()
      zis.closeEntry()
    }
  zos.close()
  zis.close()
  // replace the original war with the uncompressed one
  tmp.renameTo(file)
  file
}
Now when you run package in sbt, the final war file will be uncompressed, which you can verify with unzip -vl path/to/package.war.

Related

Custom deployment with sbt

I have a Scala application and I want to set up a deployment process similar to the one the Akka sbt plugin gives, but it requires the project to be a microkernel. sbt-assembly is a very cool plugin, but I want free access to my config files; basically I want to have the following app structure:
/app/bin - start script bash file
/config - all my application .conf files
/deploy - my project .jar with classes
/lib - all my dependencies .jar files
/logs - log files
I checked Typesafe's sbt-native-packager; it's better, but I couldn't get it to place my .conf files and logs outside of the jars. All those settings in sbt look confusing to me; what can I do about this?
Actually it's not hard to update the akka-sbt-plugin to make it work in general: add this file to your project folder and add the following settings somewhere in your build:
.settings(Distribution.distSettings: _*)
.settings(mappings in Compile in packageBin ~= { _.filter(!_._1.getName.endsWith(".conf")) })
The first line adds all the dist settings to your project, and the second one excludes all .conf files from your .jar so they are read from the config folder instead.
Now you have two commands: dist, which creates a folder with the structure you've described, and zipDist, which packs it all into a .zip file. Later you can add this to your publish settings:
addArtifact(Artifact(someName, "zip", "zip"), zipDist)
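With those settings in place, a typical session would look something like the following (a hedged sketch; the dist and zipDist task names come from the Distribution file mentioned above, and the artifact name is whatever you passed to addArtifact):
sbt dist
sbt zipDist
sbt publish
publish will then upload the .zip artifact alongside the usual jar artifacts.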

Watch for project files also

I use sbt in the following fashion: I run ~ test:compile in sbt and then work in the IDE, occasionally checking whether the project still compiles, because the IDE's presentation compiler tends to be buggy. When I git pull some code, there might be changes in the project/ files, so I want a reload to happen. Is there a way to watch both source files and project files, so that when there is a change in the project files, I actually get the update?
As jsuereth explained, this isn't something sbt can handle in one instance. What's required is a reboot of sbt to abort the watching process and reload its own configuration.
The following Scala script does exactly this, it uses Java NIO WatchService and Scala Process to monitor a path and restart a process. The code should be fairly simple to understand:
#!/usr/bin/env scala
import java.nio.file._
import scala.collection.JavaConversions._
import scala.sys.process._

val file = Paths.get(args(0))
val cmd = args(1)
val watcher = FileSystems.getDefault.newWatchService

file.register(
  watcher,
  StandardWatchEventKinds.ENTRY_CREATE,
  StandardWatchEventKinds.ENTRY_MODIFY,
  StandardWatchEventKinds.ENTRY_DELETE
)

// start the watched command, connecting it to this process' stdin/stdout
def exec = cmd run true

@scala.annotation.tailrec
def watch(proc: Process): Unit = {
  val key = watcher.take
  val events = key.pollEvents
  val newProc =
    if (!events.isEmpty) {
      // something under the watched path changed: restart the process
      proc.destroy()
      exec
    } else proc
  if (key.reset) watch(newProc)
  else println("aborted")
}

watch(exec)
Usage in your sbt dir would be:
watchr.scala project/ "sbt ~ test:compile"
If anything is unclear, don't hesitate to ask; of course, any scripting language could be used to implement this behavior.
You actually can't use ~ <task> and have it rebuild the project itself right now, because ~ <task> needs to read the build definition itself to determine:
What source files to watch
How to run the task.
What you're doing is altering the config when project/ changes. This requires a full reload or reboot of sbt to pull in the new configuration.
So, as of sbt 0.13, this isn't possible. You can have it rebuild your source code when project/ changes, but without rebuilding the build definition that's not much help.
You could create a new sbt prompt, or task, that when run checks whether source files in project/ have been updated and issues a warning/error so you know to reboot. That's probably the best option right now.
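As a hedged sketch of that idea (sbt 0.13 syntax; the key name, the glob patterns and the "modified since sbt started" heuristic are my own assumptions), something like this in build.sbt would print a warning when the build definition looks stale:
val checkBuildDef = taskKey[Unit]("warn if files under project/ changed after sbt started")

val sbtStartTime = System.currentTimeMillis

checkBuildDef := {
  // compare project/ files against the time this sbt session started
  val changed = ((baseDirectory.value / "project") ** ("*.sbt" || "*.scala")).get
    .filter(_.lastModified > sbtStartTime)
  if (changed.nonEmpty)
    streams.value.log.warn("Build definition changed, run `reload`: " + changed.mkString(", "))
}
You could then run checkBuildDef manually, or chain it onto another task, whenever you suspect project/ has changed.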

PlayFramework2 Scala File Map

I'm just starting with Scala and have run into a problem that has me stumped, but I'm guessing that I'm missing something easy.
I was following instructions to use the Clapper ClassFinder:
http://thoughts.inphina.com/2011/09/15/building-a-plugin-based-architecture-in-scala/
val classpath = List("./plugins").map(new File(_))
val finder = ClassFinder(classpath)
val classes = finder.getClasses
val classMap = ClassFinder.classInfoMap(classes)
After executing the first line, I see that classpath is set simply to
List(.\plugins)
I'm running this on Windows, so the swapping of the slash seems to be OK.
But I expected to see a list of File objects, although I am not sure about this Scala syntax, and perhaps I'm missing something in the Scala IDE. The value for classes shows an "empty iterator".
It seems not to be finding any files in the path that I specified. I tried using an absolute path, but I had the same results. I have a single jar file in the plugins directory that I'm hoping it will find. The plugins directory is at the root of the Play2 project I'm using.
Edit ---
I did find that when I explicitly list the path to one jar, it is able to find it:
val classpath = List("./plugins/myPlugin.jar").map(new File(_))
But I want to find all jar files in the directory.
The following didn't work:
val classpath = List("./plugins/*").map(new File(_))
Nor did this:
val classpath = List("./plugins/*.jar").map(new File(_))
Judging by this issue on the ClassFinder repo on GitHub, it may be a bug.
I think you need to create an explicit list of jar files or to list the ones contained in your folder like:
val classpath =(new File("./plugins")).listFiles.filter(_.getName.endsWith(".jar"))
EDIT: from a cursory glance at ClassFinder's source on GitHub, I think it's not a bug. ClassFinder searches for .class files either in jars, in zip files, or directly in folders, but it looks like it does not mix these things recursively (i.e. if you give it a folder it will look for classes directly in the folder, but it won't look for classes in jars inside the folder).
If your objective is to list all jar files, you can use the following code:
val classpath = List("./plugins").map { path =>
  Option(new File(path).listFiles)
    .getOrElse(Array.empty[java.io.File])
    .filter(file => file.isFile && file.getName.endsWith(".jar"))
}.flatten
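Putting both answers together, a small untested sketch that lists the jars under ./plugins and hands them to ClassFinder (assuming the org.clapper.classutil.ClassFinder API used in the question) would be:
import java.io.File
import org.clapper.classutil.ClassFinder

// list the plugin jars (empty if ./plugins doesn't exist), then scan them
val pluginJars = Option(new File("./plugins").listFiles).toSeq.flatten
  .filter(f => f.isFile && f.getName.endsWith(".jar"))
val finder = ClassFinder(pluginJars)
val classMap = ClassFinder.classInfoMap(finder.getClasses)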

scala: How to read a file in a jar

I have directory structure like this
src
main
resources
text.txt
scala
hello
world.scala
test
same as main folder
pom.xml
When in the IDE (IntelliJ 10), I could access it with a relative path ("src/main/resources/text.txt"), but it seems I cannot do that when it is compiled into a jar. How do I read that file?
Also, I found that text.txt is copied into the root of the jar. Is this normal behavior? I fear this will clash with other resource files in src/test/resources.
thanks
From http://www.java-forums.org/advanced-java/5356-text-image-files-within-jar-files.html -
Once the file is inside the jar, you cannot access it with standard FileReader streams since it is treated as a resource. You will need to use Class.getResourceAsStream().
The text.txt being copied into the root is not normal behavior and is probably a setting in your IDE.
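For example, a minimal sketch of the Class.getResourceAsStream approach mentioned above (assuming text.txt ends up at the root of the jar, as described in the question):
// the leading "/" makes the path relative to the jar/classpath root
val stream = getClass.getResourceAsStream("/text.txt")
val text = scala.io.Source.fromInputStream(stream).mkString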
8 years later, I am also facing the same question. To ease the life of future developers, here is the answer:
Being copied into the root is normal behaviour: the resources folder is like a src folder, so its content is copied, not the folder itself.
Now concerning the how-to question:
import scala.io.Source
val name = "text.txt"
val source: Source = Source.fromInputStream(getClass.getClassLoader.getResourceAsStream(name))
// Add the newline character back as a separator, since getLines removes it
val resourceAsString: String = source.getLines.mkString("\n")
// Don't forget to close
source.close

R CMD check complains about unexpected files in man

This sounds like a silly problem: I'm putting my R code into a package, and R CMD check src complains about the .Rd~ backup files produced by Emacs.
* checking package subdirectories ... WARNING
Subdirectory 'man' contains invalid file names:
read.PI.Rd~ write.PI.Rd~
The documentation says: »In addition [...] files [...] with base names [...] ending in ‘~’, ‘.bak’ or ‘.swp’, are excluded by default.« (page 18). But then why the warning?
Just add a file cleanup in your top-level directory which removes them. Also, you could build a tarball or zip archive first via R CMD build and then check this archive via R CMD check -- that should skip these files as well.
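For example, a minimal cleanup (an executable shell script at the top level of the package; the exact patterns are just a guess at what your editor leaves behind) could be:
#!/bin/sh
rm -f man/*.Rd~ R/*.R~
And the build-then-check route mentioned above would be along the lines of:
R CMD build pkg
R CMD check pkg_*.tar.gz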
Also, exactly how are you calling R CMD check, and what is your directory layout? With R 2.10.0 on Linux, I just ran touch pkg/man/foo.Rd~ for one of my packages, and R CMD check pkg (where pkg is the top-level directory, as is common for source projects stored on R-Forge)
did not issue the warning you are seeing. The file was not removed by cleanup, as that currently purges only in src.