Managing custom client builds with SBT

We have an application that is extensible via modules. The (multi-project) SBT build produces a distribution artifact that is easy to deploy.
Some custom deployments for clients require specific modules to be part of the build (in other words, an additional set of dependencies). I'm wondering what the best approach to creating such custom builds would be: is there perhaps a way to extend the main Build and only add those dependencies?
Right now I am thinking of the following approach:
have the main application packaged (as ZIP) & released
in the custom build, fetch the ZIP file, extract it, magically add the additional JAR dependencies, and zip the artifact again ("magically" because I don't know how to get access to all JAR dependencies specified in the build)
But is there perhaps a more elegant way?

I think it would be easier to just declare a new sub-project alongside the main one that depends on it.
lazy val extra: Project = Project("extra", file("extra")).dependsOn(mainProject).settings(Seq(...): _*)
Then in that project you can declare the extra dependencies. When you package this extra project, everything should end up in the package automatically.
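For illustration, a minimal sketch of what such a build could look like (the project names, organization, and module coordinates below are assumptions, not taken from the question):

lazy val mainProject = (project in file("."))
  .settings(name := "my-app")

// Client-specific build: the main application plus the extra modules.
lazy val clientX = (project in file("client-x"))
  .dependsOn(mainProject)
  .settings(
    name := "my-app-client-x",
    libraryDependencies ++= Seq(
      "com.example" %% "extra-module-a" % "1.2.3",  // assumed extra modules
      "com.example" %% "extra-module-b" % "1.2.3"
    )
  )

Packaging clientX (for example with a packaging plugin such as sbt-native-packager or sbt-assembly) should then produce an artifact containing the main application together with the client-specific modules.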

Related

sbt multi project: create resource in another sub-project

I have an sbt project with two sub-projects, A and B. A produces a standalone Scala-based executable, exe. When exe is run, it produces a file, out.xml. I want this file to be part of the resources for project B. I do not want B to include any references to A's code; all I want is the out.xml file to be part of it. I suspect that http://www.scala-sbt.org/0.13.5/docs/Howto/generatefiles.html should be a good starting point, but I can't get my head around how to split it between two projects. Any takers?
Since A is a dependency of the build process (the build needs to run the executable to generate your xml file), you would list it as a libraryDependency in project/[something].sbt or project/project/[something].scala. This makes it available to code you put in build.sbt or project/[something].scala, but does not make it a transitive dependency of the resulting artifact of project B.
(Or you could of course make project A a sbt-plugin itself, or create yet another project which is a plugin depending on A that runs the executable.)
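A rough sketch of how that could be wired up, using sbt 0.13 syntax (the coordinates and the Generator entry point below are assumptions, not A's real API):

// project/build.sbt - makes project A available to the build definition itself,
// without adding it to project B's runtime dependencies
libraryDependencies += "com.example" %% "project-a" % "1.0.0"

// build.sbt - project B generates out.xml into its managed resources
lazy val b = (project in file("b"))
  .settings(
    resourceGenerators in Compile += Def.task {
      val out = (resourceManaged in Compile).value / "out.xml"
      com.example.a.Generator.writeTo(out)  // hypothetical entry point in A
      Seq(out)
    }.taskValue
  )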

Is there any way to use SBT's resolver only for subset of artifacts?

Is there any way to define a custom resolver that would be used only for a subset of artifacts, more specifically to fetch only artifacts with a predefined groupId?
For example, a project defines a custom FooResolver that should be used only for artifacts with the groupId org.foo, while all other artifacts should be resolved using the default resolver.
To add unmanaged dependencies to an SBT project, the simplest solution is to just put the jars in the lib folder in your project. All libraries in the lib folder will be in the classpath by default.
If you want to use another folder instead of lib, you can redefine it:
unmanagedBase := baseDirectory.value / "custom_lib"  // or any other java.io.File
If you want to do something more complex: SBT retrieves unmanaged libraries with the unmanagedJars task, so you can always redefine that task (but that would probably be a sign that you're trying to do something too complicated to reasonably use unmanaged dependencies...).
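For reference, such a redefinition could look roughly like this in sbt 0.13 (the custom_lib folder name is just an assumption):

unmanagedJars in Compile := {
  val base = baseDirectory.value
  // pick up every jar under custom_lib instead of the default lib folder
  ((base / "custom_lib") ** "*.jar").classpath
}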

Tell SBT to not use staging area

I want to be able to compile my project once and pass it through multiple build steps on a CI server. But SBT puts files in a staging area like the one below.
/home/vagrant/.sbt/0.13/staging/
This means the project is not stand-alone, so every CI step ends up compiling it again.
How can I tell SBT to keep things simple and stand-alone and to make sure everything it needs is inside the project directory?
FYI, the staging area is used for the target files when the source folder is not read/write. Making the source folder read/write should fix this.
If you pass -Dsbt.global.staging=./.staging to sbt when starting it up, the staging directory will be .staging in the project's directory.
I figured that out by looking at the sbt source and patching that together with how Paul P's sbt runner passes the value for the sbt boot path.
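For example, on a CI agent the property can be passed on the command line, or exported once so every sbt invocation in the job picks it up (paths here are only illustrative):

sbt -Dsbt.global.staging=./.staging clean package
export SBT_OPTS="-Dsbt.global.staging=./.staging"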
If that doesn't accomplish what you want, you might be able to make something work with a custom resolver. The sbt Build Loaders page talks about creating a custom resolver that lets you specify in more detail where dependencies are written; you would probably need to do something along those lines.

Creating Java library file with IntelliJ IDEA

I'm trying to create a library which could be used in other projects. I've written one class with several static methods to do some stuff. I wanted to try it out but I am not able to use the imported JAR file.
I compiled my code as an artifact, took the JAR file from the "out" folder, and copied it to another project. After that I went to "Project Structure", the "Libraries" tab, and pressed the plus button. I found the JAR file and selected it; IDEA then asked me to specify dependencies, so I did. But when I want to use the library in code I am not able to do so. It can't even be imported.
Any ideas why it ignores my library? Thanks!
What should I do in order to create a JAR library with IntelliJ IDEA, that is usable in other projects?
You are running into a very common dependency management problem.
IMO the real answer is to use a build system like Maven, Ant, or Gradle (I'd go Gradle myself). What you are trying to do is manual, hard to reproduce, and brittle.
Every time you make a change you will have to go through manual steps to create a new JAR. And you really don't understand your dependencies.
To go all out with best practices, you would have a real build system that publishes to a continuous integration server, which compiles the code and runs the tests. On successful completion of the tests, the JARs are published to an artifact server (Nexus/Artifactory).
The people you are sharing with would consume the JARs via the build system by declaring dependencies on your JAR.
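As an illustration, once the JAR is published to such a server a consumer only declares its coordinates; in sbt (the tool used elsewhere on this page) that might look roughly like this, with a made-up resolver URL and made-up coordinates:

resolvers += "company-releases" at "https://nexus.example.com/repository/releases"
libraryDependencies += "com.example" % "my-lib" % "1.0.0"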
I figured out what my problem was. When I created the library I was trying to make it simple. Too simple, unfortunately. I had a package with a class in it that was compiled into a JAR. Structure shown below:
foo
|
|_ MyLib.java
However in order to use classes from a created JAR library they have to be placed in packages. That means if I have:
foo
|
|_bar
| |
| |_MyInnerLib.java
|
|_MyOuterLib.java
I am able to import and use methods from MyInnerLib, but MyOuterLib is neither reachable nor importable. This was the mistake I was making.

How can I execute several maven plugins within a single phase and set their respective execution order?

I would like to break up certain phases in the Maven lifecycle into sub-phases. I would like to control the execution flow from one sub-phase to another, sort of like with Ant dependencies.
For example, I would like to use the NSIS plugin in order to package up my project into an installer at the package stage, AFTER my project has been packaged into a WAR file. I would like to do all of that in the package phase.
Is that possible?
Plugins bound to the same phase should be executed in the same order as they are listed in the POM. Under certain circumstances (e.g. if you bind the same plugin to a phase twice, like the antrun plugin), this may not occur but this is a bug (see MNG-2258 and the related issue MNG-3719).
I had the same problem; look at How to perform ordered tasks in Maven2 build.
For some reason the different goals bound to a phase are stored in a hash map or another unordered structure, which makes the execution order effectively random.
My solution was to spread the tasks across different phases, but I don't think that makes much sense in your case (NSIS packaging is not a pre-integration-test step).
You could do one of the following:
1) Try your luck and see if Maven chooses the right order for you (you have probably tried that already).
2) Use a standalone plugin - run the goal outside the lifecycle, something like:
mvn package org.codehaus.mojo:nsis-maven-plugin:1.0:compile
3) Separate them into modules: have a parent POM containing two sub-modules, one for your WAR project and the other for the NSIS project.
4) Use a custom lifecycle by changing the packaging type; in your case you could use "exe". This is done with a custom plugin extension (see the guide to using extensions).
5) Use the jetspeed-mvn-maven-plugin. I have never used it, but it seems relevant to your needs.
Hope this gives you new ideas.
Ronen