sbt multi project: create resource in another sub-project - scala

I have an sbt project with two sub-projects, A and B. A produces a standalone Scala-based executable exe. When exe is run, it will produce a file out.xml. I want this file to be part of the resources for project B. I do not want B to include any references to A's code; all I want is the out.xml file to be part of it. I suspect that http://www.scala-sbt.org/0.13.5/docs/Howto/generatefiles.html should be a good starting point, but I can't get my head around how to split it between two projects. Any takers?

Since A is a dependency of the build process itself (the build needs to run the executable to generate your XML file), you would list it as a libraryDependency in project/[something].sbt or project/project/[something].scala. This makes it available to code you put in build.sbt or project/[something].scala, but does not make it a transitive dependency of the resulting artifact of project B.
(Or you could of course make project A a sbt-plugin itself, or create yet another project which is a plugin depending on A that runs the executable.)
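A sketch of how this could be wired up, assuming A is published so the build itself can depend on it (the organization, module name, and Generator entry point below are made-up names):

```scala
// project/generate.sbt -- makes A available to the BUILD, not to B's artifact:
//   libraryDependencies += "com.example" %% "a" % "0.1.0"

// build.sbt -- B generates out.xml into its managed resources:
lazy val b = (project in file("b")).settings(
  resourceGenerators in Compile += Def.task {
    val out = (resourceManaged in Compile).value / "out.xml"
    // A's code runs on the build classpath only, so B's published
    // artifact carries no reference to A.
    com.example.Generator.writeXml(out) // hypothetical entry point in A
    Seq(out)
  }.taskValue
)
```

Generated files land under B's resourceManaged directory and get packaged with B's other resources on sbt package.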

Related

Fine-grained builds with dynamic dependencies?

I am interested in understanding whether Bazel can handle "two-stage builds", where dependencies are discovered based on file contents and must be compiled before the code that depends on them (unlike C/C++, where dependencies are mostly header files that are not separately compiled). Concretely, I am building the Coq language, which is similar to OCaml.
My intuition for creating a build plan would use an (existing) tool (called coqdep) that reads a .v file and returns a list of all of its direct dependencies. Here's the algorithm that I have in mind:
invoke coqdep on the target file and (transitively) on each of its dependent files,
once transitive dependencies for a target are computed, add a rule to build the .vo from the .v that includes transitive dependencies.
Ideally, the calls to coqdep (in step 1) would be cached between builds and so only need to be re-computed when the file changes. And the transitive closure of the dependency information would also be cached.
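The transitive-closure step above can be sketched as follows (assuming coqdep's per-file output has already been parsed into a map of direct dependencies; the file names are made up):

```scala
import scala.collection.mutable

// Direct dependencies as coqdep would report them (file -> files it imports).
// In a real build these would come from parsing coqdep's output.
val direct: Map[String, List[String]] = Map(
  "a.v" -> List("b.v", "c.v"),
  "b.v" -> List("c.v"),
  "c.v" -> Nil
)

// Memoized transitive closure: each file's full dependency set is computed
// once and cached, mirroring how coqdep results would be cached between builds.
val cache = mutable.Map.empty[String, Set[String]]

def transitive(file: String): Set[String] =
  cache.get(file) match {
    case Some(deps) => deps
    case None =>
      val ds = direct.getOrElse(file, Nil)
      val all = ds.toSet ++ ds.flatMap(transitive)
      cache(file) = all
      all
  }
```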
Is it possible to implement this in Bazel? Are there any pointers to setting up builds for languages like this? Naively, it seems to be a two-stage build and I'm not sure how this fits into Bazel's compilation model. When I looked at the rules for OCaml, it seemed to rely on ocamlbuild to satisfy the build order and dependency requirements rather than doing it "natively" in Bazel.
Thanks for any pointers or insights.
(don't have enough rep to comment yet, so this is an answer)
#2 of Toraxis' answer is probably the most canonical.
gazelle is an example of this for Golang, which is in the same boat: dependencies for Golang files are determined outside a Bazel context by reading the import statements of source files. gazelle is a tool that writes/rewrites Golang rules in BUILD files according to the imports in source files of the Bazel workspace. Similar tools could be created for other languages that follow this pattern.
but the generated BUILD file will be in the output folder, not in the source folder. So you also have to provide an executable that copies the files back into the source folder.
Note that binaries run via bazel run have the environment variable BUILD_WORKSPACE_DIRECTORY set to the root of the Bazel workspace (see the docs) so if your tool uses this environment variable, it could edit the BUILD files in-place rather than generating and copying back.
(In fact, the generating-and-copying-back strategy would likely not be feasible, because purely-generated files would contain only Coq rules, and not any other types of rules. To generate a BUILD file with Coq rules from one with other types of rules, one would have to add the BUILD files themselves as dependencies - which would create quite the mess!)
I'm looking into similar questions because I want to build ReasonML with Bazel.
Bazel computes the dependencies between Bazel targets based on the BUILD files in your repository without accessing your source files. The only interaction you can do with the file system during this analysis phase is to list directory contents by using glob in your rule invocations.
Currently, I see four options for getting fine-grained incremental builds with Bazel:
Spell out the fine-grained dependencies in hand-written BUILD files.
Use a tool for generating the BUILD files. You cannot directly wrap that tool in a Bazel rule to have it run during bazel build because the generated BUILD file would be in the output folder, not in the source folder. But you can run rules that call coqdep during the build, and provide an executable that edits the BUILD file in the source folder based on the (cacheable) result of the coqdep calls. Since you can read both the source and the output folder during the build, you could even print a message to the user if they have to run the executable again. Anyway, the full build process would be bazel run //tools/update-coq-build-files && bazel build to reach a fixed point.
Have coarse-grained dependencies in the BUILD files but use persistent workers to incrementally rebuild individual targets.
Have coarse-grained dependencies in the BUILD files but generate a separate action for each target file and use the unused_inputs_list argument of ctx.actions.run to communicate to Bazel which dependencies were actually unused.
I'm not really sure whether 3 and 4 would actually work or how much effort would be involved, though.
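The BUILD-file generation step of option 2 might render rules from the computed dependencies roughly like this (the coq_library rule and its attributes are assumptions, not real Bazel rules — a real setup would target whatever Coq rules the workspace defines):

```scala
// Render a hypothetical coq_library rule for a .v file from its precomputed
// transitive dependencies. Rule and attribute names are illustrative only.
def coqRule(file: String, deps: Set[String]): String = {
  val name = file.stripSuffix(".v")
  val depLabels = deps.toList.sorted
    .map(d => "\":coq_" + d.stripSuffix(".v") + "\"")
    .mkString(", ")
  s"""coq_library(
     |    name = "coq_$name",
     |    srcs = ["$file"],
     |    deps = [$depLabels],
     |)""".stripMargin
}
```

A tool built this way could read BUILD_WORKSPACE_DIRECTORY when run via bazel run and rewrite the BUILD files in place.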

Organizing files in a SBT-based scala project

Newcomer to the IntelliJ IDE here, with no Java background. I've looked at Build Definition to get a brief idea of how I should organize my Scala files, but their example doesn't cover the full structure of an SBT-based project shown attached.
Can you advise what each folder should be used for (e.g. where my source files should go, etc.) and also point me to sources where I can go read up more.
Thanks so much
It is described pretty well here:
http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Directories.html
But to sum up.
.idea:
This contains the project files for your IDEA project and has nothing directly to do with sbt itself. However, IDEA (if auto-refresh is enabled) updates its own project each time the sbt build files change.
project:
This contains the sbt project files, except for the main build files (files ending in .sbt).

The sbt build is itself based on Scala, and if you need some Scala code included in your build (e.g., code generation/meta-programming, pre-compiler macros), you can place Scala source files in this directory. The code in these files can be used by your build system, but is not part of your project itself.

To really understand how a build is made, you need to understand how sbt files and the build's Scala files are placed. When you run sbt, it searches for .sbt files in the directory you are standing in; when these are found, it searches for Scala files in the project directory. Together, these files are the source of the build system, and because they are source files, they need to be built before they can be used. To build this build system, sbt uses sbt, so a build system to build the build system is needed. It therefore looks for .sbt files inside the project directory and Scala files inside project/project, and builds these to get a build system that can build the build system (which can build your project). This can continue recursively down any project/project/project... chain, until sbt finds a project folder containing no Scala files, which needs no building before use.
The target folder inside project is the target folder for the sbt build of your build definition. See below for what a target folder is.
Normally you would not need to be concerned with this; just remember that build.sbt in your root directory is the build script for your project, project/plugins.sbt defines plugins activated for your build system, and project/build.properties contains special sbt properties. Currently the only sbt property I know of is which version of sbt should be used.
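As a concrete (illustrative) example, those three files might contain something like:

```scala
// build.sbt (project root) -- the main build script; values are illustrative:
name := "my-project"
scalaVersion := "2.11.8"

// project/plugins.sbt -- plugins for the build system, e.g.:
//   addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")

// project/build.properties -- special sbt properties:
//   sbt.version=0.13.5
```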
src:
This is where you place the source files of your project. Java sources go in src/main/java, Scala sources in src/main/scala, and resources in src/main/resources.
The src/main/scala_2.11 folder is typically used if you have code that is not binary compatible across different versions of Scala. In such cases you can configure sbt to use different source files when building for different Scala versions. You probably do not need this, so I would advise just deleting the src/main/scala_2.11 folder.
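If you ever do need such version-specific sources, sbt can be told to add the extra directory only for the matching Scala binary version; a minimal sketch for build.sbt:

```scala
// Adds src/main/scala_2.11 (or scala_2.10, ...) to the compile sources,
// depending on the Scala version being built.
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / ("scala_" + scalaBinaryVersion.value)
```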
Test sources are placed inside src/test/java and src/test/scala, and test resources are placed in src/test/resources.
target
This folder is the target folder for sbt. All compiled files, generated packages and so on are placed somewhere inside this dir.
Most things in this dir are not so interesting, as most of it is just internal sbt bookkeeping. However, if you build a jar file by calling sbt package, it will be placed inside target/scala-x (where x is the Scala version). There are also a lot of plugins that can package your application in different ways, and they will normally also place the package files somewhere inside the target dir.

Tell SBT to not use staging area

I want to be able to compile my project once and pass it through multiple build steps on a CI server. But SBT puts files in a staging area like the one below.
/home/vagrant/.sbt/0.13/staging/
This means the project is not stand-alone and for every CI step it is going to compile it again.
How can I tell SBT to keep things simple and stand-alone and to make sure everything it needs is inside the project directory?
FYI, the staging area is used for the target files when the source folder is not read/write. Making the source folder read/write should fix this.
If you pass -Dsbt.global.staging=./.staging to sbt when starting it up, the staging directory will be .staging in the project's directory.
I figured that out by looking at the sbt source and patching that together with how Paul P's sbt runner passes the value for the sbt boot path.
If that doesn't accomplish what you want, then you might be able to make something go with a custom resolver. The sbt Build Loaders page talks about creating a custom resolver that lets you specify more detail about where dependencies are written. If my solution doesn't get you what you want, you'd probably need to do something like that.

Managing custom client builds with SBT

We have an application that is extensible via modules. The (multi-project) SBT build produces a distribution artifact that is easy to deploy.
Some custom deployments for clients do require specific modules to be part of the build (in other words, an additional set of dependencies). I'm wondering what would be the best approach to create such custom builds - in other words, is there perhaps a way to extend the main Build and only add those dependencies?
Right now I am thinking of the following approach:
have the main application packaged (as ZIP) & released
in the custom build, fetch the ZIP file, extract it, magically add the additional JAR dependencies, and zip the artifact again ("magically" because I don't know how to get access to all JAR dependencies specified in the build)
But is there perhaps a more elegant way?
I think it would be easier to just declare a new sub-project along the main one that depends on it.
lazy val extra: Project = Project("extra", file("extra")).dependsOn(mainProject).settings(/* extra settings */)
Then on that package you can declare the extra dependencies. When you package this extra project everything should end up into the package automatically.
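Spelled out a bit more, the extra project could pull in the client-specific modules as ordinary dependencies (the module coordinates below are made up):

```scala
// Client-specific build: depends on the main project and adds extra modules.
lazy val extra = (project in file("extra"))
  .dependsOn(mainProject)
  .settings(
    libraryDependencies ++= Seq(
      "com.example" %% "module-reporting" % "1.2.0", // hypothetical modules
      "com.example" %% "module-billing"   % "1.2.0"
    )
  )
```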

How to get Eclipse to create bin/main and bin/test

I want my Ant build to take all Java sources from src/main/*, compile them, and place them inside bin/main/. I also want it to compile src/test/* sources to bin/test/. I want this behavior because I want to package the binaries into different JARs, and if they all just go to a single bin/ directory it will be impossible (well, extremely difficult!) to know which class files belong where.
When I go to configure my build path and then click the Source tab, I see an area toward the bottom that reads Default output folder: and allows you to browse for its location.
I'm wondering how to create bin/main and bin/test in an existing project without "breaking" Eclipse (it happens). I'm also wondering whether, if I just have my Ant build create and delete those directories during the clean-and-build process, Eclipse might not care what the default output is set to. But I can't find any documentation either way.
Thanks in advance for any help here.
In Eclipse, you can only have one output folder per project for your compiled Java files. So you cannot get Eclipse to do the split you want (i.e. compile src/main to bin/main and src/test to bin/test).
You can, if you want, create two Eclipse projects, one main project and one test project, where the test project depends on (and tests) the main project. However, in that case, each project should be in its own directory structure, which is not what you are asking for. But this is a common approach.
Another way, which I would recommend, would be to not mix Ant compilation and Eclipse's compilation. Make the Ant script the way you describe (i.e., compile the main and test directories separately and create two separate jar files). Change the Eclipse compile directory to something different, for instance bin/eclipse. Use the Ant script when making official builds or building for release, and use Eclipse's building only for development/debugging. This way, your two build systems will not get in each other's way and confuse each other.
Hope this answers your question and I understood it correctly. Good luck!