sbt, couple questions: dependsOn - scala

I have some questions about SBT:
1) I'm wondering why there is an option to use the 'dependsOn' clause. I understand that it joins projects:
lazy val projectA = Project("A", file("a"))
lazy val projectB = Project("B", file("b")).dependsOn(projectA)
What I don't like about this code: you can't specify the version of projectA that projectB depends on; it always builds against the latest state of projectA. Why split your application into multiple projects if every subproject is tightly coupled to the others?
There is another option: we can publish the subproject to a binary repository with a version and add it as a dependency in the settings.
Why not use this code instead:
lazy val projectA = Project("A", file("a"))
lazy val projectB = Project("B", file("b")).settings(libraryDependencies ++= Seq("groupOfA" %% "A" % "versionOfA"))
Of course, you need a binary repository for this, but that's not a problem: you can install Nexus locally (it supports almost everything and is free to use), or use oss.sonatype.org.
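For illustration, a minimal sketch of what the publishing side might look like with a locally installed Nexus (the repository URL and credentials path are assumptions, not part of the question):
// in project A's build: point publishTo at your local Nexus
publishTo := Some("Local Nexus Releases" at "http://localhost:8081/nexus/content/repositories/releases")
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")  // assumed credentials file
// after 'sbt publish', project B can depend on "groupOfA" %% "A" % "versionOfA" as above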
2) This question is related to the first one: I don't understand why there is a 'publishLocal' task. As far as I know, SBT uses the Ivy2 repository, but when you publish your project to Nexus or oss.sonatype.org you publish it to a Maven2 repo. The problem occurs when sbt finds the same artifact both locally published and cached from Maven: it throws errors. I think this is an sbt bug (https://github.com/sbt/sbt/issues/2687). I don't use publishLocal anymore; I don't understand why you wouldn't install a binary repository on your machine if you want to split your application into multiple components.

As you noted, libraryDependencies is strictly more powerful than dependsOn for multi-project management, at the cost of increased complexity.
You don't even need to install a separate binary repository; your local repository is quite good enough to publish to with publishLocal.
Which brings us to your next question: why use publishLocal when it publishes to Ivy by default? Two things. First, set up the local publish to publish in Maven style (publishMavenStyle := true): http://www.scala-sbt.org/0.13/docs/Publishing.html#Modifying+the+generated+POM
Second, regarding the problem when you have the same version both locally published and cached from Maven Central: short answer, don't do that. If you publish to Maven Central, you should use local publishes for testing only, and should publishLocal only 'SNAPSHOT' versions. You should publish only fixed version numbers to Maven Central; then there is no conflict. That's how Maven was designed: version numbers should be immutable, and 'SNAPSHOT's are for testing only.
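A minimal sketch of how those two pieces of advice might look together in build.sbt (the version number is only an example):
publishMavenStyle := true    // generate a Maven-style POM instead of an ivy.xml descriptor, as the linked docs describe
version := "0.1.0-SNAPSHOT"  // keep local/test publishes on SNAPSHOT versions
// switch to a fixed, immutable version (e.g. "0.1.0") only for the release you push to Maven Central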

Related

Per project sbt repositories/config

I work with a variety of SBT projects, and not all of them share the same repositories/resolvers. I would like to be able to store the repository configuration inside each project so that I can also build other people's projects without overriding the default (or their) repositories and sbt configuration.
Using the ~/.sbt/repositories is not an option since that is per user and not per project. I have tried passing parameters to sbt and that works; e.g. sbt compile -Dsbt.boot.properties=build/sbt.boot.properties. However, that requires people to remember this flag and type it/alias it every time.
Is there any way to get sbt to read configuration, or flags like the above, from its current directory? Thanks!
Have you tried http://www.scala-sbt.org/0.13/docs/Resolvers.html?
Resolvers for Maven2 repositories are added as follows:
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
This is the most common kind of user-defined resolver. The rest of this page describes how to define other types of repositories.
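Because resolvers declared in build.sbt are part of the project itself, they are checked in and travel with the project, which is exactly the per-project behaviour you are asking about. A minimal sketch (the second repository name and URL are made up):
// build.sbt, committed with the project
resolvers ++= Seq(
  "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
  "Internal Nexus" at "https://nexus.example.com/repository/maven-public"  // hypothetical internal repository
)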

Is it possible to publish the build project itself of a project using sbt?

I have a downstream project that would like to reference values defined inside the build files of an upstream project. If there were an easy way to publish a jar with the source files of the build project itself, then I could publish the build project of the upstream repo, and the build project of the downstream repo could depend on it. Is it possible to do this? Is it reasonable? Are there other, potentially better solutions?
To be clear, because I can see how the above might be confusing (and pardon me if I am using incorrect terminology): I am referring to the recursive nature of sbt builds and the fact that the build definition for a project is a project in itself; that is what I would like to publish, not the source files of the project itself.
I'm familiar with writing plugins in sbt and with the sbt buildinfo plugin. I'm hoping there's another way.

SBT Resolver for plugin in local directory on Heroku

I have an internal SBT plugin which sets up a lot of common aspects of my build. One of those is my setup to add my Artifactory credentials and resolvers. I typically publish the plugin locally so that my build can resolve it and then pull the remaining dependencies from my Artifactory repositories.
For deploying to Heroku, I planned to copy the published artifacts from my .ivy2 repo to a subfolder of the project. However, although I can get this to work locally using both Resolver.file and Resolver.url, I cannot get it to work once I push to Heroku. I even tried it as an unmanaged dependency, but it is still unresolved on Heroku.
Does anyone know the magic spell for achieving this on Heroku?
I have attempted the following in project/plugins.sbt:
Resolver.url("local-plugins", url(s"file:///${baseDirectory.value}/plugins"))(Resolver.ivyStylePatterns)
Resolver.file("local-plugins", file("plugins"))(Resolver.ivyStylePatterns)
unmanagedBase := baseDirectory.value / "lib"
I recommend two different approaches:
Use the sbt-heroku plugin instead of deploying with Git; you can use local dependencies this way (see the sketch below).
Deploy your plugin to Bintray. It's fairly easy, free, and worth the trouble.
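As a rough sketch of the first approach, the plugin is added in the application's own project/plugins.sbt (the version shown is only an example; check the plugin's documentation for the current one and for the exact deploy task):
addSbtPlugin("com.heroku" % "sbt-heroku" % "2.1.4")
// deployment is then typically run via the plugin's deployHeroku task instead of a Git push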
I revisited this again today and the following worked:
In plugins.sbt:
resolvers += Resolver.file("local-plugins", file("local-plugins"))(Resolver.ivyStylePatterns)
My project also contains a local-plugins/ directory in the working folder which holds the published artifacts.
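For completeness, one possible way (an assumption, not part of the original answer) to get the plugin artifacts into that directory is to point the plugin's own publishTo at it and run publish; copying them out of ~/.ivy2/local by hand works just as well:
// in the plugin's own build.sbt -- the target path is hypothetical
publishTo := Some(Resolver.file("local-plugins", file("/path/to/my-app/local-plugins"))(Resolver.ivyStylePatterns))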

shapeless port to scala-js: create artifact with few external dependencies

There is a port of the shapeless library to Scala.js (https://github.com/alexander-myltsev/shapeless). I need to publish the artifact properly, with as few dependencies on the original shapeless as possible.
For now I have forked Miles Sabin's repo and added the changes required to build a Scala.js library: added scalajs-sbt-plugin, tuned build.scala, and added bintray-sbt-plugin.
It would be wrong to ask the shapeless maintainers to merge my branch, because Scala.js could break their build.
On the other hand, I'd like to have as few dependencies on the original repo as possible. Ideally I'd like to create a shapeless-scalajs sbt project from scratch, reference the original shapeless library somehow, and derive from the shapeless build with the overrides required to build it against Scala.js and publish it to my Bintray.
I believe in almighty sbt :) What are my options to solve this task?
I think the easiest way is the following (no sbt hackery involved):
- Fork shapeless.
- Create a Scala.js branch.
- Change the build files as you need; that is, modify the shapelessCore project directly as in your PR (add scalaJSSettings and your repo coordinates; see the sketch below).
- Commit.
- Publish shapeless to your own Maven repository.
- When a new version of shapeless comes out, just merge shapeless/master into your Scala.js branch. If no changes happened to the build file, this will merge just fine.
- Re-publish.
This will be way easier than an sbt project that depends on an external project (which is doable, but doesn't directly allow you to reuse settings etc.)
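For what it's worth, the build-file change mentioned in the steps above might look roughly like this in the fork's build.scala, using the scalajs-sbt-plugin API of that era (project name, settings, and coordinates are assumptions, not taken from the actual shapeless build):
// import scala.scalajs.sbtplugin.ScalaJSPlugin._   // brings scalaJSSettings into scope (0.5.x-era plugin)
lazy val shapelessCore = Project("shapeless-core", file("core"))
  .settings(scalaJSSettings: _*)                    // compile and link with Scala.js
  .settings(organization := "com.example")          // assumed: publish under your own coordinates for Bintray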

Can sbt be used to access a non-Scala GitHub repo to read into a Scala project?

I'm dealing with two repos:
- A github repo that contains a bunch of text files.
- A scala project that would like to read those text files.
I would like to use SBT to download the contents of the github repo as a build dependency.
I wouldn't mind if SBT supplied either a path (into the Ivy repo?) for the project to use, or built the files into the project's available resources, or any other way that just works. I'm aiming for something automatic; clearly there are ways I could do this manually.
If you are talking about a bunch of text files, *.property for example, that are used as a dependency of your project (do you really want to download them every time?), you can use sbt.IO.download(url: URL, to: File). Just create a task and add it to the project definition: compile <<= (compile in Compile) dependsOn myDownloadTask. After that you can process them as regular local files ;-).
You can of course add custom logic, such as caching, page parsing, or REST requests to GitHub, to your project definition. Finally, you could create your own SBT plugin; there are a few "How to create an SBT plugin in 5 minutes" video tutorials on YouTube.
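A minimal sketch of such a task in build.sbt, in the sbt 0.13 style used above (the URL, file name, and task name are hypothetical):
lazy val myDownloadTask = taskKey[Unit]("Downloads the external text files before compiling")

myDownloadTask := {
  // hypothetical raw-file URL and target path
  val target = baseDirectory.value / "src" / "main" / "resources" / "strings.properties"
  IO.download(url("https://raw.githubusercontent.com/some-user/text-repo/master/strings.properties"), target)
}

// make sure the files are present before compilation
compile in Compile <<= (compile in Compile) dependsOn myDownloadTask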