Using a custom class loader for a module dependency in SBT - scala

I have a multi-module SBT build consisting of api, core and third-party. The structure is roughly this:
api
|- core
|- third-party
The code for third-party implements api and is copied verbatim from somewhere else, so I don't really want to touch it.
Because of the way third-party is implemented (heavy use of singletons), I can't just have core depend on third-party. Specifically, I only need to use it via the api, but I need to have multiple, isolated copies of third-party at runtime. (This allows me to have multiple "singletons" at the same time.)
If I'm running outside of my SBT build, I just do this:
def createInstance(): foo.bar.API = {
  // URLClassLoader takes an Array[URL], so the jar path must be converted first
  val urls = Array(new java.io.File("path/to/third-party.jar").toURI.toURL)
  val loader = new java.net.URLClassLoader(urls, parent)
  loader.loadClass("foo.bar.Impl").asSubclass(classOf[foo.bar.API]).newInstance()
}
But the problem is that I don't know how to figure out at runtime what I should give as an argument to URLClassLoader if I'm running via sbt core/run.

This should work, though I haven't tested it with your exact setup.
The basic idea is to let sbt write the classpath into a file that you can use at runtime. sbt-buildinfo already provides a good basis for this, so I'm going to use it here, but you could also extract just the relevant part and skip the plugin.
Add this to your project definition:
lazy val core = project enablePlugins BuildInfoPlugin settings (
  buildInfoKeys := Seq(BuildInfoKey.map(exportedProducts in (`third-party`, Runtime)) {
    case (_, classFiles) => ("thirdParty", classFiles.map(_.data.toURI.toURL))
  })
...
At runtime, use this:
def createInstance(): foo.bar.API = {
  val loader = new java.net.URLClassLoader(buildinfo.BuildInfo.thirdParty.toArray, parent)
  loader.loadClass("foo.bar.Impl").asSubclass(classOf[foo.bar.API]).newInstance()
}
exportedProducts only contains the compiled classes of the project (e.g. .../target/scala-2.10/classes/). Depending on your setup, you might want to use fullClasspath instead (which also includes the libraryDependencies and dependent projects) or any other classpath-related key.
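For example, the fullClasspath variant would look like this (a sketch, following the same BuildInfoKey.map pattern as above):
buildInfoKeys := Seq(BuildInfoKey.map(fullClasspath in (`third-party`, Runtime)) {
  // each entry is an Attributed[File]; expose the files as URLs for the classloader
  case (_, cp) => ("thirdParty", cp.map(_.data.toURI.toURL))
})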

Related

sbt how to access base directory of project in scala code

I have been given code which was created by a vendor, and it seems their engineer did a lot of hardcoding in the unit tests.
I have a unit test for a function which outputs, as a string, the full absolute path of a report generated by the code.
Currently the failing unit test assertion looks like this:
val reportPath = obj.getReportPath()
assert(reportPath.equals("file:/Users/khalid.mahmood/ReportingModule/target/report.csv"))
where ReportingModule is the name of the project.
The code logic is fine: for me, the value of the reportPath variable comes out as
file:/Users/vikas.saxena/coding_dir/ReportingModule/target/report.csv
Since I have the project cloned in a subdirectory called coding_dir in my home directory, this looks correct to me.
I want to modify the assertion to ensure that the code picks up the base directory of the project by itself. On googling, I found from this link that sbt has base as the equivalent of Maven's project.baseDir.
However, the following code change hasn't worked out for me:
assert(reportPath.equals(s"""$base""" + "/target/report.csv"))
Can I get some pointers on how to get this right?
If you're using ScalaTest, you can use the ConfigMap to do it.
First you need to tell the ScalaTest Runner to add the path to the ConfigMap. This can be done in your .sbt file like so:
Test / testOptions += Tests.Argument(
  TestFrameworks.ScalaTest, s"-DmyParameter=${baseDirectory.value}")
(note that it doesn't have to be baseDirectory.value, many other sbt settings will work. I would suggest target.value for your specific use case).
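For this question's use case, the wiring might then look like this (a sketch; the parameter name reportBase is an assumption):
Test / testOptions += Tests.Argument(
  TestFrameworks.ScalaTest, s"-DreportBase=${target.value}")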
In the test itself, you then need to access the value from the ConfigMap. The easiest way to do this is to use a Fixture Suite (such as FixtureAnyFunSuite) and mix in the ConfigMapFixture trait:
import org.scalatest.funsuite.FixtureAnyFunSuite
import org.scalatest.fixture.ConfigMapFixture

class ExampleTest extends FixtureAnyFunSuite with ConfigMapFixture {
  test("example") { configMap =>
    val myParameter = configMap.getRequired[String]("myParameter")
    // actual test logic goes here
    succeed
  }
}
There are of course other ways to solve the problem. For instance, you can also simply get the current working directory (cwd) and work from there. However the downside to that is that in multi-module builds, the cwd will be different depending on whether the Test / fork setting in sbt is true or false. So to make your code robust against these sorts of eventualities, I recommend sticking with the ConfigMap way.
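For completeness, the cwd-based check would look something like this (a sketch; fragile for the fork-related reasons above):
// Resolve the expected report path against the current working directory;
// this breaks if Test / fork changes where the tests run
val expected = new java.io.File("target/report.csv").getCanonicalFile.toURI.toString
assert(reportPath == expected)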

How to easily play around with the classes in a Scala/SBT project?

I'm new to Scala/SBT and I'm having trouble understanding how to just try out the classes and functions of a package to see what they're about, to get a feel for them. For example, take https://github.com/plokhotnyuk/rtree2d . What I want to do is something like (in the top level folder of the project)
# sbt
> console
> import com.github.plokhotnyuk.rtree2d.core._
...
etc. But this won't work, as it can't find the import even though it's in the project. I apologize for the vague question, though I hope from my hand-waving it's clear what I want to do. Another way to put it, maybe, is that I'm looking for something like the intuitive ease of use which I've come to take for granted in Python, using just bash and the interpreter. As a last resort I can create a separate project, import this package and write a Main object, but this seems much too roundabout and cumbersome for what I want to do. I'd also like to avoid IDEs if possible, since I never really feel in control with them: they do all sorts of things behind the scenes, adding a lot of bulk and complexity.
rtree2d takes advantage of sbt's multi-module capabilities; a common use for this is to put the core functionality in a module and have less core aspects (e.g. higher-level APIs or integrations with other projects) in modules which depend on the core: all of these modules can be published independently and have their own dependencies.
This can be seen in the build.sbt file:
// The main project -- LR
lazy val rtree2d = project.in(file("."))
  .aggregate(`rtree2d-coreJVM`, `rtree2d-coreJS`, `rtree2d-benchmark`)
  // details omitted -- LR

// Defines the basic skeleton for the core JVM and JS projects -- LR
lazy val `rtree2d-core` = crossProject(JVMPlatform, JSPlatform)
  // details omitted -- LR

// Turns the skeleton into the core JVM project -- LR
lazy val `rtree2d-coreJVM` = `rtree2d-core`.jvm

// Turns the skeleton into the core JS project -- LR
lazy val `rtree2d-coreJS` = `rtree2d-core`.js

lazy val `rtree2d-benchmark` = project
In sbt, commands can be scoped to particular modules with module/command, so in the interactive sbt shell (from the top-level), you can do
> rtree2d-coreJVM/console
to run the console within the JVM core module. You could also run sbt 'rtree2d-coreJVM/console' directly from the shell in the top level, though this may require some care around shell quoting etc.
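A session might then look like this (sketched transcript; the import is the one attempted in the question):
# sbt
> rtree2d-coreJVM/console
scala> import com.github.plokhotnyuk.rtree2d.core._
import com.github.plokhotnyuk.rtree2d.core._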

How to shade a dependency for a non-executable Scala Library?

Spent a few hours trying to figure out how to do this. Over the course of it I have looked at a few seemingly promising questions but none of them seem to quite fit what I'm doing.
I've got three library jars, let's call them M, S, and H. Library M has things like:
case class MyModel(x: Int, s: String)
and then library S uses the play-json library, version 2.3.8, to provide implicit serializers for the classes defined by M:
trait MyModelSerializer {
  implicit val myModelFormat = Json.format[MyModel]
}
These are then bundled up into a convenience object for importing:
package object implicits extends MyModelSerializer with FooSerializer // etc
That way, in library H, when it performs HTTP calls to various services, it just imports implicits from S, and then I call Json.validate[MyModel] to get back the models I need from my web services. This is all well and dandy, but I'm working on an application running Play 2.4, and when I included H in the project and tried to use it, I ran up against:
java.lang.NoSuchMethodError: play.api.data.validation.ValidationError.<init>(Ljava/lang/String;Lscala/collection/Seq;)
I believe this is caused by Play 2.4 using play-json version 2.4.6. Unfortunately, the two are a minor version apart, which means that trying to just force the old library like this:
// In build.sbt
"com.typesafe.play" %% "play-json" % "2.3.8" force()
results in all the code in the app failing to compile, because I'm using things like JsError.toJson which weren't part of play-json 2.3.8. I could change the 14 or so places using that method, but given the exception above, I have a feeling that even if I did, it wouldn't help.
Around this point I remembered that back in my Maven days I could shade dependencies during the build process. So I got to thinking that shading the play-json 2.3.8 dependency in H would solve the problem, since the problem seems to be that calls to Json.* in H resolve to the Json object from play-json 2.4.6.
Unfortunately, the only thing I can find online that indicates the ability to shade is sbt-assembly. I found a great answer on how to do that for a fat jar. But I don't think I can use sbt-assembly because H isn't executable, it's just a library jar. I read through a question like my own but the answer refers to sbt-assembly so it doesn't help me.
Another question seems somewhat promising but I really can't follow how I would use it / where I would be placing the code itself. I also looked through the sbt manual, but nothing stuck out to me as being what I need.
I can't just change S to use play-json 2.4.6 because we're using H in a play 2.3 application as well. So it needs to be able to be used in both.
Right now, if I can't get some kind of shading done, the only thing I can think to do is to make H not use S, and instead require a serializer/deserializer implicitly and wire in the appropriate JSON (de)serializer. So here I am, asking how to properly shade with sbt something that isn't an executable jar, because I only want to do a rewrite if I absolutely have to. If I missed something (like sbt-assembly being able to shade non-executable jars as well), I'll take that as an answer if you can point me to the docs I must have missed.
As indicated by Yuval Itzchakov, sbt-assembly doesn't have to build an executable jar and can shade library code as well. In addition, packaging without transitive dependencies (except the ones that need to be shaded) is possible too; this keeps the packaged jar's size down and lets the rest of the dependencies come through as usual.
Hunting down the transitive dependencies manually is what I ended up having to do, but if anyone has a way to do that automatically, that'd be a great addition to this answer. Anyway, this is what I needed to do to the H library's build file to get it properly shading the play-json library.
Figure out what the dependencies are using show compile:dependencyClasspath at the sbt console
Grab anything play-related (since I'm only using play-json and no other Play libraries, I can assume play = needs shading)
Also shade the S models because they rely on play-json as well; to avoid transitive dependencies bringing a non-shaded play 2.3.8 back in, I have to shade my serializers.
Add sbt-assembly to project and then update build.sbt file
build.sbt
// Shade rules for all things play:
assemblyShadeRules in assembly := Seq(
  // note the jarjar-style placeholder: @1 expands to whatever ** matched
  ShadeRule.rename("play.api.**" -> "shade.play.api.@1").inAll
)

// Grabbed from the "publishing" section of the sbt-assembly readme, excluding the "assembly" classifier
addArtifact(artifact in (Compile, assembly), assembly)

// Only the play stuff and the "S" serializers need to be shaded since they use/introduce play:
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  val toIncludeInPackage = Seq(
    "play-json_2.11-2.3.8.jar",
    "play-datacommons_2.11-2.3.8.jar",
    "play-iteratees_2.11-2.3.8.jar",
    "play-functional_2.11-2.3.8.jar",
    "S_2.11-0.0.0.jar"
  )
  cp filter { c => !toIncludeInPackage.contains(c.data.getName) }
}
And then I don't get any exceptions anymore from trying to run it. I hope this helps other people with similar issues, and if anyone has a way to automatically grab dependencies and filter by them I'll happily update the answer with it.
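One possible direction for the automatic filtering is matching on the module metadata sbt attaches to classpath entries instead of hard-coded jar names (a sketch, untested; note that project-internal jars like S carry no moduleID and would still need to be listed explicitly):
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { entry =>
    // exclude from the assembly everything that does NOT need shading
    val needsShading = entry.get(moduleID.key).exists(_.organization == "com.typesafe.play")
    !needsShading
  }
}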

How to share code between project and build definition project in SBT

I have written some source code in my build definition project (in /project/src/main/scala) in SBT. Now I want to use these classes also in the project I am building. Is there a best practice? Currently I have created a custom task that copies the .scala files over.
Those seem like unnecessarily indirect mechanisms.
unmanagedSourceDirectories in Compile += baseDirectory.value / "project/src/main"
Sharing sourceDirectories as in extempore's answer is the simplest way to go about it, but unfortunately it won't work well with IntelliJ because the project model doesn't allow sharing source roots between multiple modules.
Seth Tisue's approach will work, but requires rebuilding to update sources.
To actually share the sources and have IntelliJ pick up on it directly, you can define a module within the build.
The following approach seems to only work in sbt 1.0+
Create a file project/metabuild.sbt:
val buildShared = project
val buildRoot = (project in file("."))
  .dependsOn(buildShared)
and in your build.sbt:
val buildShared = ProjectRef(file("project"), "buildShared")
val root = (project in file("."))
  .dependsOn(buildShared)
Then put your shared code in project/buildShared/src/main/scala/ and refresh. Your project will look something like this in IntelliJ:
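Roughly, the resulting layout is (a sketch; module names as defined above):
project/
  metabuild.sbt
  buildShared/
    src/main/scala/   <- shared code goes here
build.sbt
src/main/scala/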
Full example project: https://github.com/jastice/shared-build-sources
Can you make the following work? The source code for the classes in question should be part of your project, not part of your build definition; the “task which serializes a graph of Scala objects using Kryo and writes them as files into the classpath of the project” part sounds like a perfect job for resourceGenerators (see http://www.scala-sbt.org/0.13.2/docs/Howto/generatefiles.html). Then the only remaining problem is how to reference the compiled classes from your resource generator. I'm not familiar with Kryo; in order to use it, do you need the compiled classes on the classpath at the time your generator is compiled, or do they just need to be on the classpath at runtime? If the latter is sufficient, that's easier: you can get a classloader from the testLoader in Test key, load the class, instantiate some objects via reflection, and then call Kryo.
If you really need the compiled classes to be on the classpath when your resource generator is compiled, then you have a chicken-and-egg problem: the build can't be compiled until the project has been compiled, but of course the project can't be compiled before the build definition has been compiled, either. In that case it seems to me you have no choice other than:
1) the workaround you're already doing ("best practice" in this case would consist of using sourceGenerators to copy the sources out of your build definition and into target/src_managed)
2) put the classes in question in a separate project and depend on it from both your build and your project. This is the cleanest solution overall, but you might consider it too heavyweight (see the sketch below).
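For option 2, the wiring might look like this (a sketch; the coordinates are hypothetical, and the shared project must be published somewhere both can see, e.g. via publishLocal):
// project/shared.sbt -- puts the shared classes on the build definition's classpath
libraryDependencies += "com.example" %% "build-shared" % "0.1.0"
// build.sbt -- puts the same classes on the project's own classpath
libraryDependencies += "com.example" %% "build-shared" % "0.1.0"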
Hope this helps. Interested in seeing others' opinions on this, too.

Play Framework 2.2.x: Static assets location not working in production

Having trouble accessing the compiled assets location in production.
My strategy has been to serve my assets from "app/assets/ui" in development and from "public" in production. This is done as shown below in my conf/routes file:
#{if(play.Play.mode.isDev())}
GET /assets/*file controllers.common.Assets.at(path="/app/assets/ui", file)
#{/}
#{else}
GET /assets/*file controllers.common.Assets.at(path="/public", file)
#{/}
Since I have defined asset mappings outside "public", I have added the following line in my Build.scala:
playAssetsDirectories <+= baseDirectory / "app/assets/ui"
As an example, my scripts are loaded conditionally depending on the environment, as shown below:
@if(play.Play.isDev()) {
  <script src="@routes.Assets.at("/app/assets/ui", "javascripts/application.js")" type="text/javascript"></script>
} else {
  <script src="@routes.Assets.at("/public", "javascripts/application.min.js")" type="text/javascript"></script>
}
I'm using Grunt for my frontend workflow and when the application builds it copies the distribution files to the application's public folder.
I start the app in production using "sbt clean compile stage" and then run the packaged app.
My problem is that the routes still appear to refer to the "app/assets/ui" folder instead of the distribution "public" folder.
Any tips on how I can debug this? My working background is as a front-end developer, so I'm very new to Play! and Scala.
As mentioned by @estmatic, your conditional in routes won't be evaluated.
As it's generally extremely useful to consolidate the differences between application Modes into files, I'd suggest you extend GlobalSettings (if you aren't already) and override the onLoadConfig method:
class Settings extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: File, classloader: ClassLoader, mode: Mode.Mode): Configuration = {
    val specificConfig: Configuration = // ... use the mode param to load the appropriate file
    super.onLoadConfig(specificConfig, path, classloader, mode)
  }
  // ...
}
You could then have appropriately-named files (dev.conf and production.conf spring to mind) that contain suitable values, one of them being the base path for the Assets controller to use.
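For example, the elided line in the snippet above might be filled in like this (a sketch; the file-naming convention is an assumption):
val specificConfig: Configuration = Configuration(
  // picks up e.g. dev.conf or prod.conf from the classpath
  com.typesafe.config.ConfigFactory.parseResources(s"${mode.toString.toLowerCase}.conf"))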
EDIT: it turns out that doing it this way makes usage in routes awkward; here's another approach:
This approach does not use a configuration file per-environment, which means that if something changes in the frontend configuration (e.g. it's no longer served up from /public) you'll have to change this code and re-deploy it. However, it fits into Play 2.x pretty nicely:
package controllers

import play.api.{Mode, Play}
import play.api.mvc.{Action, AnyContent}

object EnvironmentSpecificAssets extends AssetsBuilder {
  val modeToAssetsPathMap = Map(
    Mode.Dev -> "/app/assets/ui",
    Mode.Prod -> "/public")

  lazy val modePath = modeToAssetsPathMap(Play.current.mode)

  /** New single-argument `at`, determines its path from the current app mode */
  def at(file: String): Action[AnyContent] = at(modePath, file)
}
The code is pretty self-explanatory, the only "trick" is probably the lazy val which means we only have to evaluate the current operating mode and do the map lookup once.
Now your routes file just looks like this:
GET /assets/*file controllers.EnvironmentSpecificAssets.at(file)
Play Framework 2.x doesn't support conditional statements in the routes file. The 1.x versions had this, but it was removed.
What you have in your routes file is simply two routes with the same URI pattern, /assets/*file. The other lines are just being ignored as comments since they begin with the pound character, #. Since the pattern is the same for both, the first route catches everything and the second does nothing.
It's not exactly what you're trying to do but I think you can just make the route patterns a bit different and it should work.
GET /assets/dev/*file controllers.common.Assets.at(path="/app/assets/ui", file)
GET /assets/*file controllers.common.Assets.at(path="/public", file)
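Since the path parameter is fixed in each route, the reverse router can still tell the two apart, so existing template calls keep working (a sketch of the resulting URLs):
// In a template or controller:
controllers.common.routes.Assets.at("/app/assets/ui", "javascripts/application.js").url
// => "/assets/dev/javascripts/application.js"
controllers.common.routes.Assets.at("/public", "javascripts/application.min.js").url
// => "/assets/javascripts/application.min.js"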