How to add "provided" dependencies back to run/test tasks' classpath? - scala

Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
configs(FunnyRuntime).
settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
)): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and hopefully override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.

For a similar case, I used the following in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
@rob's solution seems to be the only one working on the latest sbt version; just add it to the settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
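On newer sbt versions the same pair can be written in slash syntax; a sketch with the same semantics, assuming sbt 1.x:
Compile / run := Defaults.runTask(Compile / fullClasspath, Compile / run / mainClass, Compile / run / runner).evaluated
Compile / runMain := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated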

Adding to @douglaz's answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.

Another option is to create separate sbt projects for assembly vs. run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to, e.g., specify the Spark master in code when running with sbt (see the sketch after the build definition).
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion
val commonSettings = Seq(
name := "Project",
libraryDependencies ++= Seq(...) // Common deps
)
// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
.settings(
commonSettings,
assembly / mainClass := Some("com.example.Main"),
libraryDependencies += sparkDep % "provided"
)
// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
.settings(
// Projects' target dirs can't overlap
target := target.value.toPath.resolveSibling("target-runtest").toFile,
commonSettings,
// If separate main file needed, e.g. for specifying spark master in code
Compile / run / mainClass := Some("com.example.RunMain"),
libraryDependencies += sparkDep
)
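If a separate main file is needed, it might look like the following sketch (com.example.RunMain matches the build above; the Spark calls are standard spark-core API, but the job body is illustrative):
package com.example
import org.apache.spark.{SparkConf, SparkContext}
// A runner main that sets the Spark master in code,
// so `sbt runTestProj/run` works without spark-submit.
object RunMain {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Project").setMaster("local[*]")
    val sc = new SparkContext(conf)
    try {
      // the real job logic would go here
      println(sc.parallelize(1 to 10).sum())
    } finally sc.stop()
  }
}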

If you use the sbt-revolver plugin, here is a solution for its "reStart" task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
Update: for sbt 1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value
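Put together, a minimal sketch (the plugin version is an assumption):
// project/plugins.sbt
addSbtPlugin("io.spray" % "sbt-revolver" % "0.9.1")
// build.sbt: reStart uses the full compile classpath, so "provided"
// dependencies are back on the classpath for development restarts
Revolver.reStart / fullClasspath := (Compile / fullClasspath).value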

Related

Is there a way to specify different Scapegoat versions for a multi-project sbt build?

I was trying the following:
lazy val root = (project in file("."))
.settings(
libraryDependencies ++= commonDeps,
scapegoatVersion := "1.3.5"
)
lazy val foo = (project in file("foo"))
.settings(
libraryDependencies ++= Dependencies.fooDeps,
scalaVersion := "2.13.8",
scapegoatVersion := "1.4.12"
)
Each project's version resolved to 1.0.
I also tried ThisBuild / scapegoatVersion := <some-version> but it set the version for all projects.
Unfortunately, it is not possible, because the plugin uses a wrong reference to the scapegoatVersion setting:
libraryDependencies ++= Seq(crossVersion(GroupId %% ArtifactId % (scapegoatVersion in ThisBuild).value % Provided))
The problem is that the plugin adds libraryDependencies to projectSettings (i.e. scope ThisProject) but uses the broader scope ThisBuild to refer to scapegoatVersion.
Because of this, it ignores the scapegoatVersion you specified per project.
Your approach would work if the plugin did the following:
libraryDependencies ++= Seq(crossVersion(
  GroupId %% ArtifactId % (scapegoatVersion in ThisProject).value % Provided
))
You could submit a PR to the project.

Can I create a proto jar for Scala 2.11/2.12 and use it within the same sbt build under different sub-projects?

I have a set of .proto files (protobuf) from which I generate code using ScalaPB. I also have, in the same sbt build, 2 sub-projects: one is scalaVersion 2.11 compatible (it can't be upgraded to 2.12 due to missing packages) and the other one is Scala 2.12.
I created a sub-project to hold my proto; by default 2.12 is used, and my 2.12 sub-project can use it, but my 2.11 one can't.
I set crossScalaVersions to 2.11/2.12 and compiled my project with both, which passed, but even then I was unable to get the 2.11 sub-project to find that code.
I am wondering whether this is supported, or whether there is some way to keep my .proto files in a single location yet have both sub-projects in the same sbt build use them.
lazy val scala212 = "2.12.13"
lazy val scala211 = "2.11.12"
lazy val supportedScalaVersions = List(scala212, scala211)
ThisBuild / scalaVersion := scala212
lazy val root = (project in file("."))
.aggregate(proto, subproject1, subproject2)
.settings(
crossScalaVersions := Nil,
publish / skip := true
)
lazy val proto = project
.settings(
crossScalaVersions := supportedScalaVersions,
name := "proto",
libraryDependencies += "com.trueaccord.scalapb" %% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf",
PB.targets in Compile := Seq(
scalapb.gen(grpc = false) -> (sourceManaged in Compile).value / "protobuf"
)
)
lazy val subproject1 = project
.dependsOn(proto)
lazy val subproject2 = project
.settings(
scalaVersion := scala211
)
.dependsOn(proto)
So, from the above: if I run sbt "+proto" I can compile both versions, and sbt subproject1/compile works too, but sbt subproject2/compile fails, indicating that it cannot find the 2.11 proto jar.
I would like either to get the above working somehow, or any other trick that lets me generate the code from the same proto location yet use it from both subproject1 and subproject2.
You could try the sbt-projectmatrix plugin:
https://github.com/sbt/sbt-projectmatrix
The idea is to have separate sbt subprojects for the different Scala versions, so you can simply reference the relevant subproject when calling dependsOn.
I think this plugin is going to end up in sbt some day as it's a much better solution in general than the current built-in stateful cross compilation support, and it's developed by Eugene Yokota, who is also an sbt developer.
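For illustration, a minimal sketch with sbt-projectmatrix, reusing the names and versions from the question (the plugin version is an assumption; the ScalaPB settings from the original proto project would go in the same settings block):
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.9.0")
// build.sbt
lazy val proto = (projectMatrix in file("proto"))
  .settings(
    name := "proto"
    // the ScalaPB settings from the question would go here as well
  )
  .jvmPlatform(scalaVersions = Seq("2.12.13", "2.11.12"))
// each dependent picks the row built for its own Scala version
lazy val subproject1 = project.dependsOn(proto.jvm("2.12.13"))
lazy val subproject2 = project
  .settings(scalaVersion := "2.11.12")
  .dependsOn(proto.jvm("2.11.12"))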

Play Framework sbt import of Maven module error

I am using Play Framework 2.5.4 to serve as a REST API endpoint, in the default sbt build, compiled and run with activator.
This Play module imports a core module from local Maven (3.9), where types are defined, such as:
package com.myCompany
case class User(
id: Option[UUID] = None,
username: String
)
Then the Play controller package imports com.myCompany.User.
The issue is that when activator compiles, it gives: unknown parameter name: username
If I place the case class in Play's models package, it compiles without error.
Below is my build.sbt
name := """api"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
jdbc,
cache,
ws,
filters,
"com.myCompany" % "core" % "1.0-SNAPSHOT"
)
routesGenerator := InjectedRoutesGenerator
resolvers ++= Seq(
"scalaz-bintray" at "http://dl.bintray.com/scalaz/releases",
Resolver.mavenLocal
// "Local Maven Repository" at "file:///home/pocheng/.m2/repository"
)
I suspect the build.sbt file is the issue. I would appreciate any pointers. Thanks.

sbt native packager doesn't create scripts under target/universal/stage/bin

I'm trying to use JavaAppPackaging from sbt-native-packager. My understanding is that when I run:
sbt stage
I should get a directory target/universal/stage/bin with some startup scripts. Instead I only get lib, which contains my jar and its dependencies.
Here are the relevant parts of my build.sbt:
val scalatra = "org.scalatra" %% "scalatra" % "2.3.1"
enablePlugins(JavaAppPackaging)
lazy val root = (project in file(".")).
settings(
name := "myapp",
version := "0.2",
scalaVersion := "2.11.6",
libraryDependencies += scalatra
)
Also, my plugins.sbt has this:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
I'm using sbt 0.13.8.
So why don't I get the startup scripts? What am I missing?
You need to make sure sbt finds a main class for the script.
This could mean either making sure you have one: an object that extends App, or one that defines def main(args: Array[String]): Unit.
Otherwise, try setting mainClass, like so:
mainClass in Compile := Some("JettyLauncher")
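For example, a minimal object that sbt can detect and the generated start script can launch might look like this (the name matches the setting above; the body is illustrative):
object JettyLauncher {
  def main(args: Array[String]): Unit = {
    // start the embedded server here
    println("myapp started")
  }
}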
Try setting the main class without any scopes: mainClass := Some("full.path.to.MainApp")

How to run a Play Framework sample in IntelliJ 13?

I am using IntelliJ 13 Ultimate and want to create a Play Framework sample, but I am unable to build the project because it always throws this error:
object index is not a member of package views.html
Ok(views.html.index("Your new application is ready."))
I have tried this on both Mac and Windows platforms, but the error is always the same.
How can I solve this?
The generation works just fine and all the paths are correctly added to the build. However, the routes are not (yet) compiled by the plugins (Scala + Play Framework), so the reverse-routing classes generated by Play are not available to IntelliJ.
You will have the same problem with the templates.
If you run play compile (or sbt compile), the classes will be generated and IntelliJ should be able to pick them up and compile your project.
I found a trick here.
If you generate the play-scala project using the activator command line (activator new my_first_project play-scala), you'll get the following sbt build file:
name := """my_first_project"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
jdbc,
cache,
ws,
"org.scalatestplus.play" %% "scalatestplus-play" % "1.5.1" % Test
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
But if you create a project from IntelliJ using New Project -> Scala -> Play 2.x, you will get the following sbt build file:
name := "my_second_project"
version := "1.0"
lazy val `my_second_project` = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq( jdbc , cache , ws , specs2 % Test )
unmanagedResourceDirectories in Test <+= baseDirectory ( _ /"target/web/public/test" )
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
fork in run := false
After combining them (ignore the name; note that I set fork in run to false):
name := "my_second_project"
version := "1.0"
lazy val `my_second_project` = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq( jdbc , cache , ws , specs2 % Test )
unmanagedResourceDirectories in Test <+= baseDirectory ( _ /"target/web/public/test" )
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
fork in run := false
After doing that, open the activator-generated project with IntelliJ and configure a Play 2.x run configuration. Everything works.
By the way, if you opened the activator-generated project before changing the sbt file, you might need to rm -r .idea.
Hope that helps.
Running "play idea" in the new project root fixed it for me. The project should compile and run after reloading.
I only have this problem with projects created through IDEA's New Project menu.
I have the same problem: in IDEA 13 there are highlighting errors, but in IDEA 12 everything is OK. The reason is that there is no Scala plugin for IDEA 13, so for now it's impossible to get it to work in IDEA 13.
The solution is to install IDEA 12, go to Preferences -> Plugins, type "Scala" in the search box, and install the Scala plugin.