I'm developing a system that discovers subprojects at compile time. This works (see here). The only remaining issue is that each subproject's routes file is ignored.
I know that the normal way to include a sub-project's routes file is to reference it from the main routes file, but hardcoding those references would defeat my goal of dynamic subprojects.
I suspect there is a way, in Build.scala, to discover each routes file and append it to the main one, but I'm a beginner and have no idea how to do it. Could you please help me out?
Alternatively, if there's no way to do it at compile time, maybe there's a way to load the routes at runtime? I know there's an API to intercept requests, so if we can read the route files we could implement dynamic routing that way. Is that a good idea?
Your submodules could implement their own routing DSL; see the example in the API docs. Alternatively, you could hook into the Compile task of your root project and append all the sub-project routes to the main routes file programmatically.
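A rough sketch of the second suggestion, assuming each module ships a route fragment at conf/<module>.r (the mergeRoutes key, the modules/ layout, and base.routes are illustrative, not a Play API):
import sbt._
import Keys._

// Illustrative task: rebuild conf/routes from a base file plus one fragment per module.
val mergeRoutes = taskKey[Unit]("concatenate sub-project route fragments into conf/routes")

val mergeSettings: Seq[Setting[_]] = Seq(
  mergeRoutes := {
    // start from a clean base file, then append one fragment per module
    IO.copyFile(file("conf/base.routes"), file("conf/routes"))
    val fragments = (file("modules") * DirectoryFilter).get
      .map(dir => dir / "conf" / (dir.getName + ".r"))
      .filter(_.exists)
    fragments.foreach(f => IO.append(file("conf/routes"), IO.read(f)))
  },
  // naive hook: run the merge before the root project compiles; the exact hook
  // point may need tuning so it precedes Play's route compiler
  compile in Compile <<= (compile in Compile) dependsOn mergeRoutes
)
Add mergeSettings to the root project's settings so the task is wired in.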
In the end I had to write fragments of the routes file (one per sub-project, with a distinct extension; the code below uses <module>.r) and concatenate them all into a single routes file. The same has to be done for the application.conf file.
I did this in project/Build.scala:
import sbt._
import Keys._
import play._
import java.io._

object Build extends Build {

  val commonSettings: Seq[Setting[_]] = Seq(
    scalaVersion := "2.11.1"
  )

  // Start from fresh copies of the base files; processModule appends one
  // fragment per module below. This runs every time sbt loads the build.
  IO.copyFile(file("conf/base.routes"), file("conf/routes"))
  IO.copyFile(file("conf/base.application.conf"), file("conf/application.conf"))

  lazy val libFolder = file("base-lib")
  lazy val baseLib = processModule(libFolder)

  // one sub-project per directory under modules/
  lazy val modules = (file("modules") * DirectoryFilter).get.map { dir =>
    processModule(dir).dependsOn(baseLib)
  }

  lazy val root = (project in file("."))
    .enablePlugins(PlayJava)
    .settings(
      name := "mainProject",
      version := "1.0"
    )
    .dependsOn(modules map (m => m: ClasspathDependency): _*)
    .aggregate(modules map (m => m: ProjectReference): _*)

  override lazy val projects = root +: baseLib +: modules

  // Creates the sub-project and appends its route and config fragments
  // (conf/<module>.r and conf/<module>.application.conf) to the main files.
  def processModule(dir: File): Project = {
    val p = Project(dir.getName, dir).enablePlugins(PlayJava).settings(commonSettings: _*)
    val mf = new File(dir, "conf/" + dir.getName + ".r")
    IO.append(file("conf/routes"), IO.read(mf))
    val cf = new File(dir, "conf/" + dir.getName + ".application.conf")
    IO.append(file("conf/application.conf"), IO.read(cf))
    p
  }
}
Related
I am defining multiple JVM/JS cross projects. Each one contains some common JVM/JS Scala code that I want to extract into a general common project that each of them can depend on. Could someone recommend the best way to define my build.scala files for the general and dependent projects?
CrossProject supports the normal dependsOn operation you are used to. So you can:
// the call to settings() is needed so that an implicit conversion kicks in
lazy val common = crossProject.settings()
lazy val p1 = crossProject.dependsOn(common)
lazy val p2 = crossProject.dependsOn(common)
lazy val commonJVM = common.jvm
lazy val commonJS = common.js
lazy val p1JVM = p1.jvm
lazy val p1JS = p1.js
lazy val p2JVM = p2.jvm
lazy val p2JS = p2.js
There is a full example on GitHub.
You can create a multi-project build.
Let's say you have a project structure like this:
root
  project/Build.scala
  project1
    src/
    project1.sbt
  project2
    src/
    project2.sbt
  projectN
    src/
    projectN.sbt
You can easily define dependencies in Build.scala
lazy val root = Project(id = "Main-Project", base = file("."))
  .aggregate(project1, project2 /* , ... */)

lazy val project1 = Project(id = "project1", base = file("project1"))

lazy val project2 = Project(id = "project2", base = file("project2"))
  .dependsOn(project1)
...
I ended up arriving at the solution below.
lazy val common = crossProject.in(file("."))
  .settings(
  )
  .jvmSettings(
  )
  .jsSettings(
  )
lazy val commonJVM = common.jvm
lazy val commonJS = common.js
...
lazy val p1 = crossProject.in(file("."))
  .settings(
  )
  .jvmSettings(
  )
  .jsSettings(
  )
  .jvmConfigure(_.dependsOn(ProjectRef(uri("../common"), "commonJVM")))
  .jsConfigure(_.dependsOn(ProjectRef(uri("../common"), "commonJS")))

lazy val p1JVM = p1.jvm
  .settings(...

lazy val p1JS = p1.js
  .settings(...
I have a build with two projects in it.
I want to make the root project's classpath depend on the sub-project, but only in a certain configuration. Simplified configuration:
Subproject:
object HttpBuild {
  import Dependencies._

  lazy val http: Project = Project(
    "http",
    file("http"),
    settings =
      CommonSettings.settings ++
        Seq(
          version := "0.2-SNAPSHOT",
          crossPaths := false,
          libraryDependencies ++= akkaActor +: spray
        ) ++
        Packaging.defaultPackageSettings
  )
}
Root:
object RootBuild extends Build {
  import HttpBuild._

  lazy val http = HttpBuild.http

  lazy val MyConfig = config("myconfig") extend Compile

  private val defaultSettings = Defaults.coreDefaultSettings

  lazy val api = Project("root", file("."))
    .configs(MyConfig)
    .settings(defaultSettings: _*)
    .dependsOn(HttpBuild.http % MyConfig)
}
Now if I run myconfig:compile I want the root project to be compiled together with the sub-project, but that doesn't seem to happen.
If I declare the dependency as plain dependsOn(HttpBuild.http), it compiles, but it does so every time, no matter which configuration I use.
Have you looked at this example? I'm not an expert here, but comparing it with your code above, the differences seem to be:
- a CustomCompile configuration is defined and used as the classpath configuration: classpathConfiguration in Common := CustomCompile
- the dependency is indirect: http % "compile->myconfig"
Perhaps try to get closer to that example.
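For illustration, a minimal sketch of that indirect pattern adapted to the names above; the Custom configuration and the configSettings wiring are my own guesses at the shape, not code from the linked example:
import sbt._
import Keys._

// illustrative extra configuration on the sub-project side
lazy val Custom = config("custom") extend Compile

lazy val http = HttpBuild.http
  .configs(Custom)
  .settings(inConfig(Custom)(Defaults.configSettings): _*)

lazy val MyConfig = config("myconfig") extend Compile

lazy val api = Project("root", file("."))
  .configs(MyConfig)
  .settings(Defaults.coreDefaultSettings: _*)
  .settings(inConfig(MyConfig)(Defaults.configSettings): _*)
  .dependsOn(http % "myconfig->custom") // indirect configuration-to-configuration mapping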
I'm writing an SBT task that outputs dependency information grouped by project (say the SBT build contains multiple projects).
I know there is the sbt-dependency-graph plugin, but I can't use it directly: I want to generate a JSON file, and that plugin just prints the dependency tree to the console without returning a data object, so I can't easily get the data I want.
I found that the update task returns an UpdateReport containing much of the information I want, but only for the current project. On the command line, if I want the information for all projects, I can list them with the projects command and then view them one by one with someproject/update.
But how do I do the same inside an SBT task? I tried:
val reports = projects.toList.map(prj => (update in prj).value)
It reports:
[error] /Users/me/workspace/sbt-test/project/Build.scala:51: Illegal dynamic reference: prj
[error] val reports = projects.toList.map(prj => (update in prj).value)
[error] ^
[error] one error found
How to fix it?
More code:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val allUpdate = taskKey[Unit]("show update reports of all projects")

  lazy val core = project
  lazy val web = project

  lazy val allUpdateDef = allUpdate := {
    val reports = projects.toList.map(prj => (update in prj).value)
    println(reports)
  }

  lazy val root = (project in file("."))
    .settings(
      allUpdateDef
    )
}
After checking the documentation (http://www.scala-sbt.org/0.13/docs/Tasks.html) I found the solution. The .value lookup is expanded statically by a macro, so the scope cannot depend on a runtime value like prj; the supported way to run a task across all projects is a ScopeFilter with .all:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val updateByProject = taskKey[Unit]("show update reports of all projects")

  // Evaluated once per project selected by the filter; pairs each
  // project's id with its UpdateReport.
  lazy val groupByProject: Def.Initialize[Task[(String, UpdateReport)]] =
    Def.task {
      (thisProject.value.id, update.value)
    }

  lazy val filter = ScopeFilter(inAnyProject, inAnyConfiguration)

  lazy val root = (project in file(".")).settings(
    updateByProject := {
      val subProjects = groupByProject.all(filter).value.map { case (projectName, updateReport) =>
        ...
      }
    }
  )
}
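To turn the reports into JSON-friendly data, each UpdateReport can be flattened inside that task body; allModules is part of the UpdateReport API, while the surrounding names just mirror the snippet above:
// inside the updateByProject task body
val byProject: Map[String, Seq[String]] =
  groupByProject.all(filter).value.map { case (projectName, updateReport) =>
    projectName -> updateReport.allModules.map(m =>
      s"${m.organization}:${m.name}:${m.revision}")
  }.toMap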
I have an existing database, and I would like to connect to it with Scala/Slick.
I'd rather not write all of the Slick table classes by hand.
Is there a quick way to read the definitions from the database with Slick? Or is there another component in the Scala standard library or standard toolset that will do this work for me?
Use the Slick code generator; you simply need to add this to your Build.scala:
lazy val slick = TaskKey[Seq[File]]("gen-tables")

lazy val slickCodeGenTask = (sourceManaged, dependencyClasspath in Compile, runner in Compile, streams) map {
  (dir, cp, r, s) => {
    val outputDir = (dir / "slick").getPath
    val url = "your db url"
    val jdbcDriver = "dbms drivers"
    val slickDriver = "slick drivers"
    val pkg = "schema"
    toError(r.run("scala.slick.model.codegen.SourceCodeGenerator", cp.files,
      Array(slickDriver, jdbcDriver, url, outputDir, pkg), s.log))
    val fname = outputDir + "/path/to/Tables.scala"
    Seq(file(fname))
  }
}
Add the task to the settings:
val main = play.Project(appName, appVersion, Seq()).settings(
  Keys.fork in Test := false,
  libraryDependencies := Seq(
    ...
  ),
  slick <<= slickCodeGenTask // register the manual sbt task
)
Then run genTables from SBT; this will create a Scala file called Tables.scala at the specified path, containing the whole schema from the database.
This was the Github example I looked up the first time.
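As a side note, if you would rather regenerate on every compile instead of running the task by hand, the generated files can be wired into source generation (a sketch, using the same pre-0.13.13 operator style as the settings above):
sourceGenerators in Compile <+= slickCodeGenTask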
I've been searching for a while to find out whether this is possible, with little success.
Using SBT, can you create a sub-project programmatically, without explicitly assigning each project to its own val?
My current project structure looks something like this:
root/
  common/          <--- another sub-project that the others depend on
  project/
    build.scala
  src/main/scala
  apps/            <--- the sub-projects live here
    Sub1/
    Sub2/
Sub1 and Sub2 are both their own SBT projects.
My first attempt to link these projects together looked like this:
// root/project/build.scala
import sbt._
import Keys._

object build extends Build {
  lazy val common = project /* Pseudo-code */

  val names = List("Sub1", "Sub2")

  lazy val deps = names map { name =>
    Project(id = name, base = file(s"apps/$name")).dependsOn(common)
  }

  lazy val finalDeps = common :: deps

  lazy val root = project.in(file("."))
    .aggregate(finalDeps.map(sbt.Project.projectToRef): _*)
    .dependsOn(finalDeps.map(ClasspathDependency(_, None)): _*)
}
However, because SBT uses reflection to build its projects and sub-projects, this doesn't work.
It only works if each sub-project is declared explicitly:
lazy val Sub1 = project.in(file("apps/Sub1"))
So the question:
Is there a way to programmatically build sub-project dependencies in SBT?
SBT allows you to create a build definition for the build itself:
http://www.scala-sbt.org/release/docs/Getting-Started/Full-Def.html
You can try creating a project/project/build.scala file that adds a source generator to the meta-build, something like this:
// project/project/build.scala
import sbt._
import Keys._

object MetaBuild extends Build {
  lazy val root = project.in(file(".")).settings(
    sourceGenerators in Compile <+= sourceManaged in Compile map { out =>
      Generator.generate(out / "generated")
    }
  )
}
EDIT: You should implement the Generator object yourself.
This source generator will scan the topmost apps folder and generate a source file defining an object that lists all the subprojects; a possible sketch follows.
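For example, a minimal Generator (assuming sbt is launched from the root directory, so apps/ resolves relative to it; all names here are illustrative):
// project/project/Generator.scala
import sbt._

object Generator {
  def generate(out: File): Seq[File] = {
    // one generated project definition per directory under apps/
    val names = (file("apps") * DirectoryFilter).get.map(_.getName)
    val defs = names
      .map(n => s"""  lazy val $n = project.in(file("apps/$n"))""")
      .mkString("\n")
    val source =
      s"""import sbt._
         |import Keys._
         |
         |object Subprojects {
         |$defs
         |  lazy val all = Seq(${names.mkString(", ")})
         |}
         |""".stripMargin
    val target = out / "Subprojects.scala"
    IO.write(target, source)
    Seq(target)
  }
}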
// Subprojects.scala
// This is autogenerated by the source generator
object Subprojects {
  lazy val Sub1 = project.in(file("apps/Sub1"))
  lazy val Sub2 = project.in(file("apps/Sub2"))
  lazy val all = Seq(Sub1, Sub2)
}
Now in your main build.scala just write:
// project/build.scala (inside your Build object)
lazy val root = project.in(file("."))
  .aggregate(Subprojects.all.map(sbt.Project.projectToRef): _*)
  .dependsOn(Subprojects.all.map(ClasspathDependency(_, None)): _*)
I didn't run all of this through a compiler, so some errors are possible, but the principle should work.
EDIT: I created a repo on GitHub where I implemented the solution; see how it is done there:
https://github.com/darkocerdic/sbt-auto-subprojects