Create a common library in Play framework - scala

I created 3 subprojects in Play: A, B, and common.
A and B need to use the common subproject.
The build.sbt looks like this:
name := """play"""
organization := "com.play"
version := "1.0-SNAPSHOT"
lazy val common = (project in file("modules/common")).enablePlugins(PlayScala)

lazy val A = (project in file("modules/A")).enablePlugins(PlayScala)
  .dependsOn(common).aggregate(common)

lazy val B = (project in file("modules/B")).enablePlugins(PlayScala)
  .dependsOn(common).aggregate(common)

lazy val root = (project in file(".")).enablePlugins(PlayScala)
  .dependsOn(A).aggregate(A)
  .dependsOn(B).aggregate(B)
scalaVersion := "2.11.11"
libraryDependencies += filters
libraryDependencies += evolutions
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "2.0.0" % Test
For the package naming, I followed the Play Framework documentation, which uses the "package.moduleName" convention. So the class in the model package of my common subproject has the package name "model.common".
Now, in subproject A, I want to use a class from the common subproject. I import it like this:
import model.common.className
but the class cannot be found.

You can put all the classes inside a top-level package.
For example, for the common project you can put all the classes inside the com.play.common package.
After that, you can use a class Example (declared in the common project) from project A with import com.play.common.Example.
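A minimal sketch of how that looks (the Greeting class, the controller, and the file locations are placeholders of my own, not from the question):
// modules/common/app/com/play/common/Greeting.scala
package com.play.common

object Greeting {
  // Trivial helper shared with the other subprojects.
  def hello(name: String): String = s"Hello, $name"
}

// modules/A/app/controllers/HomeController.scala
package controllers

import com.play.common.Greeting // resolves because A declares dependsOn(common)

object HomeController {
  val message: String = Greeting.hello("Play")
}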

Related

maintain multi project modules with SBT

How do I maintain multi-module dependencies with Apache Spark 2.3 in one module and Apache Spark 2.4 in another? What would the project layout be, and what would build.sbt look like?
You can specify different dependencies for each module.
Let's assume you have module A and module B; it would look something like this:
lazy val moduleA = (project in file("moduleA"))
  .settings(
    name := "Module-A",
    libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "2.3.0")
  )

lazy val moduleB = (project in file("moduleB"))
  .settings(
    name := "Module-B",
    libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "2.4.0")
  )
The official documentation is pretty good; it contains several examples.
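If both modules live in the same build, you would typically also add a root project that aggregates them so a single sbt command builds both. A sketch (the root project and its name are my own addition, not part of the answer above):
lazy val root = (project in file("."))
  .aggregate(moduleA, moduleB)
  .settings(name := "spark-multi-module")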

scala using github repo as a libraryDependencies

I was facing an error while trying to use RootProject.
I need to use the Gatling snapshot version, 3.5.0-SNAPSHOT, and to get it I know of two options:
1. git clone the repo, run sbt publishLocal, and use the resulting jar as an unmanaged dependency. This works fine, but it is rather a lot of manual work, so I moved to the second option.
2. Use sbt's RootProject. Let me first explain the project setup:
example/Build.sbt
lazy val gatling = RootProject(uri("https://github.com/gatling/gatling.git"))
lazy val root = (project in file("."))
  .settings(
    name := "example",
    version := "0.1",
    scalaVersion := "2.13.3"
  ).dependsOn(gatling)
example/project/plugins.sbt (these two lines were added because they were required to build the project):
addSbtPlugin("ch.epfl.scala" % "sbt-scalafix" % "0.9.21")
libraryDependencies += "org.scala-sbt" %% "io" % "1.4.0"
example/src/main/scala/Main.scala
import io.gatling.core.scenario.Simulation
object Main extends App {
  class A extends Simulation
  val a = new A
  println(a.toString)
}
After this, executing sbt run results in a successful build.
The problem starts when I try to pull in gatling-highcharts in a similar fashion:
lazy val gatling = RootProject(uri("https://github.com/gatling/gatling.git"))
lazy val gatlingHighCharts = RootProject(uri("https://github.com/gatling/gatling-highcharts.git"))
lazy val root: Project = (project in file("."))
  .settings(
    name := "example",
    version := "0.1",
    scalaVersion := "2.13.3"
  ).dependsOn(gatling, gatlingHighCharts)
Now executing sbt results in an error:
[error] not found: C:\Users\user\.ivy2\local\io.gatling\gatling-recorder\3.5.0-SNAPSHOT\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/io/gatling/gatling-recorder/3.5.0-SNAPSHOT/gatling-recorder-3.5.0-SNAPSHOT.pom
[error] not found: https://jcenter.bintray.com/io/gatling/gatling-recorder/3.5.0-SNAPSHOT/gatling-recorder-3.5.0-SNAPSHOT.pom
This happens because gatling is only pulled in via dependsOn as a source dependency, while there is no way to write something like:
lazy val gatlingHighCharts = RootProject(uri("https://github.com/gatling/gatling-highcharts.git")).dependsOn(gatling)
This is not possible because there is no dependsOn method on RootProject, and I am not able to figure out how to make it work.
Can anyone help me figure out how to make this work?
Also, does there exist a way to use GitHub repo jars directly as managed dependencies instead of unmanaged ones, something like
libraryDependencies += "io.gatling.highcharts" % "gatling-charts-highcharts" % "3.5.0-SNAPSHOT"
without using dependsOn?

SBT: How to include sbt files in upper directory to my build.sbt?

I have many dependent sbt projects in one folder. They all have the same values in their Build.sbt, for example dependencies.
I want to move the shared values from all the sbt files into a separate file, but I don't want to use a multi-project build. I just need to include some other sbt file from an upper directory.
For example, my directory structure could look like this:
MyRepository
|- Dependencies.sbt
|- MyProject1
|    |- src
|    |- Build.sbt
|- MyProject2
|    |- src
|    |- Build.sbt
In that example, how can I include Dependencies.sbt in Build.sbt?
Code is reused between .sbt files by creating a normal .scala file in project/; the code in project/ is available for use in the .sbt files.
If I remember correctly, the definitions in one .sbt file are not visible to other .sbt files, at least in older versions.
Basically, the solution is to use a Dependencies.scala (not a Dependencies.sbt) and define the common parts there.
As an illustration, you create project/Dependencies.scala to track dependencies in one place:
import sbt._

object Dependencies {
  // Versions
  lazy val akkaVersion = "2.3.8"

  // Libraries
  val akkaActor = "com.typesafe.akka" %% "akka-actor" % akkaVersion
  val akkaCluster = "com.typesafe.akka" %% "akka-cluster" % akkaVersion
  val specs2core = "org.specs2" %% "specs2-core" % "2.4.17"

  // Projects
  val backendDeps = Seq(akkaActor, specs2core % Test)
}
The Dependencies object will then be available in build.sbt; you just need to import Dependencies._ in your build.sbt file:
import Dependencies._

ThisBuild / organization := "com.example"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.8"

lazy val backend = (project in file("backend"))
  .settings(
    name := "backend",
    libraryDependencies ++= backendDeps
  )

SBT multi module project: how to make static files (resources) available in main module?

I'm developing a multi-module SBT project. In general, it's an Akka API. It works well when I run it locally and when I package it in Docker.
Recently I added a new module for email template generation. I decided to use Scalate Mustache templates for this purpose. For testing purposes I created a simple template hello.mustache in email/src/main/resources/templates.
Then I ran code which uses the template from a class located in email/src/main/scala, roughly like the sketch below. Everything compiled fine (both the Scalate templates and the Scala code).
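(A hypothetical reconstruction of that rendering code, only to make the setup concrete; the object name and the exact Scalate calls are my assumptions, not taken from the project.)
import org.fusesource.scalate.{TemplateEngine, TemplateSource}
import scala.io.Source

object EmailRenderer {
  private val engine = new TemplateEngine

  def render(name: String): String = {
    // Load the template from the classpath: email/src/main/resources/templates/hello.mustache
    val stream = getClass.getResourceAsStream("/templates/hello.mustache")
    val text = Source.fromInputStream(stream).mkString
    // The ".mustache" suffix of the uri tells Scalate which template language to use
    engine.layout(TemplateSource.fromText("hello.mustache", text), Map("name" -> name))
  }
}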
Then I added a dependency on the email module to the security module, which is included in the app module:
import sbt.Keys._
import NativePackagerHelper._

lazy val core = project.in(file("core"))
  .settings(name := "core")
  .settings(Common.settings)
  .settings(libraryDependencies ++= Dependencies.commonDependencies)
  .enablePlugins(JavaAppPackaging)

lazy val email = project.in(file("email"))
  .settings(name := "email")
  .settings(Common.settings)
  .settings(libraryDependencies ++= Dependencies.emailDependencies)
  .enablePlugins(JavaAppPackaging)

lazy val contacts = project.in(file("contacts"))
  .settings(name := "contacts")
  .settings(Common.settings)
  .dependsOn(core % "test->test;compile->compile")
  .enablePlugins(JavaAppPackaging)

lazy val security = project.in(file("security"))
  .settings(name := "security")
  .settings(Common.settings)
  .dependsOn(email, core % "test->test;compile->compile")
  .enablePlugins(JavaAppPackaging)

lazy val app = project.in(file("."))
  .enablePlugins(JavaAppPackaging, AshScriptPlugin, DockerPlugin)
  .settings(name := "app")
  .settings(Common.settings)
  .dependsOn(core, security, contacts)
  .settings(
    mainClass in Compile := Some("com.app.Main"),
    packageName in Docker := "app-backend",
    version in Docker := "latest",
    dockerBaseImage := "openjdk:8-jre-alpine",
    dockerExposedPorts := Seq(5000)
  )
I see the following error while trying to run the email code:
Exception in thread "main" org.fusesource.scalate.TemplateException: scala.tools.nsc.symtab.classfile.ClassfileParser$unpickler$.unpickle([BILscala/reflect/internal/Symbols$Symbol;Lscala/reflect/internal/Symbols$Symbol;Ljava/lang/String;)V
at org.fusesource.scalate.TemplateEngine.compileAndLoad(TemplateEngine.scala:886)
at org.fusesource.scalate.TemplateEngine.compileAndLoadEntry(TemplateEngine.scala:745)
...
How can I make the email module's code work in other modules?
Additional info:
I run the code directly from the IDE by running the Main class of the app module.
Scala version 2.12.2; Scalate version 1.8.0; sbt version 0.13.8;
I'm afraid you've encountered a binary compatibility issue between several Scala compiler versions. Explicitly overriding the Scala language dependencies like this is the preferred way to avoid such problems:
dependencyOverrides := Set(
  "org.scala-lang" % "scala-library" % scalaVersion.value,
  "org.scala-lang" % "scala-reflect" % scalaVersion.value,
  "org.scala-lang" % "scala-compiler" % scalaVersion.value
),
In my particular case the problem turned out to be a conflict between the log4j version pulled in by Scalate and another Scala library, so the solution that worked for me was:
"org.scalatra.scalate" %% "scalate-core" % "1.8.0" excludeAll(ExclusionRule(organization = "org.slf4j"))

Unable to add scala-reflect as a dependency

I can't add scala-reflect as a dependency. My project/build.sbt looks like this:
//the name of the project, will become the name of the war file when you run the package command.
name := "Test-SBT"
version := "1.0"
//specify which Scala version we are using in this project.
scalaVersion := "2.10.3"
libraryDependencies <++= (scalaVersion)(sv =>
  Seq(
    "org.scala-lang" % "scala-reflect" % "2.10.3",
    "org.scala-lang" % "scala-compiler" % "2.10.3"
  )
)
And project/build.properties
sbt.version=0.13.0
And here is my Main class:
object Main1 extends App {
  import scala.reflect.runtime.universe

  val runtimeMirror = universe.runtimeMirror(getClass.getClassLoader)
  //......
}
It says object runtime is not a member of package reflect. Of course, I did "gen-idea", "clean" and other things. What's up with that?
Guessing here, due to the question by @laughedelic.
The build.sbt should be in the project root, not in project/. Assuming the project you are writing is in test-sbt, you should end up with a structure similar to:
test-sbt/build.sbt
test-sbt/project
Otherwise the build.sbt is used for the "internal compile project" (the build definition itself) that sbt builds recursively, not for your own project.
A deeper explanation can be found in sbt's docs under "sbt is recursive".
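A minimal sketch of the corrected layout (the contents mirror the question; using scalaVersion.value instead of repeating the literal version is my own simplification of the original <++= form):
// test-sbt/build.sbt  -- note: NOT test-sbt/project/build.sbt
name := "Test-SBT"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-reflect" % scalaVersion.value,
  "org.scala-lang" % "scala-compiler" % scalaVersion.value
)

// test-sbt/project/build.properties stays where it is:
// sbt.version=0.13.0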