scalaJSModuleKind := ModuleKind.CommonJSModule - cannot invoke main method anymore :( - scala.js

I am trying to build a new facade that uses a lot of JSImport annotations. I wanted to put it in a subfolder of a project I am currently working on, so I can improve it as I go.
Before, my root build.sbt looked like this for the Scala.js part:
lazy val client = (project in file("modules/client"))
  .enablePlugins(ScalaJSPlugin, ScalaJSWeb)
  .settings(generalSettings: _*)
  .settings(
    name := "client",
    libraryDependencies += CrossDependencies.scalaTags,
    persistLauncher := true
  )
Now I added scalaJSModuleKind := ModuleKind.CommonJSModule, which is incompatible with the persistLauncher setting, so I removed persistLauncher := true.
Of course my view could then no longer just include client-launcher.js, so I tried to wrap the main method call manually, like this:
<script type="text/javascript">
tld.test.Test().main()
</script>
Now, this does NOT work if scalaJSModuleKind := ModuleKind.CommonJSModule is added to my build.sbt. If I remove that setting, everything works just fine.
This is my Test object:
package tld.test

import org.scalajs.dom

import scala.scalajs.js.JSApp

object Test extends JSApp {
  import scalatags.JsDom.all._

  def main(): Unit = {
    // Add js script dynamically
    val s = script(
      "alert('Hello World!')"
    )
    dom.document.getElementsByTagName("head")(0).appendChild(s.render)
  }
}
Now, if I remove that ModuleKind setting, an alert pops up with 'Hello World', but if it's there, nothing happens. What is causing this and how can I prevent it?
Edit:
After the answer from @sjrd I tried the following:
plugins.sbt:
addSbtPlugin("ch.epfl.scala" % "sbt-scalajs-bundler" % "0.5.0")
addSbtPlugin("ch.epfl.scala" % "sbt-web-scalajs-bundler" % "0.5.0")
build.sbt:
lazy val client = (project in file("modules/client"))
  .enablePlugins(ScalaJSBundlerPlugin, ScalaJSWeb) // ScalaJSBundlerPlugin automatically enables ScalaJSPlugin
  .settings(generalSettings: _*)
  .settings(
    name := "client"
    , libraryDependencies += CrossDependencies.scalaTags
    //, scalaJSModuleKind := ModuleKind.CommonJSModule // ScalaJSBundlerPlugin already sets moduleKind to CommonJSModule
  )
lazy val server = (project in file("modules/server"))
  .enablePlugins(PlayScala, WebScalaJSBundlerPlugin)
  .settings(generalSettings: _*)
  .settings(
    name := "server"
    , libraryDependencies ++= Seq(
        CrossDependencies.scalaTest,
        CrossDependencies.scalactic,
        CrossDependencies.scalaTags,
        "com.typesafe.play" %% "play-json" % "2.6.0-M1")
    , scalaJSProjects := Seq(client)
    , pipelineStages in Assets := Seq(scalaJSPipeline)
    //, pipelineStages := Seq(digest, gzip)
    , compile in Compile := ((compile in Compile) dependsOn scalaJSPipeline).value
  )
But during compilation I get:
ERROR in ./fastopt-launcher.js
[info] Module not found: Error: Cannot resolve 'file' or 'directory' /home/sorona/scalajstestbed/modules/client/target/scala-2.12/scalajs-bundler/main/client-fastopt.js in /home/sorona/scalajstestbed/modules/client/target/scala-2.12/scalajs-bundler/main
Edit: The solution is to include client-fastopt-bundle.js instead, et voilà.

Changing the module kind significantly changes the shape of the output file, including its external "specification". In particular, it is no longer a script that can be embedded in a Web page. Instead, it is a CommonJS module.
To be able to include it in a Web page, you will need to bundle it. The best way to do so is to use scalajs-bundler.

Related

sbt - deep child modules

I'm new to sbt and I want to reproduce a complex project structure with many nested modules.
For example, I have the following structure:
.
|_ build.sbt
|_ web
|  |_ api
|  |_ dto
|_ domain
build.sbt is as follows:
name := "myProject"
version := "1.0"
scalaVersion := "2.12.4"
resolvers += Resolver.sonatypeRepo("public")
libraryDependencies += "com.typesafe.play" %% "play" % "2.6.10"
lazy val commonSettings = Seq(
organization := "com.example",
version := "0.1",
scalaVersion := "2.12.4"
)
// root module
lazy val root = (project in file("."))
.aggregate(domain, web)
// domain module
lazy val domain = project.settings(commonSettings)
// web module
lazy val web = project.settings(
commonSettings,
libraryDependencies := Seq("com.typesafe.play" %% "play" % "2.6.10"),
name := "myproj-web"
).dependsOn(domain)
// web api module
lazy val webApi = (project in file("./web/api")).settings(
commonSettings,
libraryDependencies := Seq("com.typesafe.play" %% "play" % "2.6.10"),
name := "myproj-web-api"
).dependsOn(domain)
The first problem is that I can't access my libraries in web/api, though I can in web/.
The second problem is that I don't like file("./web/api"). Is it possible to make sbt understand nested folders the way it understands top-level folders (like web or domain)?
Also, is it then possible to have a build.sbt per module? For example, for web to contain the build file for api and dto, while preserving aggregation and the ability to run the build only on the root project and have all the other projects built as well.
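Not an authoritative answer, but for illustration here is a minimal sketch of one common way to wire this up; the project names and the dto module below are assumptions based on the structure above. sbt has no special handling for nested folders, so a nested module still needs an explicit file(...) path (the ./ prefix can simply be dropped), and web can aggregate its children so that building web builds them too:
// root build.sbt (sketch, assumed names): nested modules still need explicit paths
lazy val webApi = (project in file("web/api"))          // nested path, no "./" needed
  .settings(commonSettings, name := "myproj-web-api")
  .dependsOn(domain)

lazy val webDto = (project in file("web/dto"))          // hypothetical dto module
  .settings(commonSettings, name := "myproj-web-dto")

lazy val web = project
  .settings(commonSettings, name := "myproj-web")
  .aggregate(webApi, webDto)                            // building web also builds api and dto
  .dependsOn(domain)
As for per-module build files: as far as I know, project declarations like these have to stay in the root build.sbt (or in project/*.scala), but each module directory may contain its own build.sbt that contributes plain settings to that module, and aggregation from the root keeps working.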

how to declare and use object in this build.sbt?

I am trying to understand how to write a well-written build.sbt file, and followed a tutorial on YouTube. In that tutorial an object is created similar to the one below, but what I have written gives the error shown.
What am I doing wrong?
I have tried to remove blank lines between imports and the object declaration without any change.
import sbt._
import sbt.Keys._

object BuildScript extends Build {
  lazy val commonSettings = Seq(
    organization := "me",
    version := "0.1.0",
    scalaVersion := "2.11.4"
  )

  lazy val root = (project in file(".")).
    settings(commonSettings: _*).
    settings(
      name := "deepLearning",
      libraryDependencies += "org.deeplearning4j" % "deeplearning4j-core" % "0.4-rc3.4"
    )
}
error message:
error: illegal start of simple expression
object BuildScript extends Build {
^
[error] Error parsing expression. Ensure that there are no blank lines within a setting.
I think this thread actually explains it: What is the difference between build.sbt and build.scala?
I think your tutorial was out of date. Using a recent version of sbt (0.13+ or so) you really want to do this:
lazy val commonSettings = Seq(
  organization := "me",
  version := "0.1.0",
  scalaVersion := "2.11.4"
)

lazy val root = (project in file(".")).
  settings(commonSettings).
  settings(
    name := "deepLearning",
    libraryDependencies += "org.deeplearning4j" % "deeplearning4j-core" % "0.4-rc3.4"
  )
If your project doesn't have any subprojects, though, the commonSettings val is somewhat superfluous, and you can just inline it:
lazy val root = (project in file(".")).
  settings(
    organization := "me",
    name := "deepLearning",
    version := "0.1.0",
    scalaVersion := "2.11.4",
    libraryDependencies += "org.deeplearning4j" % "deeplearning4j-core" % "0.4-rc3.4"
  )
If you do have subprojects, and wind up with a lot of common settings, you may want to pull them out into an autoplugin, but that's a more advanced concept.
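For reference, a minimal sketch of what such an autoplugin might look like (the plugin name and the settings it carries are illustrative, not part of the answer). It would live under project/, e.g. project/CommonSettingsPlugin.scala, and apply the shared settings to every subproject automatically:
import sbt._
import sbt.Keys._

// Hypothetical plugin: applies shared settings to all subprojects in the build.
object CommonSettingsPlugin extends AutoPlugin {
  // Enable on every project without requiring an explicit .enablePlugins(...)
  override def trigger = allRequirements

  override def projectSettings: Seq[Setting[_]] = Seq(
    organization := "me",
    version := "0.1.0",
    scalaVersion := "2.11.4"
  )
}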
There are two ways to fix this:
Keep object BuildScript extends Build { ... } and use the .scala extension for the build file (i.e. put it under project/). I think this is the recommended long-term sbt build file style.
Change the build definition to gregsymons's answer and keep the .sbt extension.

Adding module dependency information in sbt's build.sbt file

I have a multi-module project in IntelliJ in which the contextProcessor module depends on the contextSummary module.
IntelliJ takes care of everything once I set up the dependencies in Project Structure.
However, when I run sbt test with the following setup in build.sbt, I get an error complaining that it can't find the packages in the contextSummary module.
name := "contextProcessor"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
How can I teach sbt where to find the missing modules?
I could use the build.sbt file in the main root directory.
lazy val root = (project in file(".")).aggregate(contextSummary, contextProcessor)
lazy val contextSummary = project
lazy val contextProcessor = project.dependsOn(contextSummary)
Reference: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html
For testing only one project, I can use the project command in sbt:
> sbt
[info] Set current project to root (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> project contextProcessor
[info] Set current project to contextProcessor (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> test
For batch mode (as in How to pass command line args to program in SBT 0.13.1?):
sbt "project contextProcessor" test
I think a simple build.sbt might not be enough for that.
You would need to create a more sophisticated project/Build.scala like this:
import sbt._
import sbt.Keys._

object Build extends Build {
  lazy val root = Project(
    id = "root",
    base = file("."),
    aggregate = Seq(module1, module2)
  )

  lazy val module1 = Project(
    id = "module1",
    base = file("module1-folder"),
    settings = Seq(
      name := "Module 1",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )

  lazy val module2 = Project(
    id = "module2",
    base = file("module2-folder"),
    dependencies = Seq(module1),
    settings = Seq(
      name := "Module 2",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )
}

How to set up jacoco4sbt to process classes in main and submodules in Play?

I'm having some problems making jacoco4sbt work with my Play 2.3.4 project.
My project is composed of 3 submodules, common, api and frontend, and has no code in the root project's app folder. Now when I run Jacoco it does not find the submodules' classes.
Inspecting target/scala-VERSION/classes I only find some routing classes (which is in fact the only code I have in my "root" project; I was expecting the classes to be there because I aggregate all those projects).
If I copy the classes from MODULE_NAME/target/scala-VERSION/classes to target/scala-VERSION/classes and then run Jacoco, I get the expected result.
So what is the best way to make it work? I can't find any config in jacoco4sbt to specify additional class locations.
My build.sbt file
import Keys._
// Dummy value to deal with bug in sbt 0.13.5
val k = 0
name := "PlayApp"
version := "0.5.0"
// omitted resolvers part
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  "com.edulify" %% "play-hikaricp" % "1.5.0" exclude("com.jolbox", "bonecp"),
  "com.novocode" % "junit-interface" % "0.11" % "test"
)
lazy val common = project.in(file("common")).enablePlugins(PlayJava)
lazy val frontend = project.in(file("frontend")).enablePlugins(PlayJava).dependsOn(common)
lazy val api = project.in(file("api")).enablePlugins(PlayJava).dependsOn(common)
lazy val main = project.in(file(".")).enablePlugins(PlayJava)
  .aggregate(frontend, api).dependsOn(frontend, api)
parallelExecution in Test := false
javaOptions in Test += "-Dconfig.resource=test.conf"
jacoco.sbt
import de.johoop.jacoco4sbt._
import JacocoPlugin._
jacoco.settings
Keys.fork in jacoco.Config := true
parallelExecution in jacoco.Config := false
jacoco.outputDirectory in jacoco.Config := file("target/jacoco")
jacoco.reportFormats in jacoco.Config := Seq(XMLReport("utf-8"), HTMLReport("utf-8"))
jacoco.excludes in jacoco.Config := Seq("views*", "*Routes*", "controllers*routes*", "controllers*Reverse*", "controllers*javascript*", "controller*ref*")
javaOptions in jacoco.Config += "-Dconfig.resource=test.conf"
Add jacoco.sbt to every subproject with the following content:
jacoco.settings
P.S. I've been looking for ways to have jacoco.settings applied to every subproject from the top-level root build.sbt, but to no avail.
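One partial workaround, sketched here as an assumption rather than a tested solution (and it still means touching every project definition): since jacoco.settings is just a sequence of settings, it can be appended to each subproject in the root build.sbt instead of dropping a jacoco.sbt file into every module directory:
import de.johoop.jacoco4sbt._
import JacocoPlugin._

// Sketch only: append the jacoco settings to each subproject explicitly.
lazy val common   = project.in(file("common")).enablePlugins(PlayJava)
  .settings(jacoco.settings: _*)
lazy val frontend = project.in(file("frontend")).enablePlugins(PlayJava)
  .dependsOn(common).settings(jacoco.settings: _*)
lazy val api      = project.in(file("api")).enablePlugins(PlayJava)
  .dependsOn(common).settings(jacoco.settings: _*)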

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands which will compile only some files inside this project. The testScalalib command added above, for instance, is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(_ filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations, like compile and test, with the appropriate settings values to suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and a Build.scala (you could of course do the same in build.sbt, or in an older sbt version with different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile everything will get compiled, but when you call api:compile only the classes matching the filter predicate will be compiled.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter.
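To make that last suggestion concrete, here is a small sketch (an assumption, not code from the answer) of what a filter-based variant could look like, added alongside the other settings inside the inConfig(Api)(...) block:
// Sketch: exclude Typers.scala by file name rather than filtering unmanagedSources
// by hand; path-based exclusions such as scalalibrary/ are better expressed by
// adjusting unmanagedSourceDirectories instead.
excludeFilter in unmanagedSources := HiddenFileFilter || "Typers.scala"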