sbt sometimes takes long to compile (super long..) - scala

Currently I'm running a fairly mid-sized project with the Play Framework and sbt/Scala. Sometimes, though, it takes a very long time to compile: 4 classes can take something like 120 seconds, while most of the time the whole project compiles in just a few seconds!
Mostly the slowdown seems to hit a handful of files when I refactor things inside my database access and service layers, i.e. when I add private/public functions of Slick code without explicit return types.
Is there anything I can do about this?
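To show what I mean by Slick code without explicit return types, here is a trimmed-down, made-up example in the style of my repositories (the table and all names are invented):
import slick.driver.PostgresDriver.api._   // Slick 3.0 style, matching play-slick 1.0.0

// a made-up table, just to show the shape of the code I mean
class Users(tag: Tag) extends Table[(Long, String, Boolean)](tag, "users") {
  def id     = column[Long]("id", O.PrimaryKey)
  def name   = column[String]("name")
  def active = column[Boolean]("active")
  def *      = (id, name, active)
}

class UserRepository {
  private val users = TableQuery[Users]

  // the style that seems to compile slowly: the return type is left to inference
  private def findActive = users.filter(_.active).result

  // the annotated variant I could switch to
  private def findActiveAnnotated: DBIO[Seq[(Long, String, Boolean)]] =
    users.filter(_.active).result
}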
It just looks like sbt is hanging while doing:
[info] Loading project definition from /Users/schmitch/projects/envisia/envisia-erp-loki/project
[info] Set current project to envisia-erp-loki (in build file:/Users/schmitch/projects/envisia/envisia-erp-loki/)
[info] Compiling 4 Scala sources to /Users/schmitch/projects/envisia/envisia-erp-loki/modules/utils/target/scala-2.11/classes...
where utils is an sbt subproject. Any ideas why it takes so long, and why it only happens sometimes?
Here is my build.sbt:
name := "envisia-erp-loki"
version := "1.0.1-SNAPSHOT"
lazy val utils = project in file("modules/utils")
lazy val migrator = (project in file("modules/migrator"))
  .enablePlugins(PlayScala).disablePlugins(PlayLayoutPlugin)
  .dependsOn(utils).aggregate(utils)
lazy val codegen = (project in file("modules/codegen"))
  .dependsOn(utils).aggregate(utils)
lazy val auth = (project in file("modules/auth"))
  .enablePlugins(PlayScala).disablePlugins(PlayLayoutPlugin)
  .dependsOn(utils).aggregate(utils)
lazy val pdfgen = project in file("modules/pdfgen")
lazy val root = (project in file("."))
  .enablePlugins(PlayScala, DebianPlugin)
  .dependsOn(utils).aggregate(utils)
  .dependsOn(auth).aggregate(auth)
  .dependsOn(pdfgen).aggregate(pdfgen)
scalaVersion in ThisBuild := "2.11.6"
libraryDependencies ++= Seq(
  filters,
  cache,
  ws,
  specs2 % Test,
  "com.typesafe.play" %% "play-slick" % "1.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "1.0.0",
  "com.typesafe.play" %% "play-mailer" % "3.0.1",
  "org.elasticsearch" % "elasticsearch" % "1.5.2",
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
  "com.chuusai" %% "shapeless" % "2.2.0",
  "com.nulab-inc" %% "play2-oauth2-provider" % "0.15.0",
  "io.github.nremond" %% "pbkdf2-scala" % "0.4",
  "com.google.code.findbugs" % "jsr305" % "3.0.0"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers ++= Seq(
  Resolver.sonatypeRepo("releases"),
  Resolver.sonatypeRepo("snapshots")
)
// Faster compilations:
sources in(Compile, doc) := Seq.empty
publishArtifact in(Compile, packageDoc) := false
// "com.googlecode.java-diff-utils" % "diffutils" % "1.3.0"
//pipelineStages := Seq(rjs)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
JsEngineKeys.engineType := JsEngineKeys.EngineType.Node
// Do not add API Doc
sources in(Compile, doc) := Seq.empty
publishArtifact in(Compile, packageDoc) := false
// RpmPlugin
maintainer in Linux := "Christian Schmitt <c.schmitt#envisia.de>"
packageSummary in Linux := "Envisia ERP Server 3.0"
packageDescription := "This is the new Envisia ERP Server it will be as standalone as possible"
import com.typesafe.sbt.packager.archetypes.ServerLoader.Systemd
serverLoading in Debian := Systemd
scalacOptions in ThisBuild ++= Seq(
  "-target:jvm-1.8",
  "-encoding", "UTF-8",
  "-deprecation", // warning and location for usages of deprecated APIs
  "-feature", // warning and location for usages of features that should be imported explicitly
  "-unchecked", // additional warnings where generated code depends on assumptions
  "-Xlint", // recommended additional warnings
  "-Ywarn-adapted-args", // warn if an argument list is modified to match the receiver
  "-Ywarn-value-discard", // warn when non-Unit expression results are unused
  "-Ywarn-inaccessible",
  "-Ywarn-dead-code"
)
updateOptions in ThisBuild := updateOptions.value.withCachedResolution(true)

Related

How to write Intellij IDEA plugin for scala macros

I wrote a macro that works after compilation, but the problem is that IntelliJ IDEA doesn't see my generated code and red error highlights appear. I found an explanation here that I need to write an IDEA plugin so that IDEA recognizes my generated code. The problem is that I cannot use SyntheticMembersInjector because of a missing dependency. Is it possible to write an IDEA plugin for my own Scala macros?
my plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.0.2")
addSbtPlugin("org.jetbrains" % "sbt-idea-plugin" % "3.8.4")
my build.sbt:
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport.dockerExposedPorts
import sbt.Keys.{scalacOptions, version}
lazy val coreProject = (project in file("."))
  .enablePlugins(JavaAppPackaging, DockerPlugin, AkkaGrpcPlugin)
  .settings(
    scalaVersion := "2.12.12",
    name := "CDMS",
    version := "0.1",
    libraryDependencies ++= BuildConfig.projectDependencies,
    dockerBaseImage := "adoptopenjdk/openjdk15:alpine",
    dockerExposedPorts += 9002
  )
  .dependsOn(validationProject)
lazy val validationProject = (project in file("validation"))
  .enablePlugins(SbtPlugin)
  .settings(
    scalaVersion := "2.12.12",
    sbtPlugin := true,
    libraryDependencies ++= BuildConfig.monocleDependencies
  )

How to set the Docker Registry with sbt-native-packager

I'm trying to build a Docker image using sbt-native-packager with the following build.sbt (trying to publish the image to a local repository)
val sparkVersion = "2.4.5"
scalaVersion in ThisBuild := "2.12.0"
val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion
)
// JAR build settings
lazy val commonSettings = Seq(
  organization := "dzlab",
  version := "0.1",
  scalaSource in Compile := baseDirectory.value / "src",
  scalaSource in Test := baseDirectory.value / "test",
  resourceDirectory in Test := baseDirectory.value / "test" / "resources",
  javacOptions ++= Seq(),
  scalacOptions ++= Seq(
    "-deprecation",
    "-feature",
    "-language:implicitConversions",
    "-language:postfixOps"
  ),
  libraryDependencies ++= sparkLibs
)
// Docker Image build settings
dockerBaseImage := "gcr.io/spark-operator/spark:v" + sparkVersion
lazy val root = (project in file("."))
  .enablePlugins(
    DockerPlugin,
    JavaAppPackaging
  )
  .settings(
    name := "spark-k8s",
    commonSettings,
    dockerAliases ++= Seq(
      dockerAlias.value.withRegistryHost(Some("localhost:5000"))
    ),
    mainClass in (Compile, run) := Some("dzlab.SparkJob")
  )
SBT and the packager versions
$ cat project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
$ cat project/build.properties
sbt.version=0.13.18
When I try to run the packager
$ sbt docker:publish
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
/Users/dzlab/Projects/spark-k8s/build.sbt:39: error: not found: value dockerAliases
dockerAliases ++= Seq(
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
It does not recognize dockerAliases, and I'm not sure why, as it is part of the publishing settings.
What is the proper way to set the Docker registry?
Your sbt-native-packager version is hopelessly outdated, as is your sbt version. That SettingKey doesn't exist in that version.
Compare: sbt-native-packager 1.0 vs. sbt-native-packager 1.7.4
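As a rough sketch of the upgrade (version numbers are only examples, check for the latest releases; recent sbt-native-packager releases require sbt 1.x, so both files change), after which the dockerAliases block from your build resolves, and there is also the simpler dockerRepository setting:
project/build.properties:
sbt.version=1.3.13

project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.4")

build.sbt -- either keep the alias override from your build...
dockerAliases ++= Seq(
  dockerAlias.value.withRegistryHost(Some("localhost:5000"))
)
// ...or point the plugin at the registry directly
dockerRepository := Some("localhost:5000")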

SBT, how to add unmanaged JARs to IntelliJ?

I have build.sbt file:
import sbt.Keys.libraryDependencies
lazy val scalatestVersion = "3.0.4"
lazy val scalaMockTestSupportVersion = "3.6.0"
lazy val typeSafeConfVersion = "1.3.2"
lazy val scalaLoggingVersion = "3.7.2"
lazy val logbackClassicVersion = "1.2.3"
lazy val commonSettings = Seq(
  organization := "com.stulsoft",
  version := "0.0.1",
  scalaVersion := "2.12.4",
  scalacOptions ++= Seq(
    "-feature",
    "-language:implicitConversions",
    "-language:postfixOps"),
  libraryDependencies ++= Seq(
    "com.typesafe.scala-logging" %% "scala-logging" % scalaLoggingVersion,
    "ch.qos.logback" % "logback-classic" % logbackClassicVersion,
    "com.typesafe" % "config" % typeSafeConfVersion,
    "org.scalatest" %% "scalatest" % scalatestVersion % "test",
    "org.scalamock" %% "scalamock-scalatest-support" % scalaMockTestSupportVersion % "test"
  )
)
unmanagedJars in Compile += file("lib/opencv-331.jar")
lazy val pimage = project.in(file("."))
  .settings(commonSettings)
  .settings(
    name := "pimage"
  )
parallelExecution in Test := true
It works fine if I use sbt run, but I cannot run it from IntelliJ.
I receive this error:
java.lang.UnsatisfiedLinkError: no opencv_java331 in java.library.path
I can add the library manually (File -> Project Structure -> Libraries -> + the necessary dir).
My question is: is it possible to specify in build.sbt that the generated IntelliJ project automatically includes the specified library?
I would say try to drag and drop the dependency into /lib, which should be in the root directory of your project; if it's not there, create it.
Run commands:
sbt reload
sbt update
Lastly you could try something like:
File -> Project Structure -> Modules -> mark all the modules (usually 1 to 3) and delete them (don't worry, this won't delete your files) -> hit the green plus sign and select Import Module -> select the root directory of your project, and it should then refresh.
If none of these help, I'm out of ideas.
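If the setting still gets lost on re-import, one more thing you could try, as a sketch: reuse your existing unmanagedJars line but scope it to the project, so the sbt import in IntelliJ carries it into the module:
lazy val pimage = project.in(file("."))
  .settings(commonSettings)
  .settings(
    name := "pimage",
    // same jar you already reference, just declared inside the project's settings
    unmanagedJars in Compile += baseDirectory.value / "lib" / "opencv-331.jar"
  )
Note also that the UnsatisfiedLinkError itself complains about the native opencv_java331 library rather than the jar, so the IntelliJ run configuration may additionally need -Djava.library.path pointing at the directory containing that native library.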

Can't import from CrossType.Pure sbt project in Scala

I'm trying to make a Play Framework project with Scala.js on the frontend and one shared project. My sbt configuration is:
import sbt.Project.projectToRef
lazy val scalaV = "2.11.8"
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .settings(
    scalaVersion := scalaV,
    libraryDependencies ++= Seq(
      "com.mediamath" %%% "scala-json" % "1.0"
    ),
    resolvers += "mmreleases" at "https://artifactory.mediamath.com/artifactory/libs-release-global",
    addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
  )
  // set up settings specific to the JS project
  .jsConfigure(_ enablePlugins ScalaJSPlay)
lazy val sharedJVM = shared.jvm.settings(name := "sharedJVM")
lazy val sharedJS = shared.js.settings(name := "sharedJS")
lazy val root = (project in file(".")).settings(
  scalaVersion := scalaV,
  scalaJSProjects := jsProjects,
  pipelineStages := Seq(scalaJSProd, gzip),
  routesGenerator := InjectedRoutesGenerator,
  scalikejdbcSettings,
  libraryDependencies ++= Seq(
    jdbc,
    cache,
    ws,
    evolutions,
    "org.scalatestplus.play" %% "scalatestplus-play" % "1.5.1" % Test,
    "mysql" % "mysql-connector-java" % "5.1.39",
    "com.vmunier" % "play-scalajs-scripts_2.11" % "0.5.0"
  ),
  resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
).
  enablePlugins(PlayScala).
  aggregate(jsProjects.map(projectToRef): _*)
lazy val jsProjects = Seq(js)
lazy val js = (project in file("client")).settings(
  scalaVersion := scalaV,
  persistLauncher := true,
  persistLauncher in Test := false,
  autoCompilerPlugins := true,
  scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature"),
  libraryDependencies ++= Seq(
    "org.scala-js" %%% "scalajs-dom" % "0.9.0",
    "com.mediamath" %%% "scala-json" % "1.0"
  ),
  resolvers += "mmreleases" at "https://artifactory.mediamath.com/artifactory/libs-release-global",
  resolvers += Resolver.sonatypeRepo("releases"),
  addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
).enablePlugins(ScalaJSPlugin, ScalaJSPlay)
Everything works fine, but the problem is that I can't import anything from the shared project in either the Scala.js or the Play Framework project. Here is how my shared project structure looks:
And here is how I'm trying to import it:
import services.Encryptor
At compile time I get this error:
[error] not found: object services
[error] import services.Encryptor
How can this issue be fixed?
First of all, never ever (!) do this:
lazy val sharedJVM = shared.jvm.settings(name := "sharedJVM")
lazy val sharedJS = shared.js.settings(name := "sharedJS")
This creates new projects that are picked up by sbt, so the cross project does not hold the right projects anymore. See docs for details.
Instead, use jsSettings and jvmSettings:
(crossProject.crossType(CrossType.Pure) in file("shared"))
// snip
.jsSettings(name := "sharedJS")
.jvmSettings(name := "sharedJVM")
lazy val sharedJVM = shared.jvm
lazy val sharedJS = shared.js
In your build, it seems that your js project does not depend on the shared project. So of course the shared project's contents are not available.
You need to:
lazy val js = (project in file("client"))
// snip
.dependsOn(shared.js)
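Putting it together, a sketch of the relevant parts (everything else elided; I'm also assuming the Play project needs the JVM half via dependsOn in the same way):
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .settings(scalaVersion := scalaV /* libraryDependencies, resolvers, ... */)
  .jsSettings(name := "sharedJS")
  .jvmSettings(name := "sharedJVM")
  .jsConfigure(_ enablePlugins ScalaJSPlay)

lazy val sharedJVM = shared.jvm
lazy val sharedJS = shared.js

lazy val js = (project in file("client"))
  .settings(scalaVersion := scalaV /* ... */)
  .enablePlugins(ScalaJSPlugin, ScalaJSPlay)
  .dependsOn(sharedJS)   // makes the shared sources visible to the client

lazy val root = (project in file("."))
  .settings(scalaVersion := scalaV /* scalaJSProjects, pipelineStages, ... */)
  .enablePlugins(PlayScala)
  .aggregate(projectToRef(js))
  .dependsOn(sharedJVM)  // and to the Play server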

How to add input files in sbt Scala

I am new to Scala. I am using sbt-assembly to create a fat JAR. My program reads input files, which I keep under the src/main/resources folder, but I am getting a java.io.FileNotFoundException.
I don't know how to specify the path. I will be deploying the JAR on the server.
Here is my sbt build file:
lazy val commonSettings = Seq(
  organization := "com.insnapinc",
  version := "0.1.0",
  scalaVersion := "2.11.4"
)
lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "memcache-client"
  )
libraryDependencies ++= Seq (
  "org.scalaj" %% "scalaj-http" % "1.1.4"
  ,"org.json4s" %% "json4s-native" % "3.2.10"
  ,"org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)
/* assembly plugin */
mainClass in AssemblyKeys.assembly := Some("com.insnap.memcache.MemcacheTest")
assemblySettings
test in AssemblyKeys.assembly := {}
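One detail worth spelling out: once a file under src/main/resources is packed into the assembled JAR, it is no longer a plain file on disk, so it has to be read from the classpath rather than via a file path. A minimal sketch (the resource name is made up):
import scala.io.Source

object ResourceReader {
  // files under src/main/resources sit at the root of the jar's classpath
  def readLines(name: String): List[String] = {
    val stream = getClass.getResourceAsStream(s"/$name") // e.g. "input.txt"
    try Source.fromInputStream(stream).getLines().toList
    finally stream.close()
  }
}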