Unable to deploy Scala web application - scala

I'm trying to deploy a simple web application built using the Play framework with Scala. The application works fine when I run it with the sbt run command. However, when I try to deploy it to my local server using the sbt dist command, I get the following message:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.inject.internal.cglib.core.$ReflectUtils$1 (file:/D:/Scala/SomeDomain/SomeProject/target/universal/SomeProject-1.0-SNAPSHOT/lib/com.google.inject.guice-4.2.2.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Oops, cannot start the server.
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error injecting constructor, java.io.IOException: Dictionary directory does not exist: D:\Scala\SomeDomain\SomeProject\target\universal\SomeProject-1.0-SNAPSHOT\bin\dict
at initializer.ServiceInitializer.<init>(ServiceInitializer.scala:11)
at initializer.ApplicationInitializer.configure(ApplicationInitializer.scala:12) (via modules: com.google.inject.util.Modules$OverrideModule -> initializer.ApplicationInitializer)
while locating initializer.ServiceInitializer
I'm using Windows. I extracted the generated .zip file and executed the .bat file from the /bin directory.
build.sbt
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala, LauncherJarPlugin)
scalaVersion := "2.12.10"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "5.0.0" % Test
libraryDependencies += "postgresql" % "postgresql" % "9.1-901-1.jdbc4"
plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.0")
addSbtPlugin("org.foundweekends.giter8" % "sbt-giter8-scaffold" % "0.11.0")

It seems like some component of your application expects a directory to exist:
java.io.IOException: Dictionary directory does not exist: D:\Scala\SomeDomain\SomeProject\target\universal\SomeProject-1.0-SNAPSHOT\bin\dict
Make sure the directory is created, or add the missing files to your dist.
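If the dict directory needs to ship with the application, one option is to add it to the Universal package via sbt-native-packager, which Play's dist task uses. A minimal sketch, assuming the files live in a dict folder at the project root:
import com.typesafe.sbt.packager.MappingsHelper._
// copy the project-root dict folder into the generated dist as dict/
Universal / mappings ++= directory(baseDirectory.value / "dict")
Note that the stack trace shows the application resolving dict relative to the bin directory, so you may need to adjust either the target path of the mapping or the path the application uses.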

Related

Encounter sbt.librarymanagement.ResolveException

I am fairly new to Scala and sbt and encountered an issue when attempting to compile my Scala project. I have one hypothesis as to why I am seeing the error below:
The path to the dependency on my local machine is configured incorrectly. It is currently "/Users/jutay/.ivy2/localorg.apache.spark/....." and there should be a "/" after "local", which would make the path look like ".../.ivy2/local/org.apache.spark/...".
But I am unsure whether this is truly the case.
I did some research and found the following online examples, which show that others who encountered the same sbt.librarymanagement.ResolveException have a similarly formed path, e.g. ".../.ivy2/localnet.cakesolutions/...", so it seems that the missing "/" in the path is not what is causing the problem.
But in the event that it is indeed the cause, where can I make the config change to add a "/" after the string "local"?
https://users.scala-lang.org/t/help-with-this-package-downloading-error/8257
sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core:3.0.1
If there is no issue with the path above and it is correct and expected, what am I currently doing wrong or missing that leads sbt compile to fail with the error below? It seems unable to download this dependency:
Error downloading org.apache.spark:spark-corcle_2.12:3.3.0
Note: I am currently using IntelliJ to work on this Scala sbt project.
I am kind of stuck on this issue.
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-corcle_2.12:3.3.0
[error]   Not found
[error]   Not found
[error]   not found: /Users/jutay/.ivy2/localorg.apache.spark/spark-corcle_2.12/3.3.0/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/spark/spark-corcle_2.12/3.3.0/spark-corcle_2.12-3.3.0.pom
[error] Total time: 2 s, completed 28 Nov, 2022 5:27:51 PM
This is my build.sbt file:
name := "proj123"
version := "3.0"
scalaVersion := "2.12.15"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-corcle" % "3.3.0",
"org.apache.spark" %% "spark-sql" % "3.3.0",
"com.google.code.gson" % "gson" % "2.2.4",
"com.google.cloud.spark" %% "spark-bigquery-with-dependencies" % "0.26.0" % "provided",
"com.typesafe" % "config" % "1.3.2",
"com.jcraft" % "jsch" % "0.1.55",
"org.springframework" % "spring-web" % "5.3.18",
"org.apache.httpcomponents" % "httpcore" % "4.4.15",
"org.apache.httpcomponents" % "httpclient" % "4.5.13",
"com.google.code.gson" % "gson" % "2.8.9"
)
I have tried looking up possible solutions online via Google related to sbt.librarymanagement.ResolveException, but I did not find any solution that was helpful.
I tried checking my .bash_profile and my sbtconfig.txt to see whether there are any config options there to edit the path (by adding a "/") as a possible way to resolve the issue, but there seem to be no such options.
This is my .bash_profile :
# Configuration for node to trust the xxxx Proxy Certificates
export NODE_EXTRA_CA_CERTS='/usr/local/etc/openssl/certs/xxxx_proxy_cacerts.pem'
# Configuration to load nvm - node version manager
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
# >>> coursier install directory >>>
export SPARK_HOME=/Users/jutay/Documents/spark-3.3.0-bin-hadoop3
export JAVA_HOME=/Library/Java/JavaVirtualMachines/zulu-8.jdk/Contents/Home
export SBT_HOME=/Users/jutay/Downloads/sbt
export PATH=$SPARK_HOME/bin:/$JAVA_HOME/bin:/$SBT_HOME/bin:$PATH
# <<< coursier install directory <<<
and this is the sbtconfig.txt inside the sbt folder which my .bash_profile references:
# sbt configuration file for Windows
# Set the java args
#-mem 1024 was added in sbt.bat as default
#-Xms1024m
#-Xmx1024m
#-Xss4M
#-XX:ReservedCodeCacheSize=128m
# Set the extra sbt options
# -Dsbt.log.format=true

value `versionReconciliation` not found in set

I have an existing Scala sbt project (sbt version 1.4.7). I want to make dependency checking more strict according to the following article: https://www.scala-lang.org/2019/10/17/dependency-management.html
I've added the following configuration to my build.sbt:
versionReconciliation ++= Seq(
"org.typelevel" %% "cats-core" % "relaxed", // "semver" reconciliation is also available
"*" % "*" % "strict"
)
But I got the error: error: not found: value versionReconciliation
My plugins.sbt is empty. sbt was installed via SDKMAN.
As the article you reference states, this approach requires sbt-coursier, i.e. you need to add addSbtPlugin("io.get-coursier" % "sbt-coursier" % "2.0.0-RC6-8") to your plugins.sbt to be able to use versionReconciliation. Without the plugin you should be able to use the conflictManager key instead.
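Concretely, that means adding the plugin line to project/plugins.sbt (the coordinates are the ones quoted above; check that the version suits your sbt 1.4.x setup) and keeping the versionReconciliation block in build.sbt. A sketch of the alternative without the plugin, using stock sbt conflict management:
// project/plugins.sbt (only if you want versionReconciliation)
addSbtPlugin("io.get-coursier" % "sbt-coursier" % "2.0.0-RC6-8")
// build.sbt alternative without the plugin: fail the build on conflicting versions
conflictManager := ConflictManager.strict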

sbt ivy2: configuration 'master' not found from 'compile'

I am getting the following error when I build a library bp-kafka-bp2 that depends on library bp-bp2-componentized:
sbt.librarymanagement.ResolveException: unresolved dependency:
com.foo#bp-bp2-componentized_2.11;3.3.+: configuration not found in
com.foo#bp-bp2-componentized_2.11;3.3.0: 'master'. It was required
from com.foo#bp-kafka-bp2_2.11;3.10.1 compile
The unresolved library bp-bp2-componentized does in fact exist in ~/.ivy2/local and does not have a master configuration in its ivy.xml
My questions are:
Should the dependent library (bp-kafka-bp2) be looking for a master configuration of the (not really) missing library?
If it should not be looking for a master config, what can I do to make it stop doing so?
If it should be looking for a master config, how do I induce the build for the (not really) missing library to produce such a config?
I have tried this in sbt versions 1.1.5 and 1.2.1. I have deleted ~/.ivy2/local, ~/.ivy2/cache and ~/.sbt. I have also removed all the /target and project/target directories in the library I am building and run sbt clean.
This library has built just fine for a year or two now. The only recent change I can think of is introducing a 2.11/2.12 cross-compilation option, which is not being exercised here; I'm just building the 2.11 version on its own.
The direct dependency is declared in a multi-project build.sbt as
lazy val bp2 = (project in file("bp2")).
settings(commonSettings:_*).
settings(Seq(
name := "bp-kafka-bp2"
):_*).
settings(
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % ScalaTestVersion % "test",
"ch.qos.logback" % "logback-classic" % LogbackVersion % "test",
"com.foo" %% "bp-bp2-componentized" % Constellation.dependency("bp-bp2"),
"com.foo" %% "bp-akka-http" % Constellation.dependency("bp-akka")
)
).
dependsOn(reactiveComponentized)
where Constellation.dependency is just a function that looks up a version by name and turns it into a patch range:
object Constellation {
...
def dependency(name: String) : String = versions(name).split("\\.").dropRight(1).mkString(".") + ".+"
}
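For example (with a hypothetical entry in the versions map):
// assuming versions("bp-bp2") == "3.3.0"
Constellation.dependency("bp-bp2") // yields "3.3.+"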
You can see from the error message that the version is being found and converted to 3.3.+, which is then resolved correctly to 3.3.0 in the Ivy cache. But then it insists on finding a master configuration, which is not present.

sryza/spark-timeseries: NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;

I have a Scala project that I build with sbt. It uses the sryza/spark-timeseries library.
I am trying to run the following simple code:
val tsAirPassengers = new DenseVector(Array(
112.0,118.0,132.0,129.0,121.0,135.0,148.0,148.0,136.0,119.0,104.0,118.0,115.0,126.0,
141.0,135.0,125.0,149.0,170.0,170.0,158.0,133.0,114.0,140.0,145.0,150.0,178.0,163.0,
172.0,178.0,199.0,199.0,184.0,162.0,146.0,166.0,171.0,180.0,193.0,181.0,183.0,218.0,
230.0,242.0,209.0,191.0,172.0,194.0,196.0,196.0,236.0,235.0,229.0,243.0,264.0,272.0,
237.0,211.0,180.0,201.0,204.0,188.0,235.0,227.0,234.0,264.0,302.0,293.0,259.0,229.0,
203.0,229.0,242.0,233.0,267.0,269.0,270.0,315.0,364.0,347.0,312.0,274.0,237.0,278.0,
284.0,277.0,317.0,313.0,318.0,374.0,413.0,405.0,355.0,306.0,271.0,306.0,315.0,301.0,
356.0,348.0,355.0,422.0,465.0,467.0,404.0,347.0,305.0,336.0,340.0,318.0,362.0,348.0,
363.0,435.0,491.0,505.0,404.0,359.0,310.0,337.0,360.0,342.0,406.0,396.0,420.0,472.0,
548.0,559.0,463.0,407.0,362.0,405.0,417.0,391.0,419.0,461.0,472.0,535.0,622.0,606.0,
508.0,461.0,390.0,432.0
))
val period = 12
val model = HoltWinters.fitModel(tsAirPassengers, period, "additive", "BOBYQA")
It builds fine, but when I try to run it, I get this error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
at com.cloudera.sparkts.models.HoltWintersModel.convolve(HoltWinters.scala:252)
at com.cloudera.sparkts.models.HoltWintersModel.initHoltWinters(HoltWinters.scala:277)
at com.cloudera.sparkts.models.HoltWintersModel.getHoltWintersComponents(HoltWinters.scala:190)
.
.
.
The error occurs on this line:
val model = HoltWinters.fitModel(tsAirPassengers, period, "additive", "BOBYQA")
My build.sbt includes:
name := "acme-project"
version := "0.0.1"
scalaVersion := "2.10.5"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-hive" % "1.6.0",
"net.liftweb" %% "lift-json" % "2.5+",
"com.github.seratch" %% "awscala" % "0.3.+",
"org.apache.spark" % "spark-mllib_2.10" % "1.6.2"
)
I have placed sparkts-0.4.0-SNAPSHOT.jar in the lib folder of my project. (I would have preferred to add a libraryDependency, but spark-ts does not appear to be on Maven Central.)
What is causing this run-time error?
The library requires Scala 2.11, not 2.10, and Spark 2.0, not 1.6.2, as you can see from
<scala.minor.version>2.11</scala.minor.version>
<scala.complete.version>${scala.minor.version}.8</scala.complete.version>
<spark.version>2.0.0</spark.version>
in pom.xml. You can try changing these and seeing if it still compiles, find which older version of sparkts is compatible with your versions, or update your project's Scala and Spark versions (don't miss spark-mllib_2.10 in this case).
Also, if you put the jar into the lib folder, you also have to put its dependencies there (and their dependencies, etc.) or into libraryDependencies. Instead, publish sparkts into your local repository using mvn install (IIRC) and add it to libraryDependencies, which will allow sbt to resolve its dependencies.
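As a rough sketch, after mvn install the relevant parts of build.sbt could look something like this (the sparkts coordinates are an assumption based on the jar name and package, so verify them against the installed pom):
scalaVersion := "2.11.8"
// pick up artifacts installed into the local Maven repository via mvn install
resolvers += Resolver.mavenLocal
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-hive" % "2.0.0",
"org.apache.spark" %% "spark-mllib" % "2.0.0",
// assumed coordinates for the locally installed snapshot
"com.cloudera.sparkts" % "sparkts" % "0.4.0-SNAPSHOT"
)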

How to determine ScalaJS filename for scoped Javascript dependencies

I'm trying to import Angular 2 packages as external dependencies into a ScalaJS project.
For example, the core dep is referenced as #angular/core, and is available on WebJars as "org.webjars.npm" % "angular__core" % "2.0.0-rc.6".
However, when I import it in ScalaJS using this declaration:
jsDependencies += "org.webjars.npm" % "angular__core" % ngVersion / "angular__core.js"
I receive the following error:
[error] (compile:resolvedJSDependencies) org.scalajs.core.tools.jsdep.JSLibResolveException: Some references to JS libraries could not be resolved:
[error] - Missing JS library: angular__core.js
[error] originating from: root:compile
I have determined that the problem is related to the angular__core.js declaration; changing "jquery.js" in the working declaration "org.webjars" % "jquery" % "1.10.2" / "jquery.js" reproduced the error.
How, then, should I name the angular dependency? The documentation is not very specific, saying "[the name declaration] include[s] a file ... in the said WebJar when your project is run or tested."
I've tried the following names:
Angular.js
#angular/core.js
angular-core.js
angular.core.js
angular_core.js
angular__core.js
It is the name of the .js file found inside the jar. Look at the contents of the WebJar to determine the proper file name to use.
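For example, list the jar's contents (e.g. unzip -l on the downloaded WebJar, or jar tf) and pick a unique trailing part of the path of the .js file you find there. A hypothetical sketch, assuming the listing shows a file named core.umd.js:
// "core.umd.js" is a placeholder: use the actual file name shown in the WebJar listing
jsDependencies += "org.webjars.npm" % "angular__core" % ngVersion / "core.umd.js"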