How to package an akka project for a netlogo extension? - scala

I am trying to make a simple NetLogo extension that is based on akka.
However, whenever I try to load the extension in NetLogo, I get the error:
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
Which obviously means that some configuration is missing. I then proceeded to add a reference.conf to my resources folder, but with no luck.
The last thing I tried was to use the sbt-assembly plugin, but I keep getting the same error. This is my build.sbt:
name := "TestAkka"
version := "1.0"
scalaVersion := "2.11.7"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalacOptions ++= Seq("-deprecation", "-unchecked", "-Xfatal-warnings",
"-encoding", "us-ascii")
libraryDependencies ++= Seq(
"org.nlogo" % "NetLogo" % "5.3.0" from
"http://ccl.northwestern.edu/devel/NetLogo-5.3-17964bb.jar",
"asm" % "asm-all" % "3.3.1",
"org.picocontainer" % "picocontainer" % "2.13.6",
"com.typesafe" % "config" % "1.3.0",
"com.typesafe.akka" %% "akka-actor" % "2.4.1",
"com.typesafe.akka" %% "akka-remote" % "2.4.1"
)
artifactName := { (_, _, _) => "sample-scala.jar" }
packageOptions := Seq(
Package.ManifestAttributes(
("Extension-Name", "sample-scala"),
("Class-Manager", "main.scala.akkatest.TestClassManager"),
("NetLogo-Extension-API-Version", "5.3")))
packageBin in Compile <<= (packageBin in Compile, baseDirectory, streams) map {
(jar, base, s) =>
IO.copyFile(jar, base / "sample-scala.jar")
Process("pack200 --modification-time=latest --effort=9 --strip-debug " +
"--no-keep-file-order --unknown-attribute=strip " +
"sample-scala.jar.pack.gz sample-scala.jar").!!
if(Process("git diff --quiet --exit-code HEAD").! == 0) {
Process("git archive -o sample-scala.zip --prefix=sample-scala/ HEAD").!!
IO.createDirectory(base / "sample-scala")
IO.copyFile(base / "sample-scala.jar", base / "sample-scala" / "sample-scala.jar")
IO.copyFile(base / "sample-scala.jar.pack.gz", base / "sample-scala" / "sample-scala.jar.pack.gz")
Process("zip sample-scala.zip sample-scala/sample-scala.jar sample-scala/sample-scala.jar.pack.gz").!!
IO.delete(base / "sample-scala")
}
else {
s.log.warn("working tree not clean; no zip archive made")
IO.delete(base / "sample-scala.zip")
}
jar
}
cleanFiles <++= baseDirectory { base =>
Seq(base / "sample-scala.jar",
base / "sample-scala.jar.pack.gz",
base / "sample-scala.zip") }
I have a project/assembly.sbt with the contents:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")
I have an assembly.sbt in the project root with the contents:
import sbtassembly.AssemblyKeys._
baseAssemblySettings
In my scala code I have:
val configString = ConfigFactory.parseString(
"""
akka {
loglevel = "INFO"
actor {
provider = "akka.remote.RemoteActorRefProvider"
}
remote {
enabled-transports = ["akka.remote.netty.tcp"]
netty.tcp {
hostname = "127.0.0.1"
port = "9500"
}
log-sent-messages = on
log-received-messages = on
}
}
""".stripMargin)
val config = ConfigFactory.load(configString)
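One avenue worth sketching (an assumption on my part, not something established in this question): NetLogo loads extensions through its own classloader, and ConfigFactory.load() resolves reference.conf against the current context classloader, so Akka's defaults can be invisible at runtime even when the file is inside the jar. Both Typesafe Config and the ActorSystem factory accept an explicit classloader:
import akka.actor.ActorSystem
import com.typesafe.config.ConfigFactory

// Hypothetical sketch: resolve Akka's reference.conf against the
// extension's own classloader instead of the host's context classloader.
val extensionLoader = getClass.getClassLoader
val config = configString
  .withFallback(ConfigFactory.defaultReference(extensionLoader))
  .resolve()

// Akka 2.4's ActorSystem factory also takes the classloader explicitly.
val system = ActorSystem("TestAkka", config, extensionLoader)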
The resources folder contains an application.conf which I don't use at the moment. Grepping the output of the jar tf command for "reference" clearly shows that reference.conf is present in the jar.
How do I package this Akka example as a NetLogo extension?
Note: I have included akka-actor and akka-remote as library dependencies. I am using IntelliJ and sbt 0.13.8 on OS X.
EDIT:
After taking the advice from Ayush, sbt assembly now runs, but the same exception is still present.

I think the problem is that, when using sbt-assembly, the default merge strategy excludes all the reference.conf files. This is what I found in the documentation:
If multiple files share the same relative path (e.g. a resource named
application.conf in multiple dependency JARs), the default strategy is
to verify that all candidates have the same contents and error out
otherwise.
Can you try adding a MergeStrategy as follows?
assemblyMergeStrategy in assembly := {
  case PathList("reference.conf") => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

There's an extra trick to solving this with newer Akka libraries, as Akka no longer includes all of its default configuration in reference.conf:
https://stackoverflow.com/a/72325132/1286583
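The gist of that answer, as a hedged sketch (sbt 1.x slash syntax; version.conf is where recent Akka releases keep akka.version, included from reference.conf):
assembly / assemblyMergeStrategy := {
  // concatenate both files so no module's defaults (or akka.version) are dropped
  case "reference.conf" | "version.conf" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(x)
}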

Related

is it necessary to add my custom scala library dependencies in new scala project?

I am new to Scala and I am trying to develop a small project which uses a custom library. I have created a MySQL connection pool inside the library. Here's my build.sbt for the library:
organization := "com.learn"
name := "liblearn-scala"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "mysql" % "mysql-connector-java" % "6.0.6"
libraryDependencies += "org.apache.tomcat" % "tomcat-dbcp" % "8.5.0"
I have published it to the local Ivy repo using sbt publishLocal.
Now I have a project which makes use of the above library, with the following build.sbt:
name := "SBT1"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "com.learn" % "liblearn-scala_2.12" % "0.1"
I am able to compile the new project, but when I run it I get:
java.lang.ClassNotFoundException: org.apache.tomcat.dbcp.dbcp2.BasicDataSource
But if I add
libraryDependencies += "mysql" % "mysql-connector-java" % "6.0.6"
libraryDependencies += "org.apache.tomcat" % "tomcat-dbcp" % "8.5.0"
in the project's build.sbt it works without any issues.
Is this the actual way of doing things with Scala and sbt? That is, do I have to declare my custom library's dependencies inside the project as well?
Here is my library code (I have just 1 file)
package com.learn.scala.db
import java.sql.Connection
import org.apache.tomcat.dbcp.dbcp2._
object MyMySQL {
private val dbUrl = s"jdbc:mysql://localhost:3306/school?autoReconnect=true"
private val connectionPool = new BasicDataSource()
connectionPool.setUsername("root")
connectionPool.setPassword("xyz")
connectionPool.setDriverClassName("com.mysql.cj.jdbc.Driver")
connectionPool.setUrl(dbUrl)
connectionPool.setInitialSize(3)
def getConnection: Connection = connectionPool.getConnection
}
This is my project code:
try {
val conn = MyMySQL.getConnection
val ps = conn.prepareStatement("select * from school")
val rs = ps.executeQuery()
while (rs.next()) {
print(rs.getString("name"))
print(rs.getString("rank"))
println("----------------------------------")
}
rs.close()
ps.close()
conn.close()
} catch {
  case ex: Exception => ex.printStackTrace()
}
By default, sbt fetches all project dependencies transitively. This means it should be necessary to explicitly declare only liblearn-scala, and not its transitive dependencies mysql-connector-java and tomcat-dbcp. Transitivity can be disabled and transitive dependencies can be excluded, but unless this has been done explicitly, it should not be the cause of the problem.
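For illustration, these are the declaration forms (hypothetical, not taken from the question) that would suppress the transitive dependencies:
// Either of these would stop mysql-connector-java and tomcat-dbcp
// from being pulled in transitively:
libraryDependencies += ("com.learn" % "liblearn-scala_2.12" % "0.1").intransitive()
libraryDependencies += ("com.learn" % "liblearn-scala_2.12" % "0.1")
  .exclude("org.apache.tomcat", "tomcat-dbcp")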
Without seeing your whole build.sbt, I believe you are doing the right thing. If sbt clean publishLocal is not solving the problem, you could try the nuclear option and clear the whole ivy cache (note this will force all projects to re-fetch dependencies).

Can't resolve docker related sbt tags

I'm trying to add sbt-docker to the sbt build of my Play website, but I'm running into an issue: none of the Docker-related settings at the bottom resolve.
project/plugins.sbt
logLevel := Level.Warn
resolvers ++= Seq(
"Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.9")
build.sbt
name := "personal_site"
version := "1.1"
lazy val `personal_site` = (project in file(".")).enablePlugins(PlayScala,DockerPlugin)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq( jdbc , cache , ws , specs2 % Test )
unmanagedResourceDirectories in Test <+= baseDirectory ( _ /"target/web/public/test" )
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
dockerfile in docker := {
val targetDir = "/usr/src"
new Dockerfile {
from("flurdy/activator")
//More goes here
}
}
imageNames in docker := Seq(
// Sets the latest tag
ImageName(s"${name.value}:latest"),
// Sets a name with a tag that contains the project version
ImageName(
namespace = None,
repository = name.value,
tag = Some("v" + version.value)
)
)
Here's an image of what it looks like in IntelliJ
I've also tried adding addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.4.0") to my project/plugins.sbt but I get this error about DockerPlugin being imported twice.
~/Sync/Projects/Programming/Personal_Site (master ✘)✹ ᐅ sbt clean
[info] Loading project definition from /home/ryan/Sync/Projects/Programming/Personal_Site/project
/home/ryan/Sync/Projects/Programming/Personal_Site/build.sbt:5: error: reference to DockerPlugin is ambiguous;
it is imported twice in the same scope by
import _root_.sbtdocker.DockerPlugin
and import _root_.com.typesafe.sbt.packager.docker.DockerPlugin
lazy val `personal_site` = (project in file(".")).enablePlugins(PlayScala,DockerPlugin)
Try changing your build.sbt config to this.
lazy val root = (project in file(".")).enablePlugins(sbtdocker.DockerPlugin, PlayScala)
It removes the ambiguity by using the fully qualified name for DockerPlugin; sbt-native-packager uses the same name for its Docker plugin, I believe.
It may be worth raising a GitHub issue on the author's repo so this can be documented in the project docs.
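Putting the pieces together, a minimal sketch based on the snippets above:
// project/plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.9")
addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.4.0")

// build.sbt: fully qualify sbt-docker's plugin to sidestep the clash with
// the DockerPlugin that the Play plugin brings in via sbt-native-packager
lazy val root = (project in file("."))
  .enablePlugins(PlayScala, sbtdocker.DockerPlugin)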

Resolving ScalaJSPlugin from Build.scala and plugins.sbt

I'm trying to get ScalaJSPlugin to resolve in:
project/Build.scala
object BuildProject extends Build {
..
lazy val scalaRx = Project(id = "ScalaRX", base = file("scalarx")).enablePlugins(ScalaJSPlugin).settings(
version := "0.1",
scalaVersion := "2.11.7",
libraryDependencies ++= scalaRxDependencies
) ...
In my project/plugins.sbt file, I put:
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.8")
It fails with a compilation error in Build.scala when trying to resolve ScalaJSPlugin.
There are links to the repo: 1, 2.
For now I keep those changes commented out.
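A likely cause, offered as an assumption rather than a confirmed diagnosis: unlike .sbt files, a project/Build.scala gets no auto-imports, so the plugin's objects have to be imported explicitly. A minimal sketch:
import sbt._
import Keys._
// not auto-imported in project/*.scala files:
import org.scalajs.sbtplugin.ScalaJSPlugin
import org.scalajs.sbtplugin.ScalaJSPlugin.autoImport._

object BuildProject extends Build {
  lazy val scalaRx = Project(id = "ScalaRX", base = file("scalarx"))
    .enablePlugins(ScalaJSPlugin)
}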

How to change jetty port for Scalatra

I see in numerous places phrases like:
Changing the port in development Add
port in container.Configuration := 8081
to project/build.scala
But where in build.scala? Here is the vanilla build.scala. It is unclear where that addition should go:
object KeywordsBuild extends Build {
val Organization = "com.blazedb"
..
lazy val project = Project (
"keywords",
file("."),
settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
organization := Organization,
name := Name,
version := Version,
..
libraryDependencies ++= Seq(
"org.scalatra" %% "scalatra" % ScalatraVersion,
..
"javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided"
),
scalateTemplateConfig in Compile <<= (sourceDirectory in Compile){ base =>
Seq(
TemplateConfig(
base / "webapp" / "WEB-INF" / "templates",
Seq.empty, /* default imports should be added here */
Seq(
Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
), /* add extra bindings here */
Some("templates")
)
Wherever I have tried to put it, the following error occurs:
[info] Compiling 1 Scala source to /shared/wfdemo/project/target/scala-2.10/sbt-0.13/classes...
/shared/wfdemo/build.sbt:1: error: not found: value port
port in container.Configuration := 8081
The correct way is to add it to build.sbt, so the documentation appears to be incorrect, or at the least misleading.
$cat build.sbt
val myPort = 9090
jetty(port = myPort)
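As an aside, the original "not found: value port" error is typical of the xsbt-web-plugin keys not being in scope. Assuming the older 0.9.x plugin (a guess; the question doesn't state the version), the documented setting only compiles with the plugin's imports, e.g. in project/build.scala:
import com.earldouglas.xsbtwebplugin.PluginKeys.port
import com.earldouglas.xsbtwebplugin.WebPlugin.container

// inside the project's settings sequence:
port in container.Configuration := 8081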
Create a file JettyLauncher.scala under src/main/scala:
import org.eclipse.jetty.server.Server
import org.eclipse.jetty.servlet.{DefaultServlet, ServletContextHandler}
import org.eclipse.jetty.webapp.WebAppContext
import org.scalatra.servlet.ScalatraListener
object JettyLauncher {
def main(args: Array[String]) {
val port = System.getProperty("port","8090").toInt
val server = new Server(port)
val context = new WebAppContext()
context setContextPath "/"
context.setResourceBase("src/main/webapp")
context.addEventListener(new ScalatraListener)
context.addServlet(classOf[DefaultServlet], "/")
server.setHandler(context)
server.start
server.join
}
}
Make sure your plugins.sbt under project/ has:
addSbtPlugin("com.typesafe.sbt" % "sbt-twirl" % "1.3.13")
addSbtPlugin("org.scalatra.sbt" % "sbt-scalatra" % "1.0.2")
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "4.0.0")
against sbt 0.13.16.
Do an sbt clean compile assembly package (now that you have the sbt-assembly plugin), then run your jar as follows:
java -Dport=8081 -Dname=sameer -jar /Users/sumit/Documents/repos/inner/paytm-insurance-ml-api/serving-layers/model-serving-movies-cp/target/scala-2.11/model-serving-movies-cp-assembly-0.1.jar
This worked for me:
13:11:18.326 [main] INFO o.e.jetty.server.AbstractConnector - Started ServerConnector#509dbdcf{HTTP/1.1,[http/1.1]}{0.0.0.0:8081}
13:11:18.327 [main] INFO org.eclipse.jetty.server.Server - Started #1312ms

sbt - copy SOME libraryDependencies to output lib folder

Using sbt, I'd like to copy some dependency jars to a lib output folder. If possible, I'd like to use the "provided" keyword, like I can with sbt-assembly.
So, given a build.sbt similar to the following, how do I create a task that copies the ark-tweet-nlp dependency, but NOT the spark-core one, to target/scala-%ver%/lib?
retrieveManaged := true simply copies everything, which is not what I want.
...
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"
libraryDependencies += "edu.cmu.cs" % "ark-tweet-nlp" % "0.3.2"
retrieveManaged := true
...
You could write a task like this.
build.sbt
val retrieveNotProvided = taskKey[Unit]("Copies non provided and non internal dependencies")
def isInternalOrProvided(conf: String) = conf.contains("-internal") || conf == "provided"
retrieveNotProvided := {
val toCopy = new collection.mutable.HashSet[(File, File)]
val pattern = retrievePattern.value
val output = managedDirectory.value
update.value.retrieve { (conf, mid, art, cached) =>
import org.apache.ivy.core.IvyPatternHelper
val fileName = IvyPatternHelper.substitute(
pattern, mid.organization, mid.name, mid.revision, art.name, art.`type`, art.extension, conf
)
if (!isInternalOrProvided(conf)) toCopy += (cached -> output / fileName)
cached
}
IO.copy(toCopy)
}
You'll have to remove retrieveManaged := true from your build.sbt, because otherwise sbt will trigger the original retrieve function.
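Running sbt retrieveNotProvided should then copy everything except the provided and internal configurations into the managed directory (lib_managed by default), laid out according to retrievePattern.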