Is it necessary to add my custom Scala library dependencies in a new Scala project? - scala

I am new to Scala and I am trying to develop a small project which uses a custom library. I have created a MySQL connection pool inside the library. Here's my build.sbt for the library:
organization := "com.learn"
name := "liblearn-scala"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "mysql" % "mysql-connector-java" % "6.0.6"
libraryDependencies += "org.apache.tomcat" % "tomcat-dbcp" % "8.5.0"
I have published the same to the local Ivy repo using sbt publishLocal.
Now I have a project which will be making use of the above library, with the following build.sbt:
name := "SBT1"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "com.learn" % "liblearn-scala_2.12" % "0.1"
I am able to compile the new project but when I run it I get
java.lang.ClassNotFoundException: org.apache.tomcat.dbcp.dbcp2.BasicDataSource
But if I add
libraryDependencies += "mysql" % "mysql-connector-java" % "6.0.6"
libraryDependencies += "org.apache.tomcat" % "tomcat-dbcp" % "8.5.0"
in the project's build.sbt it works without any issues.
Is this the correct way of doing things with Scala and sbt? That is, do I have to declare the dependencies of my custom library inside the consuming project as well?
Here is my library code (I have just 1 file)
package com.learn.scala.db
import java.sql.Connection
import org.apache.tomcat.dbcp.dbcp2._
object MyMySQL {
private val dbUrl = s"jdbc:mysql://localhost:3306/school?autoReconnect=true"
private val connectionPool = new BasicDataSource()
connectionPool.setUsername("root")
connectionPool.setPassword("xyz")
connectionPool.setDriverClassName("com.mysql.cj.jdbc.Driver")
connectionPool.setUrl(dbUrl)
connectionPool.setInitialSize(3)
def getConnection: Connection = connectionPool.getConnection
}
This is my project code:
try {
val conn = MyMySQL.getConnection
val ps = conn.prepareStatement("select * from school")
val rs = ps.executeQuery()
while (rs.next()) {
print(rs.getString("name"))
print(rs.getString("rank"))
println("----------------------------------")
}
rs.close()
ps.close()
conn.close()
} catch {
case ex: Exception => ex.printStackTrace()
}

By default, sbt fetches all project dependencies transitively. This means it should only be necessary to declare liblearn-scala explicitly, and not also its transitive dependencies mysql-connector-java and tomcat-dbcp. Transitivity can be disabled, and individual transitive dependencies can be excluded; however, unless this has been done explicitly, it should not be the cause of the problem.
Without seeing your whole build.sbt, I believe you are doing the right thing. If sbt clean publishLocal is not solving the problem, you could try the nuclear option and clear the whole Ivy cache (note that this will force all projects to re-fetch their dependencies).
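For reference, these are the two mechanisms mentioned above that would strip transitive dependencies. This is an illustrative sketch only (you would not want either line in your build); the coordinates match the question:

```scala
// Either of these would exclude the transitive jars and produce exactly
// the ClassNotFoundException described above. Illustrative only:
libraryDependencies += ("com.learn" % "liblearn-scala_2.12" % "0.1").intransitive()

libraryDependencies += ("com.learn" % "liblearn-scala_2.12" % "0.1")
  .exclude("org.apache.tomcat", "tomcat-dbcp")
```

If neither appears in your build.sbt, transitive resolution should bring in tomcat-dbcp automatically.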

Related

Can't resolve docker related sbt tags

I'm trying to add sbt-docker to the sbt build of my Play website, but I'm running into an issue: for some reason, none of the Docker-related settings at the bottom will resolve.
project/plugins.sbt
logLevel := Level.Warn
resolvers ++= Seq(
"Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.9")
build.sbt
name := "personal_site"
version := "1.1"
lazy val `personal_site` = (project in file(".")).enablePlugins(PlayScala,DockerPlugin)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq( jdbc , cache , ws , specs2 % Test )
unmanagedResourceDirectories in Test <+= baseDirectory ( _ /"target/web/public/test" )
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
dockerfile in docker := {
val targetDir = "/usr/src"
new Dockerfile {
from("flurdy/activator")
//More goes here
}
}
imageNames in docker := Seq(
// Sets the latest tag
ImageName(s"${name.value}:latest"),
// Sets a name with a tag that contains the project version
ImageName(
namespace = None,
repository = name.value,
tag = Some("v" + version.value)
)
)
Here's an image of what it looks like in IntelliJ
I've also tried adding addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.4.0") to my project/plugins.sbt but I get this error about DockerPlugin being imported twice.
~/Sync/Projects/Programming/Personal_Site (master ✘)✹ ᐅ sbt clean
[info] Loading project definition from /home/ryan/Sync/Projects/Programming/Personal_Site/project
/home/ryan/Sync/Projects/Programming/Personal_Site/build.sbt:5: error: reference to DockerPlugin is ambiguous;
it is imported twice in the same scope by
import _root_.sbtdocker.DockerPlugin
and import _root_.com.typesafe.sbt.packager.docker.DockerPlugin
lazy val `personal_site` = (project in file(".")).enablePlugins(PlayScala,DockerPlugin)
Try changing your build.sbt config to this.
lazy val root = (project in file(".")).enablePlugins(sbtdocker.DockerPlugin, PlayScala)
It removes the ambiguity by using the fully qualified name of sbt-docker's DockerPlugin, since sbt-native-packager (I believe) uses the same name for its own Docker plugin.
Maybe worth raising a Github issue with the author's repo so they can document it in the project docs.
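An alternative, if you prefer a short name at the use site, is to rename one of the plugins with an import alias at the top of build.sbt (imports are allowed there). A sketch of the same fix; the alias name SbtDockerPlugin is arbitrary:

```scala
import sbtdocker.{DockerPlugin => SbtDockerPlugin}

lazy val `personal_site` = (project in file("."))
  .enablePlugins(PlayScala, SbtDockerPlugin)
```

Either form works; the point is simply that only one DockerPlugin remains visible under that name in the build's scope.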

How to package an akka project for a netlogo extension?

I am trying to make a simple NetLogo extension that is based on akka.
However, whenever I try to load the extension in NetLogo, I get the error:
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
Which obviously means that some configuration is missing. I then proceeded to add a reference.conf to my resources folder, but with no luck.
The last thing I tried was to use the sbt-assembly plugin, but I keep getting the same error. This is my build.sbt:
name := "TestAkka"
version := "1.0"
scalaVersion := "2.11.7"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalacOptions ++= Seq("-deprecation", "-unchecked", "-Xfatal-warnings",
"-encoding", "us-ascii")
libraryDependencies ++= Seq(
"org.nlogo" % "NetLogo" % "5.3.0" from
"http://ccl.northwestern.edu/devel/NetLogo-5.3-17964bb.jar",
"asm" % "asm-all" % "3.3.1",
"org.picocontainer" % "picocontainer" % "2.13.6",
"com.typesafe" % "config" % "1.3.0",
"com.typesafe.akka" %% "akka-actor" % "2.4.1",
"com.typesafe.akka" %% "akka-remote" % "2.4.1"
)
artifactName := { (_, _, _) => "sample-scala.jar" }
packageOptions := Seq(
Package.ManifestAttributes(
("Extension-Name", "sample-scala"),
("Class-Manager", "main.scala.akkatest.TestClassManager"),
("NetLogo-Extension-API-Version", "5.3")))
packageBin in Compile <<= (packageBin in Compile, baseDirectory, streams) map {
(jar, base, s) =>
IO.copyFile(jar, base / "sample-scala.jar")
Process("pack200 --modification-time=latest --effort=9 --strip-debug " +
"--no-keep-file-order --unknown-attribute=strip " +
"sample-scala.jar.pack.gz sample-scala.jar").!!
if(Process("git diff --quiet --exit-code HEAD").! == 0) {
Process("git archive -o sample-scala.zip --prefix=sample-scala/ HEAD").!!
IO.createDirectory(base / "sample-scala")
IO.copyFile(base / "sample-scala.jar", base / "sample-scala" / "sample-scala.jar")
IO.copyFile(base / "sample-scala.jar.pack.gz", base / "sample-scala" / "sample-scala.jar.pack.gz")
Process("zip sample-scala.zip sample-scala/sample-scala.jar sample-scala/sample-scala.jar.pack.gz").!!
IO.delete(base / "sample-scala")
}
else {
s.log.warn("working tree not clean; no zip archive made")
IO.delete(base / "sample-scala.zip")
}
jar
}
cleanFiles <++= baseDirectory { base =>
Seq(base / "sample-scala.jar",
base / "sample-scala.jar.pack.gz",
base / "sample-scala.zip") }
I have a project/assembly.sbt with the contents:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")
I have an assembly.sbt in the root with the contents:
import sbtassembly.AssemblyKeys._
baseAssemblySettings
In my scala code I have:
val configString = ConfigFactory.parseString(
"""
akka {
loglevel = "INFO"
actor {
provider = "akka.remote.RemoteActorRefProvider"
}
remote {
enabled-transports = ["akka.remote.netty.tcp"]
netty.tcp {
hostname = "127.0.0.1"
port = "9500"
}
log-sent-messages = on
log-received-messages = on
}
}
""".stripMargin)
val config = ConfigFactory.load(configString)
The resources folder contains an application.conf which I don't use at the moment. Grepping the output of the jar tf command for "reference" clearly shows that reference.conf is present:
How do I package this akka example for a netlogo extension?
Note: I have included akka-actor and akka-remote as library dependencies. I am using Intellij and SBT 0.13.8 on a OS X platform.
EDIT:
After taking the advice from Ayush, I get the following output from the command sbt assembly; however, the same exception is still present:
I think the problem is that when using sbt-assembly, the default merge strategy excludes all the reference.conf files. This is what I found in the documentation:
If multiple files share the same relative path (e.g. a resource named
application.conf in multiple dependency JARs), the default strategy is
to verify that all candidates have the same contents and error out
otherwise.
Can you try adding a MergeStrategy as follows? Note the fallback case: without it, every other path in the jar would fail with a MatchError.
assemblyMergeStrategy in assembly := {
case PathList("reference.conf") => MergeStrategy.concat
case x =>
val oldStrategy = (assemblyMergeStrategy in assembly).value
oldStrategy(x)
}
There's an extra trick to solving this with newer Akka libraries, as Akka no longer includes all of its default configuration in reference.conf:
https://stackoverflow.com/a/72325132/1286583

Adding module dependency information in sbt's build.sbt file

I have a multi-module project in IntelliJ; as this screen capture shows, the contextProcessor module depends on the contextSummary module.
IntelliJ takes care of everything once I set up the dependencies in Project Structure.
However, when I run sbt test with the following setup in build.sbt, I get an error complaining that it can't find the packages in the contextSummary module.
name := "contextProcessor"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
How can I teach sbt where to find the missing modules?
I could use the build.sbt file in the main root directory.
lazy val root = (project in file(".")).aggregate(contextSummary, contextProcessor)
lazy val contextSummary = project
lazy val contextProcessor = project.dependsOn(contextSummary)
Reference: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html
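Note that the bare `project` macro derives each sub-project's base directory from the val's name, so this build assumes a layout roughly like the following (sketch; directory names must match the vals):

```
ContextSharingSimulation/
  build.sbt
  contextSummary/
    src/main/scala/...
  contextProcessor/
    src/main/scala/...
```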
For testing only one project, I can use the project command in sbt.
> sbt
[info] Set current project to root (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> project contextProcessor
[info] Set current project to contextProcessor (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> test
For batch mode as in How to pass command line args to program in SBT 0.13.1?
sbt "project contextProcessor" test
I think a simple build.sbt might not be enough for that.
You would need to create a more sophisticated project/Build.scala like this:
import sbt._
import sbt.Keys._
object Build extends Build {
lazy val root = Project(
id = "root",
base = file("."),
aggregate = Seq(module1, module2)
)
lazy val module1 = Project(
id = "module1",
base = file("module1-folder"),
settings = Seq(
name := "Module 1",
version := "1.0",
scalaVersion := "2.11.7",
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
)
)
lazy val module2 = Project(
id = "module2",
base = file("module2-folder"),
dependencies = Seq(module1),
settings = Seq(
name := "Module 2",
version := "1.0",
scalaVersion := "2.11.7",
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
)
)
}

How to set up jacoco4sbt to process classes in main and submodules in Play?

I'm having some problems to make jacoco4sbt working with my Play 2.3.4 project.
My project is composed of 3 submodules, common, api and frontend, and has no code in the app root folder. Now when I run Jacoco it does not find the submodules' classes.
Inspecting target/scala-VERSION/classes, I only find some routing classes (which are in fact the only code I have in my "root" project); I was expecting that, because I aggregate all those projects, the classes would be there.
If I copy the classes from MODULE_NAME/target/scala-VERSION/classes to target/scala-VERSION/classes and then run Jacoco I get the expected result.
So what is the best way to make it work? I can't find any config in jacoco4sbt to specify additional classes locations.
My build.sbt file
import Keys._
// Dummy value to deal with bug in sbt 0.13.5
val k = 0
name := "PlayApp"
version := "0.5.0"
// omitted resolvers part
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"com.edulify" %% "play-hikaricp" % "1.5.0" exclude("com.jolbox", "bonecp"),
"com.novocode" % "junit-interface" % "0.11" % "test"
)
lazy val common = project.in(file("common")).enablePlugins(PlayJava)
lazy val frontend = project.in(file("frontend")).enablePlugins(PlayJava).dependsOn(common)
lazy val api = project.in(file("api")).enablePlugins(PlayJava).dependsOn(common)
lazy val main = project.in(file(".")).enablePlugins(PlayJava)
.aggregate(frontend, api).dependsOn(frontend, api)
parallelExecution in Test := false
javaOptions in Test += "-Dconfig.resource=test.conf"
jacoco.sbt
import de.johoop.jacoco4sbt._
import JacocoPlugin._
jacoco.settings
Keys.fork in jacoco.Config := true
parallelExecution in jacoco.Config := false
jacoco.outputDirectory in jacoco.Config := file("target/jacoco")
jacoco.reportFormats in jacoco.Config := Seq(XMLReport("utf-8"), HTMLReport("utf-8"))
jacoco.excludes in jacoco.Config := Seq("views*", "*Routes*", "controllers*routes*", "controllers*Reverse*", "controllers*javascript*", "controller*ref*")
javaOptions in jacoco.Config += "-Dconfig.resource=test.conf"
Add jacoco.sbt to every subproject with the following content:
jacoco.settings
p.s. I've been looking for ways to convince sbt to have jacoco.settings applied to every subproject in the top-level root build.sbt, but to no avail.

Factoring libraryDependencies in multi project Build.sbt

I'm trying to write a concise multi-project Build.sbt, so I tried to put all library dependencies in the root project and then make the others depend on it. My Build.sbt looks like the following:
object KataBuild extends Build {
lazy val fizzBuzz = Project(
id = "fizzBuzz",
base = file("fizzBuzz"),
settings = Project.defaultSettings ++ Seq(
name := "fizzBuzz",
version := "1.0",
scalaVersion := "2.10.3"
)
)
lazy val kata = Project(
id = "scala-kata",
base = file("."),
settings = Project.defaultSettings ++ Seq(
name := "scala-kata",
version := "1.0",
scalaVersion := "2.10.3",
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "2.1.0" % "test"
)
)
) aggregate(fizzBuzz)
fizzBuzz dependsOn(kata)
}
But running test from the main project (scala-kata) fails to build the tests for fizzBuzz. What am I missing?
Your question is similar to this one. In short, fizzBuzz dependsOn(kata) means that its compile configuration depends on kata's compile configuration, but you want to link the test configurations.
The 'Per-configuration classpath dependencies' section of the sbt docs show you how you can make a test->test dependency instead.
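As a sketch, the per-configuration form looks like this (depending on both kata's compile and test outputs; the project definition is abbreviated from the question):

```scala
lazy val fizzBuzz = Project(
  id = "fizzBuzz",
  base = file("fizzBuzz")
) dependsOn (kata % "compile->compile;test->test")
```

With that in place, fizzBuzz's test classpath also includes kata's test classes, so ScalaTest declared in kata becomes visible to fizzBuzz's tests.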
However, if you are not going to use kata's test sources but are just looking for a way to include ScalaTest in fizzBuzz, just add it explicitly to fizzBuzz's library dependencies, too. You can define a helper value
lazy val scalaTest = "org.scalatest" %% "scalatest" % "2.1.0" % "test"
Then you can add it to the sub-project's library dependencies (libraryDependencies += scalaTest).
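Putting the two pieces together, a sketch of the factored-out dependency shared by both projects (same coordinates and sbt 0.13-era style as the question):

```scala
// One definition of the dependency, reused wherever it is needed.
lazy val scalaTest = "org.scalatest" %% "scalatest" % "2.1.0" % "test"

lazy val fizzBuzz = Project(
  id = "fizzBuzz",
  base = file("fizzBuzz"),
  settings = Project.defaultSettings ++ Seq(
    name := "fizzBuzz",
    version := "1.0",
    scalaVersion := "2.10.3",
    libraryDependencies += scalaTest // same jar as in scala-kata
  )
)
```

This keeps the version in one place without relying on classpath inheritance between the projects.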