sbt-assembly: Logback does not work with über-JAR

If I run the application within IntelliJ, logging works fine, but if I run the über-JAR, I get the following error:
SLF4J: No SLF4J providers were found.
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#noProviders for further details.
I use the following configuration to build my über-JAR with sbt-assembly:
lazy val app = (project in file("."))
  .settings(
    assembly / mainClass := Some("com.example.app.Main"),
    assembly / assemblyJarName := "gcm.jar",
    assembly / assemblyMergeStrategy := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case PathList("reference.conf")    => MergeStrategy.concat
      case x                             => MergeStrategy.first
    }
  )
The dependencies for Logback and scala-logging are:
ThisBuild / libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.4.0"
ThisBuild / libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.5"
The logback.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration>
<configuration>
    <import class="ch.qos.logback.classic.encoder.PatternLayoutEncoder"/>
    <import class="ch.qos.logback.core.ConsoleAppender"/>
    <import class="ch.qos.logback.core.FileAppender"/>
    <appender name="STDOUT" class="ConsoleAppender">
        <encoder class="PatternLayoutEncoder">
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{0} - %msg%n</pattern>
        </encoder>
    </appender>
    <appender name="FILE" class="FileAppender">
        <file>gcm.log</file>
        <append>true</append>
        <immediateFlush>true</immediateFlush>
        <encoder class="PatternLayoutEncoder">
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{0} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="debug">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="FILE"/>
    </root>
</configuration>

This happens because the service provider files under META-INF/services are discarded by your merge strategy. SLF4J 2.x, which Logback 1.4.x targets, discovers its backend via java.util.ServiceLoader, so without META-INF/services/org.slf4j.spi.SLF4JServiceProvider in the jar it falls back to the NOP logger. I had the same issue when migrating from Logback 1.2.x to 1.4.x.
One option I found in the sbt-assembly docs is to replace your META-INF case with:
case PathList("META-INF", xs @ _*) =>
  (xs map {_.toLowerCase}) match {
    case "services" :: xs =>
      MergeStrategy.filterDistinctLines
    case _ => MergeStrategy.discard
  }
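Put together, a strategy that keeps service registrations while still discarding the rest of META-INF could look like this (a sketch combining the cases above; tune the fallback cases to your build):
assembly / assemblyMergeStrategy := {
  // keep ServiceLoader registrations, merging duplicate entries line by line
  case PathList("META-INF", xs @ _*) =>
    (xs map {_.toLowerCase}) match {
      case "services" :: _ => MergeStrategy.filterDistinctLines
      case _               => MergeStrategy.discard
    }
  case PathList("reference.conf") => MergeStrategy.concat
  case x                          => MergeStrategy.first
}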
Is there a way to find out if that is the case, and if so, what causes the conflict?
Running in debug mode (./sbt assembly --debug) might help identify which files are discarded:
...
[debug] Merging 'META-INF/services/ch.qos.logback.classic.spi.Configurator' with strategy 'discard'
[debug] Merging 'META-INF/services/jakarta.servlet.ServletContainerInitializer' with strategy 'discard'
[debug] Merging 'META-INF/services/org.slf4j.spi.SLF4JServiceProvider' with strategy 'discard'
...
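Once the strategy is fixed, you can double-check that the provider registration survived by printing the merged service file from the assembled jar, e.g. (the path assumes the jar name from the build above and a Scala 2.13 build; adjust to your setup):
unzip -p target/scala-2.13/gcm.jar META-INF/services/org.slf4j.spi.SLF4JServiceProvider
It should print the Logback provider class instead of coming back empty.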

Related

How can I abbreviate an ActorRef's path when logging in Akka?

When I do a log.info("just a log message"), I get a log string like this:
[INFO] [01/22/2018 18:28:31.950] [s-akka.actor.default-dispatcher-7] [akka://s/user/bob] just a log message
where bob is the name of the actor reference.
I would like to obtain the following instead:
[INFO] [01/22/2018 18:28:31.950] [bob] just a log message
How can I do that? How can I configure akka or the logger to not include all that boilerplate information in the log?
While @elm's answer and @Sarvesh's comment put me on the right track, they did not completely answer my question, so I am posting a complete solution here.
In build.sbt:
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-slf4j" % "2.5.9",
  "ch.qos.logback" % "logback-classic" % "1.2.3"
)
In src/main/resources/reference.conf:
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "INFO"
}
In src/main/resources/logback.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true">
    <appender name="consoleAppender" class="ch.qos.logback.core.ConsoleAppender">
        <target>System.out</target>
        <encoder>
            <charset>UTF-8</charset>
            <Pattern>[%level] %d{HH:mm:ss.SSS} %message%n%xException{5}</Pattern>
        </encoder>
    </appender>
    <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <file>log/akka.log</file>
        <encoder>
            <charset>UTF-8</charset>
            <pattern>%d %-4relative [%thread] %-5level %logger{35} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="DEBUG">
        <appender-ref ref="consoleAppender" />
        <appender-ref ref="FILE"/>
    </root>
</configuration>
Then create ActorRefLogging.scala:
import akka.actor.Actor
import akka.event.Logging

trait ActorRefLogging { this: Actor =>
  // if self.toString() is "akka://s/user/bob#1234567"
  // then shortName is "bob" (self.path.name would yield the same)
  private val shortName = self.toString().split("/").last.split("#").head
  private val l = Logging(context.system, this)

  object log {
    def error(s: String): Unit = l.error(s"[$shortName] $s")
    def warning(s: String): Unit = l.warning(s"[$shortName] $s")
    def info(s: String): Unit = l.info(s"[$shortName] $s")
    def debug(s: String): Unit = l.debug(s"[$shortName] $s")
  }
}
Then mix the trait into the actor and use the logger:
class MyActor extends Actor with ActorRefLogging {
  def receive = {
    case m => log.info(s"Received: $m")
  }
}
In resources/logback.xml, update [%logger] to [%logger{0}] (the {0} keeps only the last segment of the logger name), for instance as in:
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
        <pattern>[%level] %d{yyyy:MM:dd HH:mm:ss.SSS} [%logger{0}] %message%n%xException{5}</pattern>
    </encoder>
</appender>

Unresolved dependencies path for SBT project in IntelliJ

I'm using IntelliJ to develop a Spark application, following this instruction on how to make IntelliJ work nicely with an SBT project.
Since my whole team uses IntelliJ, we can just modify build.sbt, but we get this unresolved dependencies error:
Error:Error while importing SBT project:
[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn] +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/:update' for the full output.
[error] (mainRunner/:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM
And this is my build.sbt
name := "sparrow-to-orc"

version := "0.1"

scalaVersion := "2.11.8"

lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "org.apache.spark" %% "spark-hive" % "2.1.0",
  "org.apache.spark" %% "spark-streaming" % "2.1.0"
)

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"
libraryDependencies ++= sparkDependencies.map(_ % "provided")

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

assemblyMergeStrategy in assembly := {
  case PathList("org", "aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
If I don't include these lines, then the build works fine:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
But then I won't be able to run the application inside IntelliJ, as the Spark dependencies won't be included in the classpath.
I had the same issue. The solution is to set the Scala version in the mainRunner project to the same one declared at the top of build.sbt. Without it, mainRunner falls back to sbt's default Scala version (2.10 here, which is why the error looks for sparrow-to-orc_2.10 even though the root project builds against 2.11):
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile"),
  scalaVersion := "2.11.8"
)
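As an aside, you can avoid this class of mismatch by declaring the Scala version once for every project in the build (a sketch in the same sbt 0.13 syntax used above), so sub-projects like mainRunner cannot silently fall back to sbt's default:
scalaVersion in ThisBuild := "2.11.8"
With that in place, the per-project scalaVersion setting in mainRunner becomes redundant.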
Good luck!

How to prevent SBT from including test dependencies in the POM

I have a small Scala utilities build with test classes under a dedicated test folder. Compiling and then publish-local creates the package in my local repository.
As expected, the test folder is automatically excluded from the local jar of the utilities package.
However, the resulting POM still contains the related dependencies as defined in the build. The SBT dependencies:
libraryDependencies ++= Seq(
  "org.scalactic" %% "scalactic" % "3.0.0" % Test,
  "org.scalatest" %% "scalatest" % "3.0.0" % Test
)
The segment of the POM:
<dependency>
    <groupId>org.scalactic</groupId>
    <artifactId>scalactic_2.11</artifactId>
    <version>3.0.0</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>3.0.0</version>
    <scope>test</scope>
</dependency>
The scope clearly needs to be test in order to prevent issues in another project (main) that uses this library. In particular, the testing of the main project otherwise includes these test libraries, which causes version conflicts etc.
As these dependencies are only for the not included test package, having them listed in the POM seems silly. How do I tell SBT to not include these test scope dependencies into the final POM?
There was a similar question asked here: sbt - exclude certain dependency only during publish.
Riffing on the answer provided by lyomi, here's how you can exclude all <dependency> elements that contain a child <scope> element, including test and provided.
import scala.xml.{Node => XmlNode, NodeSeq => XmlNodeSeq, _}
import scala.xml.transform.{RewriteRule, RuleTransformer}

// skip dependency elements with a scope
pomPostProcess := { (node: XmlNode) =>
  new RuleTransformer(new RewriteRule {
    override def transform(node: XmlNode): XmlNodeSeq = node match {
      case e: Elem if e.label == "dependency"
          && e.child.exists(child => child.label == "scope") =>
        def txt(label: String): String =
          "\"" + e.child.filter(_.label == label).flatMap(_.text).mkString + "\""
        Comment(s""" scoped dependency ${txt("groupId")} % ${txt("artifactId")} % ${txt("version")} % ${txt("scope")} has been omitted """)
      case _ => node
    }
  }).transform(node).head
}
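To iterate on the transform without publishing anywhere, sbt's makePom task writes the POM to disk (under target/scala-<version>/) so you can inspect the result directly:
sbt makePom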
This should generate a POM that looks like this:
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.5</version>
    </dependency>
    <!-- scoped dependency "org.scalatest" % "scalatest_2.12" % "3.0.5" % "test" has been omitted -->
</dependencies>

How to get sbt-assembly merge right?

In our Scala/Scalatra project, we have this merge policy for the plugin sbt-assembly:
assemblyMergeStrategy in assembly := {
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
Running assembly with this merge policy fails with:
[error] 11 errors were encountered during merge
java.lang.RuntimeException: deduplicate: different file contents found in the following:
~/.ivy2/cache/org.scalatra/scalatra_2.11/jars/scalatra_2.11-2.3.1.jar:mime.types
~/.ivy2/cache/com.amazonaws/aws-java-sdk-s3/jars/aws-java-sdk-s3-1.10.1.jar:mime.types
deduplicate: different file contents found in the following:
~/.ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.8.3.jar:org/apache/commons/collections/ArrayStack.class
~/.ivy2/cache/commons-collections/commons-collections/jars/commons-collections-3.2.1.jar:org/apache/commons/collections/ArrayStack.class
deduplicate: different file contents found in the following:
and the same error for different class names
What would be the right merge logic here?
Versions:
Scala : 2.11.7
SBT : 0.13.9
sbt-assembly: 0.13.0
I do not think it is a matter of merge strategy, but rather of the libraries and their transitive dependencies you are using.
Who pulls in these dependencies? Which libraries are you using, concretely?
One way to restrict them is to use excludeAll (and similar) in the dependency declaration (see library management in SBT), e.g.:
excludeAll(
  ExclusionRule("commons-beanutils", "commons-beanutils-core"),
  ExclusionRule("commons-collections", "commons-collections"),
  ExclusionRule("commons-logging", "commons-logging"),
  ExclusionRule("org.slf4j", "slf4j-log4j12"),
  ExclusionRule("org.hamcrest", "hamcrest-core"),
  ExclusionRule("junit", "junit"),
  ExclusionRule("org.jboss.netty", "netty"),
  ExclusionRule("com.esotericsoftware.minlog", "minlog")
)
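For reference, excludeAll hangs off a concrete dependency declaration. A minimal sketch (the spark-core module and version here are only illustrative):
libraryDependencies += ("org.apache.spark" %% "spark-core" % "1.4.0")
  .excludeAll(
    ExclusionRule("commons-beanutils", "commons-beanutils-core"),
    ExclusionRule("commons-collections", "commons-collections")
  )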
My original issue was solved with:
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "commons", "collections", xs @ _*) =>
    MergeStrategy.last
  case PathList("mime.types") =>
    MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

Lift RestHelper doesn't receive my request

I tried to create a simple RestHelper "Hello world" example, but I'm having trouble. When I start my container with the container:start command, the
serve {
  case Nil Get _ => Extraction.decompose("Hello Restful world!")
}
is not invoked in my RestHelper extension. I get the following message:
"The Requested URL / was not found on this server"
So, it seems that for some reason Lift ignores the
LiftRules.statelessDispatch.append(Service)
line in the bootstrap.Boot.boot definition. And I have absolutely no clue why that happens.
Here's my Boot class:
package bootstrap

import net.liftweb.http.LiftRules
import com.yac.restfultest.Service

class Boot {
  def boot {
    LiftRules.statelessDispatch.append(Service)
  }
}
And here's Service:
package com.yac.restfultest

import net.liftweb.http.rest.RestHelper
import net.liftweb.json.Extraction

object Service extends RestHelper {
  serve {
    case Nil Get _ => Extraction.decompose("Hello Restful world!")
  }
}
and in case it helps, here's my web.xml:
<!DOCTYPE web-app SYSTEM "http://java.sun.com/dtd/web-app_2_3.dtd">
<web-app>
    <filter>
        <filter-name>LiftFilter</filter-name>
        <display-name>Lift Filter</display-name>
        <description>The Filter that intercepts Lift calls</description>
        <filter-class>net.liftweb.http.LiftFilter</filter-class>
    </filter>
    <filter-mapping>
        <filter-name>LiftFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>
and in case all of the above is not enough, here is my build.sbt:
name := "TestRest"

version := "1.0"

scalaVersion := "2.11.6"

resolvers ++= Seq(
  "snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
  "releases" at "https://oss.sonatype.org/content/repositories/releases"
)

jetty()

libraryDependencies ++= {
  val liftVersion = "2.6.2"
  Seq(
    "net.liftweb" %% "lift-webkit" % liftVersion % "compile"
  )
}
As you can see, it's almost the most minimalistic Lift project setup possible. Still, I can't get it working. Any help would be appreciated.
And here's the sbt log on container:start:
[info] Compiling 1 Scala source to /home/yac/IdeaProjects/TestRest/target/scala-2.11/classes...
[info] Packaging /home/yac/IdeaProjects/TestRest/target/scala-2.11/testrest_2.11-1.0.jar ...
[info] Done packaging.
[info] starting server ...
[success] Total time: 2 s, completed Apr 22, 2015 7:51:25 PM
> 2015-04-22 19:51:25.640:INFO::main: Logging initialized #44ms
2015-04-22 19:51:25.646:INFO:oejr.Runner:main: Runner
2015-04-22 19:51:25.726:INFO:oejs.Server:main: jetty-9.2.1.v20140609
2015-04-22 19:51:29.818:WARN:oeja.AnnotationConfiguration:main: ServletContainerInitializers: detected. Class hierarchy: empty
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2015-04-22 19:51:30.405:INFO:oejsh.ContextHandler:main: Started o.e.j.w.WebAppContext#32e377c7{/,file:/home/yac/IdeaProjects/TestRest/target/webapp/,AVAILABLE}{file:/home/yac/IdeaProjects/TestRest/target/webapp/}
2015-04-22 19:51:30.406:WARN:oejsh.RequestLogHandler:main: !RequestLog
2015-04-22 19:51:30.417:INFO:oejs.ServerConnector:main: Started ServerConnector#7a601e4{HTTP/1.1}{0.0.0.0:8080}
2015-04-22 19:51:30.418:INFO:oejs.Server:main: Started #4848ms
So, it turns out it was the usual noobish inattentiveness. The package containing Boot.scala should be called bootstrap.liftweb, not just bootstrap as in my case.
And yes, as suggested by jcern, it should be "index" :: Nil in the routing pattern.
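For completeness, here is the corrected pair with both fixes applied:
package bootstrap.liftweb // Lift looks for Boot in this package by default

import net.liftweb.http.LiftRules
import com.yac.restfultest.Service

class Boot {
  def boot {
    LiftRules.statelessDispatch.append(Service)
  }
}
and the service, with the root URL matched as "index" :: Nil, since Lift presents a request for / as the path List("index"):
package com.yac.restfultest

import net.liftweb.http.rest.RestHelper
import net.liftweb.json.Extraction

object Service extends RestHelper {
  serve {
    case "index" :: Nil Get _ => Extraction.decompose("Hello Restful world!")
  }
}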