I am in the process of writing a library that does monitoring/OpenTracing, and I am attempting to use sbt-aspectj so that users of the library don't need to manually instrument their code. However, I am currently running into an issue when creating an sbt project for such a library.
The idea is that I want an external library as shown in this sample: https://github.com/sbt/sbt-aspectj/tree/master/src/sbt-test/weave/external, except that the external library itself depends on an external dependency (i.e. akka-actor). Essentially I am trying to combine https://github.com/sbt/sbt-aspectj/tree/master/src/sbt-test/weave/external and https://github.com/sbt/sbt-aspectj/tree/master/src/sbt-test/weave/jar. I have created a sample project at https://github.com/mdedetrich/sbt-aspectj-issue to demonstrate the problem; the relevant part is below:
lazy val root = (project in file("."))
  .enablePlugins(SbtAspectj)
  .settings(
    name := RootName,
    version := Version,
    // add akka-actor as an aspectj input (find it in the update report)
    // aspectjInputs in Aspectj ++= update.value.matching(
    //   moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*")),
    // replace the original akka-actor jar with the instrumented classes in runtime
    // fullClasspath in Runtime := aspectjUseInstrumentedClasses(Runtime).value,
    // only compile the aspects (no weaving)
    aspectjCompileOnly in Aspectj := true,
    // ignore warnings (we don't have the target classes at this point)
    aspectjLintProperties in Aspectj += "invalidAbsoluteTypeName = ignore",
    // replace regular products with compiled aspects
    products in Compile ++= (products in Aspectj).value,
    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-actor" % akkaVersion
    )
  )

lazy val test = (project in file("test"))
  .enablePlugins(SbtAspectj)
  .settings(
    aspectjBinaries in Aspectj ++= update.value.matching(
      moduleFilter(organization = Organization, name = s"$RootName*")),
    aspectjInputs in Aspectj ++= update.value.matching(
      moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*")),
    fullClasspath in Runtime := aspectjUseInstrumentedClasses(Runtime).value,
    // weave this project's classes
    aspectjInputs in Aspectj += (aspectjCompiledClasses in Aspectj).value,
    products in Compile := (products in Aspectj).value,
    products in Runtime := (products in Compile).value,
    libraryDependencies ++= Seq(
      Organization %% RootName % Version
    )
  )
The idea is that we publish the root project using root/publishLocal, and the test project just includes root as a libraryDependency so we can see whether the AspectJ weaving is working properly.
The problem is simply that I am unable to get it working. The current code at https://github.com/mdedetrich/sbt-aspectj-issue publishes with root/publishLocal (not sure if it's correct though), however when I then do test/run I get this:
[info] Weaving 2 inputs with 1 AspectJ binary to /home/mdedetrich/github/sbt-aspectj-issue/test/target/scala-2.13/aspectj/classes...
[error] stack trace is suppressed; run last test / Compile / packageBin for the full output
[error] (test / Compile / packageBin) java.util.zip.ZipException: duplicate entry: META-INF/MANIFEST.MF
[error] Total time: 1 s, completed Dec 29, 2019 4:31:27 PM
sbt:sbt-aspectj-issue>
This seems to be an issue with duplicate akka-actor entries. I tried toggling various entries in build.sbt but didn't manage to get it working.
EDIT: This was also posted as a github issue here https://github.com/sbt/sbt-aspectj/issues/44
Generally, you can exclude the META-INF directories from the external libraries being woven.
mappings in (Compile, packageBin) := {
  (mappings in (Compile, packageBin)).value
    .filterNot(_._2.startsWith("META-INF/"))
}
But for the akka libraries there is another problem. Each akka library contains a reference.conf, which holds the fallback configuration for the features that library provides. These files lead to conflicts just like META-INF did, but they cannot simply be excluded the way META-INF can, because they are essential for akka to work properly.
If you exclude them, you'll have to provide all the required akka configuration in your application.conf, or a merged (not simply concatenated) reference.conf in your project. That is not trivial, and it is subject to change between akka versions.
Another solution would be to weave and repackage the akka libraries individually, so that the reference.conf can be kept in each repackaged library. The project layout and build script will be a bit more complicated, but also easier to maintain if you plan to upgrade to newer versions of akka in the future.
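A rough, untested sketch of that approach, reusing the sbt-aspectj keys and the vals (Organization, RootName, Version, akkaVersion) from the build above; the subproject and artifact names here are made up:

lazy val akkaActorWoven = (project in file("akka-actor-woven"))
  .enablePlugins(SbtAspectj)
  .settings(
    name := "akka-actor-woven",  // made-up artifact name
    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-actor" % akkaVersion,
      Organization %% RootName % Version
    ),
    // pick up the aspects published by the root project
    aspectjBinaries in Aspectj ++= update.value.matching(
      moduleFilter(organization = Organization, name = s"$RootName*")),
    // weave the original akka-actor jar found in the update report
    aspectjInputs in Aspectj ++= update.value.matching(
      moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*")),
    // ship the instrumented classes (akka's reference.conf included)
    // instead of this project's (empty) compile output
    products in Compile := (products in Aspectj).value
  )

Downstream projects would then depend on akka-actor-woven and exclude the original akka-actor jar, so akka's reference.conf ships intact exactly once.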
Related
Following the hints of the post explaining the basics of migrating to Scala.js and this page about cross-compilation, I decided to add cross-compilation to my standalone, dependency-free scala library by making the following changes:
I added a file project/plugins.sbt with the content
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.16")
I added scalaVersion in ThisBuild := "2.11.8" in build.sbt, because otherwise a plain scalaVersion setting left the build on 2.10.
I also added the following to build.sbt to ensure that I can keep the same directory structure, since I don't have any JVM- or JavaScript-specific files:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.crossType(CrossType.Pure).in(file(".")).
  settings(version := "0.1").
  jvmSettings(
    // Add JVM-specific settings here
  ).
  jsSettings(
    // Add JS-specific settings here
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
But now, after publishing the project locally using sbt publish-local, the projects depending on this library no longer work, i.e. they don't see the classes that this library was offering and raise errors.
I looked into .ivy2/local/.../foo/0.1/jars and the JAR went from 1MB to 1KB, so the errors make sense.
However, how can I make sure the JVM jar file is compiled correctly?
Further information
The jar's timestamp does not change anymore; it looks like there had been some miscompilation. I deleted the .ivy2 cache, but now sbt publish-local always finishes successfully yet does not regenerate the files.
Ok I found the solution myself.
I needed to remove the publishLocal := {} from the build, and now all the projects depending on my library work fine.
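For reference, a minimal sketch of what the aggregating root looks like after that change (same fooJS/fooJVM projects as above, with only the publish no-op kept):

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {}  // publishLocal is no longer overridden here
  )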
I have a CLI tool written in Java which can modify some source code based on the parameters passed to it. For example, it can rename an enum value across a whole project.
I want to write an sbt task that runs this tool from my project directory with the given params, like sbt 'enums -rename A B'. My tool can be injected into the project through the sbt dependencies.
I skimmed through the book sbt in Action looking for an answer, but those examples are not this specific.
My build.sbt (far from working):
name := """toolTestWithActivator"""
version := "1.0-SNAPSHOT"
resolvers += "Local Repository" at "file://C:/Users/torcsi/.ivy2/local"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "tool" % "tool_2.11" % "1.0",
  javaJdbc,
  javaEbean,
  cache,
  javaWs
)

val mytool = taskKey[String]("mytool")

mytool := {
  com.my.tool.Main
}
Can sbt handle this type of task/dependency structure, or do I need to do this another way?
sbt is recursive: it compiles the .sbt and .scala files under the project folder and uses those to execute your build (in fact, you can think of sbt as a library that helps you produce builds).
So, since you need your library in order to define a task, it is a dependency of your build.sbt file (and not a dependency of your project).
To declare that build.sbt depends on your library, just create a .sbt file in the project folder, for example:
project/dependencies.sbt
libraryDependencies += "tool" %% "tool" % "1.0"
and in build.sbt add:
val mytool = taskKey[Unit]("mytool")

mytool := {
  com.my.tool.Main.main(Array())
}
Some comments:
Be careful with the scala version used: sbt 0.13 is compiled with scala 2.10, so your library should also be compiled for scala 2.10 (the package should be tool_2.10). The newer sbt 1.0 is compiled with scala 2.12.
I used the %% notation so that sbt appends the expected scala version by itself.
I assumed your CLI tool defines a classic java main method (or the scala equivalent), so the argument is an Array of String (here an empty one) and the return type is Unit (void in java).
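To actually forward parameters from the command line (as in the desired sbt 'enums -rename A B' usage), the plain task above could be turned into an input task; a sketch, assuming the same com.my.tool.Main entry point:

import complete.DefaultParsers._

// replaces the plain taskKey definition above
val mytool = inputKey[Unit]("Run the enum-renaming CLI tool")

mytool := {
  // whitespace-separated arguments typed after the task name,
  // e.g. sbt "mytool -rename A B"
  val args: Seq[String] = spaceDelimited("<arg>").parsed
  com.my.tool.Main.main(args.toArray)
}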
Some reference to understand the solution:
http://www.scala-sbt.org/0.13/docs/Organizing-Build.html
I've been tasked with rewriting an old ant build script to SBT. As it happens, our suite is built up of 3 modules:
A Play 2.3 front-end webserver;
A back-end for retrieving data from various other systems;
A middle module containing some shared classes for database access and business logic.
Below is an excerpt of my Build.scala file:
val sharedSettings = Seq(
  organization := <organization here>,
  version := "1.2.5",
  scalaVersion := "2.11.1",
  libraryDependencies ++= libraries,
  unmanagedJars in Compile ++= baseDirectory.value / "lib",
  unmanagedJars in Compile ++= baseDirectory.value / "src",
  unmanagedJars in Compile ++= baseDirectory.value / "test"
)

lazy val middle = project.settings(sharedSettings: _*)
lazy val back = project.settings(sharedSettings: _*).dependsOn(middle)
However, when I try to compile the source, I get the following error:
bad symbolic reference to scala.reflect.runtime encountered in class file 'ValueConverter.class'. Cannot access term runtime in package scala.reflect. The current classpath may be missing a definition for scala.reflect.runtime, or ValueConverter.class may have been compiled against a version that's incompatible with the one found on the current classpath.
The source code is organized in the following structure:
back
  src
  test
  lib
middle
  src
  test
  lib
front
  src
  test
  lib
Here each lib folder contains some manually maintained libraries (which is why we want to move to sbt).
Any ideas on how to solve this?
In the end, I gave up on trying to get the compiler to pick up the additional unmanaged libraries. Instead, I added the dependencies that were available through sbt as managed dependencies, which works well.
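For illustration, a sketch of what the sharedSettings become once the unmanagedJars lines are dropped in favour of managed dependencies. The scala-reflect entry is shown because the "bad symbolic reference to scala.reflect.runtime" error usually means scala-reflect is missing from the compile classpath; the remaining jars in lib/ would be replaced by whatever published coordinates they correspond to:

val sharedSettings = Seq(
  organization := "<organization here>",  // placeholder from the question
  version := "1.2.5",
  scalaVersion := "2.11.1",
  libraryDependencies ++= Seq(
    // make scala-reflect explicit
    "org.scala-lang" % "scala-reflect" % scalaVersion.value
    // ...plus the published equivalents of the jars previously kept in lib/
  )
)

lazy val middle = project.settings(sharedSettings: _*)
lazy val back = project.settings(sharedSettings: _*).dependsOn(middle)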
I am new to sbt (using sbt.version=0.13.5) and created a multi-project build definition as follows (build.sbt):
name := "hello-app"
version in ThisBuild := "1.0.0"
organization in ThisBuild := "com.jaksky.hello"
scalaVersion := "2.10.4"
ideaExcludeFolders ++= Seq (
".idea",
".idea_modules"
)
lazy val common = (
Project("common",file("common"))
)
lazy val be_services = (
Project("be-services",file("be-services"))
dependsOn(common)
)
My expectation was that sbt would generate the directory layout for the projects (based on the documentation). What actually happened was that only the top-level directories (common and be-services) were generated, each containing just a target folder.
I tried it in batch mode (sbt compile) and in interactive mode - neither generated the expected folder structure, e.g. src/{main,test}/{scala,java,resources}.
So either my expectations are wrong, or there is some problem in my definition, or some special setting or plugin is needed.
Could some more experienced user clarify that, please?
Thanks
As #vptheron correctly points out, sbt does not generate any project directories, with the exception of the target directory when it produces compiled class files.
You might find that functionality in plugins, e.g. np. Also if you use an IDE such as IntelliJ IDEA, creating a new sbt-based project will initialize a couple of directories (such as src).
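If you'd rather not pull in a plugin or an IDE, a small custom task can create the conventional layout; a sketch only (this is not built-in sbt behaviour):

// creates src/{main,test}/{scala,java,resources} under the project's base directory
val mkSourceDirs = taskKey[Unit]("Create the standard sbt source directories")

mkSourceDirs := {
  val base = baseDirectory.value
  val dirs = for {
    conf <- Seq("main", "test")
    lang <- Seq("scala", "java", "resources")
  } yield base / "src" / conf / lang
  IO.createDirectories(dirs)
}

In a multi-project build this setting would need to be added to each subproject (for example via a shared settings Seq) so it can be run as common/mkSourceDirs or be-services/mkSourceDirs.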
I need to build a single jar, including dependencies, for one of my sub-projects so that it can be used as a javaagent.
I have a multi-module sbt project and this particular module is the lowest level one (it's also pure Java).
Can I (e.g. with sbt-onejar, sbt-proguard or sbt assembly) override how the lowest level module is packaged?
It looks like these tools are really designed to be a post-publish step, but I really need a (replacement or additional) published artefact to include the dependencies (but only for this one module).
UPDATE: The Publishing instructions for sbt-assembly cover a single project and don't easily translate into a multi-project build.
People have been publishing fat JARs using sbt-assembly & sbt-release without issues. Here's a blog article from 2011: Publishing fat jar created by sbt-assembly. It boils down to adding addArtifact(Artifact(projectName, "assembly"), sbtassembly.AssemblyKeys.assembly) to your build.sbt (note that the blog is a little out of date; AssemblyKeys is now a member of sbtassembly directly).
For sbt 0.13 and above, I prefer to use build.sbt for multi-projects too, so I'd write it like:
import AssemblyKeys._

lazy val commonSettings = Seq(
  version := "0.1-SNAPSHOT",
  organization := "com.example",
  scalaVersion := "2.10.1"
)

val app = (project in file("app")).
  settings(commonSettings: _*).
  settings(assemblySettings: _*).
  settings(
    artifact in (Compile, assembly) ~= { art =>
      art.copy(`classifier` = Some("assembly"))
    }
  ).
  settings(addArtifact(artifact in (Compile, assembly), assembly).settings: _*)
See Defining custom artifacts:
addArtifact returns a sequence of settings (wrapped in a SettingsDefinition). In a full build configuration, usage looks like:
...
lazy val proj = Project(...)
  .settings( addArtifact(...).settings : _* )
...
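A consuming build could then depend on the published assembly artifact via its classifier; a sketch using the example coordinates from commonSettings above:

// organization, name and version match the example above; adjust to your real coordinates
libraryDependencies += "com.example" %% "app" % "0.1-SNAPSHOT" classifier "assembly"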