Following the hints of the post explaining the basics of migrating to Scala.js and this page about cross-compilation, I decided to add cross-compilation to my standalone, dependency-free Scala library by making the following changes:
I added a file project/plugins.sbt with the content
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.16")
I added scalaVersion in ThisBuild := "2.11.8" to build.sbt, because otherwise a plain scalaVersion setting left the build on 2.10.
I also added the following to build.sbt to keep the same directory structure, since I don't have any JVM- or JavaScript-specific files:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.crossType(CrossType.Pure).in(file(".")).
  settings(version := "0.1").
  jvmSettings(
    // Add JVM-specific settings here
  ).
  jsSettings(
    // Add JS-specific settings here
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
But now, after publishing the project locally with sbt publish-local, the projects depending on this library no longer work: they cannot see the classes the library used to offer and raise compilation errors.
I looked into .ivy2/local/.../foo/0.1/jars and the JAR went from 1MB to 1KB, so the errors make sense.
However, how can I make sure the JVM jar file is built correctly?
Further information
The jar's timestamp no longer changes, so it looks like there was some miscompilation. I deleted the .ivy2 cache, but now sbt publish-local always finishes successfully without regenerating the files.
OK, I found the solution myself.
I needed to remove the publishLocal := {} from the build, and now all the projects depending on my library work fine.
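For reference, a minimal sketch of the root definition after the fix, assuming publish := {} is kept so that the empty aggregate root is still not published remotely:

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {}  // the aggregate root itself publishes nothing remotely
  )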
Related
I have the following build.sbt with two sub-projects. Everything compiles and runs fine.
One is a thin Scala Play project; dataextractor has a lot of utility classes I want to call from the Play project.
However, the configuration below still leads to the following compilation error:
[error] /Users/foo.bar/_vws/com.corp.enablement.scripts/sirf_extract_serve/tools_sirf_server/app/corp/tools/es_result_server/service/SystemInitializer.scala:6:21:
[error] object dataextraction is not a member of package corp.tools
[error] import corp.tools.dataextraction.providers.confluence
This is my first sbt multi-project, so advice would be genuinely appreciated.
lazy val tools_dataextractor = (project in file("tools_dataextractor")).settings(
  Defaults.itSettings,
  libraryDependencies += scalatest % "it,test",
  name := "corp_tools_dataextractor",
  version := "0.1",
  mainClass in Compile := Some("corp.tools.ExtractionStartUp")
)

lazy val tools_sirf_server = (project in file("tools_sirf_server"))
  .settings()
  .enablePlugins(PlayScala)
  .dependsOn(tools_dataextractor)

lazy val root = (project in file("."))
  .aggregate(tools_dataextractor, tools_sirf_server)
The configuration looks good.
Two possibilities for what the problem might be:
You are in an sbt console and haven't reloaded the build after changing build.sbt
You work with IntelliJ and did not re-import the sbt project
If this does not help, update your question with the exact steps you take.
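For the first case, re-reading the build definition from the sbt console looks like this:

> reload
> compile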
OK, the answer is a neophyte error.
I had a build.sbt in the root and a build.sbt in each sub-project (which is permissible).
Everything built fine... until I started adding dependencies from one sub-project to another. In that case, the dependsOn in the root build.sbt is ignored and compilation errors occur.
Side note: the main reason for keeping the sub-project build.sbt files was simply laziness. It took half a day to get everything working in a single build.sbt at the root level, but it is definitely worth the effort; a sketch of the result follows.
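A minimal sketch of the consolidated root-level build.sbt, using the same project names as above; the key point is that the sub-projects no longer have build.sbt files of their own:

lazy val tools_dataextractor = (project in file("tools_dataextractor"))
  .settings(
    name := "corp_tools_dataextractor",
    version := "0.1"
  )

lazy val tools_sirf_server = (project in file("tools_sirf_server"))
  .enablePlugins(PlayScala)
  .dependsOn(tools_dataextractor)  // honored now that no sub-project build.sbt overrides it

lazy val root = (project in file("."))
  .aggregate(tools_dataextractor, tools_sirf_server)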
I've been tasked with rewriting an old Ant build script in sbt. As it happens, our suite is built up of three modules:
A Play 2.3 front-end webserver;
A back-end for retrieving data from various other systems;
A middle module containing some shared classes for database access and business logic.
An excerpt of my Build.scala file can be found below:
val sharedSettings = Seq(
  organization := <organization here>,
  version := "1.2.5",
  scalaVersion := "2.11.1",
  libraryDependencies ++= libraries,
  unmanagedJars in Compile ++= baseDirectory.value / "lib",
  unmanagedJars in Compile ++= baseDirectory.value / "src",
  unmanagedJars in Compile ++= baseDirectory.value / "test"
)
lazy val middle = project.settings(sharedSettings: _*)
lazy val back = project.settings(sharedSettings: _*).dependsOn(middle)
However, when I try to compile the source, I get the following error:
bad symbolic reference to scala.reflect.runtime encountered in class file 'ValueConverter.class'. Cannot access term runtime in package scala.reflect. The current classpath may be missing a definition for scala.reflect.runtime, or ValueConverter.class may have been compiled against a version that's incompatible with the one found on the current classpath.
The source code is organized in the following structure:
back/
  src/
  test/
  lib/
middle/
  src/
  test/
  lib/
front/
  src/
  test/
  lib/
Here each lib folder contains some manually maintained libraries (which is why we want to move to sbt).
Any ideas on how to solve this?
In the end, I gave up on trying to get the compiler to pick up the additional libraries. Instead, I added those dependencies that were available from public repositories as regular sbt-managed dependencies. This works well.
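For illustration, a sketch of what that looks like: the val libraries referenced in sharedSettings above, with placeholder coordinates rather than the actual contents of our lib folders:

val libraries = Seq(
  "com.typesafe.play" %% "play-json" % "2.3.0",          // placeholder coordinates
  "org.scalatest"     %% "scalatest" % "2.2.1" % "test"  // placeholder coordinates
)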
I am new to sbt (using sbt.version=0.13.5) and created a multi-project build definition as follows (build.sbt):
name := "hello-app"
version in ThisBuild := "1.0.0"
organization in ThisBuild := "com.jaksky.hello"
scalaVersion := "2.10.4"
ideaExcludeFolders ++= Seq(
  ".idea",
  ".idea_modules"
)

lazy val common = (
  Project("common", file("common"))
)

lazy val be_services = (
  Project("be-services", file("be-services"))
    dependsOn(common)
)
My expectation was that sbt would generate the directory layout for the projects (based on the documentation). What actually happened was that only the top-level directories (common and be-services) were generated, each containing just a target folder.
I tried batch mode (sbt compile) and interactive mode; neither generated the expected folder structure, e.g. src/{main,test}/{scala,java,resources}.
So either my expectations are wrong, or there is some problem in my definition, or some special setting or plugin is needed.
Could some more experienced user clarify this, please?
Thanks
As @vptheron correctly points out, sbt does not generate any project directories, with the exception of the target directory when it produces compiled class files.
You might find that functionality in plugins, e.g. np. Also, if you use an IDE such as IntelliJ IDEA, creating a new sbt-based project will initialize a couple of directories (such as src).
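If you want neither a plugin nor an IDE for this, a minimal sketch of a custom task in build.sbt can create the layout; mkdirs here is a made-up key, not a built-in sbt setting:

lazy val mkdirs = taskKey[Unit]("Creates the standard source directory layout")

mkdirs := {
  // create src/{main,test}/{scala,java,resources} under this project's base directory
  for {
    conf <- Seq("main", "test")
    kind <- Seq("scala", "java", "resources")
  } IO.createDirectory(baseDirectory.value / "src" / conf / kind)
}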
I'm experiencing a problem when running the latest Play Framework 2.3.
It compiles just fine, but when I do activator run, this error happens:
java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
Full error log
I explicitly set scalaVersion in every build.sbt file, and it is the same everywhere.
I tried several things like activator clean, fully removing the sbt caches and the local sbt repository, and updating dependencies to their latest versions, but with no success.
My current dependencies are:
I tried both %% and forcing _2.11 in the name of the dependency.
Dependency List
Other important files
build.sbt
Common.scala
Dependencies.scala
When I fully clean the caches, it downloads Scala 2.10.4 for no reason:
in this download log it says that sbt needs the old Scala.
Any idea why this is?
Firstly, it does not matter which version of Scala is used to run your build. The build system can use version 2.10.4, and that does not prevent your code from using version 2.11.1.
To get at the issue with your Scala version, you should know that settings added directly in build.sbt apply to the root project, but not to other projects. You can see this with a minimal project such as:
$ cat build.sbt
scalaVersion := "2.11.1"
lazy val subproj = project in (file("subproj"))
Then the output of sbt looks like this:
> show scalaVersion
[info] subproj/*:scalaVersion
[info] 2.10.4
[info] sbttest/*:scalaVersion
[info] 2.11.1
So, how can this be fixed?
We can create a lazy val containing a Seq[Setting[_]], and add it to the sub-project definition with the settings() method.
Here's an example build.sbt which adds the same settings to all projects:
$ cat build.sbt
lazy val root = project.in(file("."))
  .aggregate(subproj)
  .dependsOn(subproj)
  .settings(commonSettings: _*)

lazy val subproj = project.in(file("subproj"))
  .settings(commonSettings: _*)

lazy val commonSettings = Seq(
  scalaVersion := "2.11.1"
)
The _* is required because settings() is defined as requiring Setting[_]* rather than Seq[Setting[_]], so _* converts between the two types. See also: What does param: _* mean in Scala?
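As a quick plain-Scala illustration of the splat notation (f is a made-up function, not part of sbt):

def f(xs: Int*): Int = xs.sum  // a variadic function

f(Seq(1, 2, 3): _*)  // the splat passes the Seq as varargs; returns 6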
And the output from sbt is:
> show scalaVersion
[info] subproj/*:scalaVersion
[info] 2.11.1
[info] root/*:scalaVersion
[info] 2.11.1
This approach can easily be applied to your build.
You're defining your dependencies in the wrong way.
When using SBT, don't use:
"org.mydep" % "mydep_2.11" % "1.0.0"
Instead use:
"org.mydep" %% "mydep" % "1.0.0"
The %% operator automatically appends the Scala binary version of the current build to the artifact name. If you use dependencies built against other Scala versions, you're going to get weird errors like the one above.
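For example, with scalaVersion := "2.11.1", the two declarations below resolve to the same artifact:

"org.mydep" %% "mydep" % "1.0.0"
// is equivalent to
"org.mydep" % "mydep_2.11" % "1.0.0"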
Also make sure you explicitly specify your Scala version somewhere:
scalaVersion := "2.11.1"
The problem was related to some manually added jars that I discovered in the lib folder, which were conflicting with the dependencies of the project defined in build.sbt.
The only way to find this out was to generate an activator dist and look for similar dependencies with different versions, since the dependency graph showed no conflicts.
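As a side note, the jars sbt picks up from the lib folder can also be listed from the sbt console, which may expose such conflicts without building a dist (sbt 0.13 syntax):

> show compile:unmanagedJars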
I need to build a single jar, including dependencies, for one of my sub-projects so that it can be used as a javaagent.
I have a multi-module sbt project and this particular module is the lowest level one (it's also pure Java).
Can I (e.g. with sbt-onejar, sbt-proguard or sbt assembly) override how the lowest level module is packaged?
It looks like these tools are really designed to be a post-publish step, but I really need a (replacement or additional) published artefact to include the dependencies (but only for this one module).
UPDATE: The Publishing instructions for sbt-assembly cover a single project, and don't easily translate to a multi-project build.

The Publishing instructions for sbt-assembly cover a single project, and don't easily translate to a multi-project build.
People have been publishing fat JARs using sbt-assembly and sbt-release without issues. Here's a blog article from 2011: Publishing fat jar created by sbt-assembly. It boils down to adding addArtifact(Artifact(projectName, "assembly"), sbtassembly.AssemblyKeys.assembly) to your build.sbt (note that the blog is a little out of date: AssemblyKeys is now a member of sbtassembly directly).
For sbt 0.13 and above, I prefer to use build.sbt for multi-projects too, so I'd write it like this:
import AssemblyKeys._

lazy val commonSettings = Seq(
  version := "0.1-SNAPSHOT",
  organization := "com.example",
  scalaVersion := "2.10.1"
)

val app = (project in file("app")).
  settings(commonSettings: _*).
  settings(assemblySettings: _*).
  settings(
    artifact in (Compile, assembly) ~= { art =>
      art.copy(`classifier` = Some("assembly"))
    }
  ).
  settings(addArtifact(artifact in (Compile, assembly), assembly).settings: _*)
See Defining custom artifacts:
addArtifact returns a sequence of settings (wrapped in a SettingsDefinition). In a full build configuration, usage looks like:
...
lazy val proj = Project(...)
  .settings(addArtifact(...).settings: _*)
...
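Once published, other builds can depend on the fat jar by its classifier; a sketch assuming the com.example organization and version from the snippet above:

libraryDependencies += "com.example" %% "app" % "0.1-SNAPSHOT" classifier "assembly"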