SBT remote caching weird invalidation - scala

meta info
sbt=1.8.0 jdk=Amazon.com Inc. Java 1.8.0_342
I created a simple project using sbt new ... with the following directory structure
.
├── main
│   └── scala
│       └── example
│           ├── A.scala
│           ├── B.scala
│           ├── C.scala
│           ├── D.scala
│           ├── E.scala
│           └── Hello.scala
└── test
    └── scala
        └── example
            └── HelloSpec.scala
build.sbt looks like
import Dependencies._
ThisBuild / scalaVersion := "2.13.8"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
ThisBuild / pushRemoteCacheTo := Some(MavenCache("local-cache", file("/tmp/cache")))
ThisBuild / logLevel := Level.Debug
ThisBuild / incOptions := incOptions.value.withApiDebug(true).withRelationsDebug(true)
lazy val root = (project in file("."))
  .settings(
    name := "helloworldsbt",
    libraryDependencies += scalaTest % Test
  )
I then performed a
sbt "; compile; pushRemoteCache;"
The cache now looks like
[19:26:51] ➜ helloworldsbt cd /tmp/cache/com/example/helloworldsbt_2.13
[19:27:59] ➜ helloworldsbt_2.13 tree
.
├── 0.0.0-55c7c2b9b2d9068c
│   ├── helloworldsbt_2.13-0.0.0-55c7c2b9b2d9068c-cached-test.jar
│   ├── helloworldsbt_2.13-0.0.0-55c7c2b9b2d9068c-cached-test.jar.md5
│   └── helloworldsbt_2.13-0.0.0-55c7c2b9b2d9068c-cached-test.jar.sha1
└── 0.0.0-f0c3847d849b0a58
    ├── helloworldsbt_2.13-0.0.0-f0c3847d849b0a58-cached-compile.jar
    ├── helloworldsbt_2.13-0.0.0-f0c3847d849b0a58-cached-compile.jar.md5
    └── helloworldsbt_2.13-0.0.0-f0c3847d849b0a58-cached-compile.jar.sha1
I then copied my project to another folder (keeping the same project name) and updated just one file in the project.
But on performing
sbt "pullRemoteCache; compile;"
I get a full compilation
[info] welcome to sbt 1.8.0 (Amazon.com Inc. Java 1.8.0_342)
[info] loading project definition from /Users/faiz.halde/p/helloworldsbt/project
[info] loading settings for project root from build.sbt ...
[info] set current project to helloworldsbt (in build file:/Users/faiz.halde/p/helloworldsbt/)
[debug] not up to date. inChanged = true, force = false
[debug] Updating ...
[debug] Done updating
[debug] Other repositories:
[debug] Default repositories:
[debug] Using inline dependencies specified in Scala.
[debug] tried file:/tmp/cache/com/example/helloworldsbt_2.13/0.0.0-a920b6ad175a9a8a/helloworldsbt_2.13-0.0.0-a920b6ad175a9a8a-cached-compile.jar
[info] remote cache artifact not found for com.example:helloworldsbt:0.0.0-a920b6ad175a9a8a Some(cached-compile)
[debug] failed to download Vector(com.example#helloworldsbt_2.13;0.0.0-a920b6ad175a9a8a!helloworldsbt_2.13.jar): local-cache
[debug] Using inline dependencies specified in Scala.
[debug] [NOT REQUIRED] com.example#helloworldsbt_2.13;0.0.0-55c7c2b9b2d9068c!helloworldsbt_2.13.jar
[info] remote cache artifact extracted for com.example:helloworldsbt:0.0.0-55c7c2b9b2d9068c Some(cached-test)
[success] Total time: 1 s, completed Dec 24, 2022 7:30:36 PM
[debug] [zinc] IncrementalCompile -----------
[debug] IncrementalCompile.incrementalCompile
[debug] previous = Stamps for: 0 products, 0 sources, 0 libraries
[debug] current source = Set(${BASE}/src/main/scala/example/D.scala, ${BASE}/src/main/scala/example/C.scala, ${BASE}/src/main/scala/example/B.scala, ${BASE}/src/main/scala/example/E.scala, ${BASE}/src/main/scala/example/Hello.scala, ${BASE}/src/main/scala/example/A.scala)
[debug] > initialChanges = InitialChanges(Changes(added = Set(${BASE}/src/main/scala/example/C.scala, ${BASE}/src/main/scala/example/E.scala, ${BASE}/src/main/scala/example/Hello.scala, ${BASE}/src/main/scala/example/D.scala, ${BASE}/src/main/scala/example/A.scala, ${BASE}/src/main/scala/example/B.scala), removed = Set(), changed = Set(), unmodified = ...),Set(),Set(),API Changes: Set())
[debug] Full compilation, no sources in previous analysis.
[debug] all 6 sources are invalidated
[debug] Created transactional ClassFileManager with tempDir = /Users/faiz.halde/p/helloworldsbt/target/scala-2.13/classes.bak
[debug] [inv] Invalidate package objects by inheritance only...
[debug] Initial set of included nodes:
[debug] [inv] Package object invalidations:
[debug] Recompiling all sources: number of invalidated sources > 50.0 percent of all sources
[debug] About to delete class files:
[debug] We backup class files:
[debug] [inv] ********* Pruned:
[debug] [inv] Relations:
[debug] [inv] products: Relation [ ]
[debug] [inv] library deps: Relation [ ]
[debug] [inv] library class names: Relation [ ]
[debug] [inv] internalDependencies:
[debug] [inv] externalDependencies:
[debug] [inv] class names: Relation [ ]
[debug] [inv] used names: UsedNames [ ]
[debug] [inv] product class names: Relation [ ]
[debug] [inv] *********
[debug] compilation cycle 1
[info] compiling 6 Scala sources to /Users/faiz.halde/p/helloworldsbt/target/scala-2.13/classes ...
Have I misunderstood how remote caching is meant to be used?
Full compilation, no sources in previous analysis
Why doesn't the cache I pulled contain any sources in its previous analysis?
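As far as I can tell, pullRemoteCache only tries ids derived from a content hash of the current sources, so a one-file edit produces a brand-new id (the a920b6ad175a9a8a above) that was never pushed, and zinc starts from an empty previous analysis. A sketch of one way to experiment, assuming the remoteCacheIdCandidates key behaves as the sbt remote-caching docs describe; the literal hash is copied from the push output above purely for illustration:
// Sketch only: remoteCacheIdCandidates lists the ids that pullRemoteCache
// will try in order; by default it holds just the current remoteCacheId.
Compile / remoteCacheIdCandidates := Seq(
  (Compile / remoteCacheId).value, // id computed from the current sources
  "f0c3847d849b0a58"               // a previously pushed id (illustrative)
)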

Related

Define subprojects in another file than build.sbt

I'm trying to define a multi-project build with a sizable number of subprojects:
.
├── build.sbt
├── project/
│   ├── dependencies.scala
│   ├── tasks.scala
│   └── settings.scala
├── lib_1/
│   └── src/
├── ...
└── lib_n/
    └── src/
Those subprojects are currently defined in build.sbt:
val outputJarFolder = "/some/path/"

lazy val commonSettings = /* ... */

lazy val lib_1 = (project in file("lib1"))
  .settings(
    name := "LibOne",
    commonSettings,
    libraryDependencies ++= Seq(scalaTest, jsonLib, scalaXML, commonsIo),
    Compile / packageBin / artifactPath := file(outputJarFolder + "lib1.jar")
  )

// ... more libs ...

lazy val lib_n = (project in file("libn"))
  .settings(
    name := "LibLast",
    commonSettings,
    Compile / packageBin / artifactPath := file(outputJarFolder + "libn.jar")
  )
  .dependsOn(lib_2, lib_12)
How can I define those subprojects in a file other than build.sbt, in order to "unclog" that file? I still want to be able to define them in lexicographic order (so lazy is a must). I'm working with sbt version 1.2.8 and Scala 2.10.
I've tried:
Putting the declarations of those lib_k variables in a Scala file and importing it --> sbt says: "classes cannot be lazy".
Putting those declarations in an object (or in a class instantiated from build.sbt) --> the sbt projects command doesn't list any subproject.
The sbt documentation mentions this, but doesn't emphasize it much (perhaps to avoid encouraging too much variation in how builds are defined, in the absence of a common convention):
The build definition is described in build.sbt (actually any files named *.sbt) in the project’s base directory.
So you can split your build.sbt file into several separate .sbt files in the root of the project with different names.
I also recommend reading the documentation on Organizing the build.
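For instance, a minimal sketch, assuming (per how sbt merges *.sbt files) that lazy vals defined in one .sbt file are visible from build.sbt and the other .sbt files in the base directory:
// subprojects.sbt -- lives next to build.sbt; the file name is arbitrary
lazy val lib_1 = (project in file("lib1"))
  .settings(
    name := "LibOne",
    commonSettings // assumed to be defined in one of the merged files
  )

lazy val lib_2 = (project in file("lib2"))
  .settings(
    name := "LibTwo",
    commonSettings
  )
  .dependsOn(lib_1)
Because the vals stay lazy, they can still be written in lexicographic order regardless of dependency direction.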

Cannot load grand-child routing from child module

Play Version
PlayScala#2.7.2
JDK:
$ java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (Zulu 8.38.0.13-CA-linux64) (build 1.8.0_212-b04)
OpenJDK 64-Bit Server VM (Zulu 8.38.0.13-CA-linux64) (build 25.212-b04, mixed mode)
Expected Behavior
make your sbt project 'root'
add a sub-project 'buildA'
add a sub-project 'appA'
buildA includes appA.routes
-> Yay, you can compose several builds in one root project!!
Actual Behavior
My child project (buildA) can NOT load the grand-child project's (appA) routing.
$ tree -L 2
.
├── build.sbt
├── build-app
│   ├── app
│   ├── build.sbt
│   ├── conf
│   │   ├── application.conf
│   │   ├── logback.xml
│   │   └── routes
├── core
│   ├── app
│   ├── build.sbt
│   ├── conf
│   │   └── core.routes
│   ├── src
└── project
    └── plugins.sbt
build.sbt
lazy val buildApp = project.in(file("build-app"))
lazy val root = project.in(file("."))
build-app/build.sbt
lazy val core = project.in(file("../core")).enablePlugins(PlayScala)

lazy val buildApp = (project in file("."))
  .enablePlugins(PlayScala)
  .dependsOn(core)
build-app/conf/routes
GET / controllers.app.HomeController.index
-> /core core.Routes
core/conf/core.routes
GET / controllers.core.HomeController.index
$ sbt "project buildApp" compile
[error] /home/sizer/go/src/github.com/sizer/hello-sbt-multiproject/build-app/conf/routes:3:1: not found: value core
[error] -> /core core.Routes
Cannot load core.routes :sob:
Am I wrong, or is this the correct behaviour?
My project is below:
https://github.com/sizer/hello-sbt-multiproject/tree/playframework_failedExample
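No answer is recorded here, but for context: sbt generally only honours project definitions from the build it was launched in, so the core defined in build-app/build.sbt is not visible when sbt starts at the repository root, and no core.Routes class is generated onto buildApp's classpath. A hedged sketch of the usual arrangement, defining both Play projects in the single root build.sbt (names follow the tree above):
// root build.sbt -- a sketch, not the repository's actual content:
// with both projects in one build, core/conf/core.routes is compiled
// into a core.Routes class that build-app/conf/routes can reference.
lazy val core = (project in file("core"))
  .enablePlugins(PlayScala)

lazy val buildApp = (project in file("build-app"))
  .enablePlugins(PlayScala)
  .dependsOn(core)

lazy val root = (project in file("."))
  .aggregate(buildApp, core)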

How to exclude libraries from different tasks in sbt?

I have the following project structure:
/
├── project
│   └── ...
├── src
│   └── ...
├── lib
│   ├── prod-lib.jar
│   └── test-lib.jar
└── build.sbt
And I need to compile with test-lib.jar when deploying to a testing environment and with prod-lib.jar when deploying to a production environment.
Both of them expose the same API for the things I need, so my source code compiles against either of them, but they differ subtly in how they execute in the background.
Is there a way to create an sbt task (or maybe something else) that ignores one jar or the other, but still performs the assembly task in both cases?
Put your jars in different folders and set the unmanagedBase key in the Compile and Test scopes accordingly:
> set unmanagedBase in Compile := baseDirectory.value / "lib-compile"
> show compile:assembly::unmanagedBase
[info] /project/foo/lib-compile
> set unmanagedBase in Test := baseDirectory.value / "lib-test"
> show test:assembly::unmanagedBase
[info] /project/foo/lib-test
But don't forget to call assembly task in the corresponding scope then (compile:assembly or test:assembly), because in Global it's still the default:
> show assembly::unmanagedBase
[info] /project/foo/lib
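If you would rather persist this than type it at the console, the equivalent build.sbt lines are (a sketch, reusing the lib-compile and lib-test folder names from above):
unmanagedBase in Compile := baseDirectory.value / "lib-compile"
unmanagedBase in Test := baseDirectory.value / "lib-test"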

Accumulate subprojects' `discoveredMainClasses`

How can I accumulate all the discoveredMainClasses of a project, along with its dependent sub projects in SBT?
For example, I have a project that looks like
├── bar
│   └── src
│       └── main
│           └── scala
│               └── BarMain.scala
├── build.sbt
├── foo
│   └── src
│       └── main
│           └── scala
│               └── FooMain.scala
├── project
│   └── build.properties
└── root
With one root project that aggregate(foo, bar), I get the following for discoveredMainClasses:
[info] foo/compile:discoveredMainClasses
[info] List(MainFoo)
[info] bar/compile:discoveredMainClasses
[info] List(MainBar)
[info] root/compile:discoveredMainClasses
[info] List()
With one root that only dependsOn(foo, bar) I get
> show discoveredMainClasses
[info] *
How can I have show root/discoveredMainClasses contain both MainFoo and MainBar?
For context, I have other tasks that depend on the output of discoveredMainClasses, namely makeBashScripts in native-packager.
The core idea is to create a module that depends on all the sub modules you want to include, and to configure all settings on this module.
This results in a build.sbt like this
lazy val root = project.in(file("."))
  // package the root module, but not the sub modules
  .enablePlugins(JavaAppPackaging)
  .settings(
    name := "application",
    // add the discoveredMainClasses to this project
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (client, Compile)).value,
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (server, Compile)).value
  )
  // include these modules in the resulting package
  .dependsOn(client, server)

lazy val client = project.in(file("client"))
  .settings(
    name := "client"
  )

lazy val server = project.in(file("server"))
  .settings(
    name := "server"
  )
The (discoveredMainClasses in (client, Compile)).value accesses the discoveredMainClasses from the client project in the Compile scope.
You can build and run your applications with
$ sbt universal:stage
$ ./target/universal/stage/bin/client-app
$ ./target/universal/stage/bin/server-app
A running example can be found here.
cheers,
Muki
An alternative to @Muki's answer would be to define a ScopeFilter that includes everything but root and accumulate the main classes that way. This has the advantage of not having to repeat client, server everywhere.
The resulting build.sbt:
lazy val allCompileButRootFilter =
  ScopeFilter(inAggregates(ThisProject, includeRoot = false), inConfigurations(Compile))

lazy val root = project.in(file("."))
  .aggregate(client, server)
  .enablePlugins(JavaAppPackaging)
  .settings(
    discoveredMainClasses in Compile ++=
      discoveredMainClasses.all(allCompileButRootFilter).value.flatten,
    ...
  )
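To verify, the same show command from the question should now include the sub-projects' main classes:
> show root/compile:discoveredMainClasses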

sbteclipse doesn't create a Scala project by default?

I just want to create a directory layout for my Scala project with sbt and sbteclipse. Following is my sbt file:
import com.typesafe.sbteclipse.plugin.EclipsePlugin.EclipseKeys
name := "BGS"
organization := "com.example"
version := "1.0.0"
scalaVersion := "2.9.2"
scalacOptions ++= Seq("-deprecation")
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
EclipseKeys.projectFlavor := EclipseProjectFlavor.Scala
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
libraryDependencies += "org.scalatest" %% "scalatest" % "1.8" % "test"
libraryDependencies += "junit" % "junit" % "4.10" % "test"
unmanagedBase <<= baseDirectory { base => base / "lib" }
unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }
In this sbt file, I had to use the following two lines to force creation of the Scala directories:
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
Furthermore, after running "eclipse" from the sbt console, I imported the project into Eclipse, but I could not create a Scala class. The Eclipse project icon has a "J" attached to it, indicating it is a Java project :-?
Why do sbt and sbteclipse default to Java?
I am running sbt version 0.12 (the latest version as of Nov 2012) and Scala 2.9.2.
For your information, what I am aiming to do is use sbt to create a working project with the following directory structure:
├── build.sbt
├── lib
│   ├── biojava3-core-3.0.4.jar
├── project
│   ├── plugins.sbt
│   ├── project
│   │   └── target
│   └── target
│       ├── config-classes
│       ├── scala-2.9.2
│       └── streams
├── src
│   ├── main
│   │   ├── java
│   │   ├── resources
│   │   └── scala
│   └── test
│       ├── java
│       ├── resources
│       └── scala
├── target
│   ├── scala-2.9.2
│   │   ├── cache
│   │   ├── classes
│   │   └── test-classes
│   └── streams
│       └── compile
└── test
Since I haven't gotten any desirable answers so far, I am turning to the Typesafe Stack. I removed my manually installed scala and sbt, and I am using everything from the Typesafe Stack. I will post an update when I am done testing this approach.
Here is the final update as I promised:
I followed the instructions in this link and it worked perfectly:
http://typesafe.com/resources/typesafe-stack/downloading-installing.html
Detailed information in case somebody wants to follow along on Mac OS X:
Basically, I first switched from MacPorts to Homebrew, following the instructions here:
http://bitboxer.de/2010/06/03/moving-from-macports-to-homebrew/
Then I did:
brew install scala sbt maven giter8
Then I created a Scala project from the command line:
g8 typesafehub/scala-sbt
Finally, I followed the instructions here to add sbteclipse and convert the project to Eclipse:
https://github.com/typesafehub/sbteclipse
Everything works as I expect.
Why do you want sbt to create those directories? Why not just create them yourself? It's a one-time event.
You shouldn't need any of the following lines:
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
EclipseKeys.projectFlavor := EclipseProjectFlavor.Scala
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
I normally just do mkdir -p src/main/scala whenever I start a new project.
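If you want the whole conventional layout in one shot, shell brace expansion covers it (nothing sbt-specific here):
$ mkdir -p src/{main,test}/{scala,java,resources}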