package name not observed by sbt - scala

I compiled a small Scala example of a Spark program called AverageAgeByName.scala
Here is the build.sbt:
$ cd /opt/LearningSparkV2-master/chapter1/main/scala/chapter3
$ vim build.sbt
// Name of the package
name := "main/scala/chapter3"
// Version of our package
version := "1.0"
// Version of Scala
scalaVersion := "2.12.14"
// Spark library dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.1.2",
  "org.apache.spark" %% "spark-sql" % "3.1.2"
)
I ran the command:
$ sbt clean package
[info] Updated file /opt/LearningSparkV2-master/chapter1/main/scala/chapter3/project/build.properties: set sbt.version to 1.5.5
[info] welcome to sbt 1.5.5 (Oracle Corporation Java 1.8.0_242)
[info] loading project definition from /opt/LearningSparkV2-master/chapter1/main/scala/chapter3/project
[info] loading settings for project chapter3 from build.sbt ...
[info] set current project to main/scala/chapter3 (in build file:/opt/LearningSparkV2-master/chapter1/main/scala/chapter3/)
[success] Total time: 0 s, completed Aug 4, 2021 10:07:27 AM
[info] compiling 1 Scala source to /opt/LearningSparkV2-master/chapter1/main/scala/chapter3/target/scala-2.12/classes ...
[success] Total time: 9 s, completed Aug 4, 2021 10:07:36 AM
The resulting class was not placed in proper directory hierarchy according to the package name in build.sbt:
$ ls /opt/LearningSparkV2-master/chapter1/main/scala/chapter3/target/scala-2.12/classes
'AverageAgeByName$$typecreator1$1.class' 'AverageAgeByName$.class' AverageAgeByName.class
It's flat. I expected the classes to be placed in /opt/LearningSparkV2-master/chapter1/main/scala/chapter3/target/scala-2.12/classes/main/scala/chapter3
Where did I get it wrong?

The name := "main/scala/chapter3" in build.sbt has nothing to do with a package or a destination folder: it is the name of the project, used, for instance, when packaging your project as a JAR.
The folder in which the classes are generated is driven by the package you set in your Scala file AverageAgeByName.scala.
The following would generate the class files in target/scala-2.12/classes/xxx/yyy:
package xxx.yyy
class AverageAgeByName {}
Also note that source files are usually placed in a directory structure matching the package you set; that is, src/main/scala/xxx/yyy in the example above.
And last, you should probably not care at all about where the class files are generated.
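For completeness, here is a minimal sketch of how the two settings interact. The project name chapter3 and the package chapter3 are illustrative choices, not anything the original build requires:

// build.sbt
name := "chapter3"        // only affects the artifact, e.g. chapter3_2.12-1.0.jar
version := "1.0"
scalaVersion := "2.12.14"

// src/main/scala/chapter3/AverageAgeByName.scala
package chapter3          // this, not "name", places the classes under classes/chapter3/

object AverageAgeByName {
  def main(args: Array[String]): Unit = ()
}

With that layout, sbt clean package writes the class files to target/scala-2.12/classes/chapter3/ and the JAR to target/scala-2.12/chapter3_2.12-1.0.jar.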

Related

SBT how to add unmanaged jars to subproject tests?

I have an SBT multi-project structure, but for some reason sbt does not recognize the unmanaged jar when tests are run. The objective is to have a stub subproject without an actual implementation and add it as a "Provided" dependency of the MainClass project; at run time it should pick up the jars in the lib folder instead. The actual implementation in the .jar file simply prints a string.
Here is the project structure.
./MainClass
./MainClass/lib/printsomethingexample_2.13-0.1.0-SNAPSHOT.jar
./MainClass/src/test/scala/Main.scala
./MainClass/src/main/scala/Main.scala
./stub/lib/printsomethingexample_2.13-0.1.0-SNAPSHOT.jar
./stub/src/main/scala/test/stub/Example.scala
./build.sbt
File MainClass/src/main/scala/Main.scala
import test.stub.Example

object Main extends App {
  Example.printSomething()
}
File MainClass/src/test/scala/Main.scala
import org.scalatest.flatspec._
import test.stub.Example

class Main extends AnyFlatSpec {
  "Unmanaged jar" should "work" in {
    Example.printSomething()
  }
}
File stub/src/main/scala/test/stub/Example.scala:
package test.stub

object Example {
  def printSomething(): Unit = ???
}
build.sbt:
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.10"

lazy val root = (project in file("."))
  .settings(
    name := "stubExample"
  ).aggregate(MainClass, stub)

lazy val stub = project
  .in(file("stub"))
  .settings(
    name := "stub"
  )

lazy val MainClass = project
  .in(file("MainClass"))
  .settings(
    name := "MainClass",
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.14" % Test,
  ).dependsOn(`stub` % Provided)
When I run sbt "MainClass/runMain Main" it finds the unmanaged jars as expected and prints the string, but when I run sbt "MainClass/test" I get the following error:
[info] Main:
[info] Unmanaged jar
[info] - should work *** FAILED ***
[info] scala.NotImplementedError: an implementation is missing
Does anyone understand why the jars are not found?
*** EDIT ***
Here is the output of sbt "MainClass/runMain Main"
▶ sbt "MainClass/runMain Main"
[info] welcome to sbt 1.6.2 (Azul Systems, Inc. Java 1.8.0_312)
[info] loading global plugins from /Users/riccardo/.sbt/1.0/plugins
[info] loading project definition from /Users/riccardo/Documents/Projects/stubExample/project
[info] loading settings for project root from build.sbt ...
[info] set current project to stubExample (in build file:/Users/riccardo/Documents/Projects/stubExample/)
[info] running Main
Printed Something
The .jar has an implementation of test.stub.Example#printSomething (it comes from a different project):
package test.stub

object Example {
  def printSomething(): Unit = println("Printed Something")
}
My expectation was that, since the stub subproject is marked as Provided, it would be excluded from the runtime classpath, so the printSomething implementation from the jar in lib/ would be used instead of the stub subproject (maybe I don't understand how Provided works).
That is exactly what happens when running the main class as shown above: it uses the printSomething from lib/printsomethingexample_2.13-0.1.0-SNAPSHOT.jar rather than the unimplemented stub. In my real-life scenario, I need the dependency on the stub subproject.
If I remove the dependsOn, the sbt test step works, but the project wouldn't compile without the unmanaged jars in the lib folder, which is why we use a stub subproject as a dependency: just to make the project compile.
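One way to investigate a situation like this (a generic sbt 1.x debugging step, not something from the post above) is to have sbt print the exact classpath each task uses; if the stub's classes directory precedes the unmanaged jar on the test classpath, its unimplemented Example is the one that gets loaded:

sbt "show MainClass/Runtime/fullClasspath"
sbt "show MainClass/Test/fullClasspath"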

SBT not finding resolvers added in another sbt file

I have an SBT project where I have added the resolvers in build.sbt. Now, instead of keeping the resolvers in the build.sbt file, I am trying to put them in a new file, resolvers.sbt.
But sbt is unable to find the artifacts when the resolvers are in the separate file, even though I can see from the startup messages that my resolvers.sbt is being considered.
If I add the resolvers to the global file in the .sbt directory, the dependency is resolved.
sbt version : 1.2.6
Has anyone else faced the same issue?
build.sbt
import sbt.Credentials
name := "sbt-sample"
version := "0.1"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "com.reactore" %% "reactore-infra" % "1.0.0.0-DEV-SNAPSHOT"
)
resolvers.sbt
credentials += Credentials("Artifactory Realm", "192.168.1.120", "yadu", "password")
resolvers ++= Seq(
  "Reactore-snapshots" at "http://192.168.1.120:8182/artifactory/libs-snapshot-local"
)
SBT Log
sbt:sbt-sample> reload
[info] Loading settings for project global-plugins from idea.sbt ...
[info] Loading global plugins from /home/administrator/.sbt/1.0/plugins
[info] Loading settings for project sbt-sample-build from resolvers.sbt ...
[info] Loading project definition from /home/administrator/source/poc/sbt-sample/project
[info] Loading settings for project sbt-sample from build.sbt ...
[info] Set current project to sbt-sample (in build file:/home/administrator/source/poc/sbt-sample/)
I am answering my own question here, hoping that it will be useful for someone else.
Found out the issue: I was keeping resolvers.sbt inside the project directory. I moved it to the project's base directory (where build.sbt is present), and now it is resolving. In hindsight the log above already hinted at this: resolvers.sbt was being loaded for the meta-build project sbt-sample-build, not for sbt-sample itself.
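For reference, a sketch of the working base-directory layout (directory names as in the post; the comments are mine):

./build.sbt
./resolvers.sbt    <- .sbt files here are merged into the same project's settings
./project/         <- .sbt files placed here configure the build definition (meta-build) instead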

Specs2 unable to find my tests using sbt

I have built a simple project for specs2 testing using sbt.
package main.specs

import org.specs2._

class QuickStartSpec extends Specification {
  def is = s2"""
  This is my first specification
    it is working $ok
    really working! $ok
  """
}
And here is my build.sbt file:
name := "QuickStartSpec"
version := "1.0"
scalaVersion := "2.10.1"
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "3.6.5" % "test")
scalacOptions in Test ++= Seq("-Yrangepos")
But when I run this command in sbt
testOnly main.specs.QuickStartSpec
I am getting this:
[info] Updating {file:/Users/nabajeet/workspace/SpecsTest/}specstest...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for test:testOnly
I am following this page to create the example:
https://etorreborre.github.io/specs2/website/SPECS2-3.6.5/quickstart.html
I am unable to figure out the reason why my tests are not detected.
My sbt version is 0.13.8
By declaring
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "3.6.5" % "test")
you restrict the scope of specs2 to the test source directories only. You won't be able to reference specs2 classes from the production code (all the code under src/main/).
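If you genuinely needed specs2 from code under src/main/ (unusual for a test framework), the dependency would have to be declared without the "test" qualifier; this is shown only to illustrate what the qualifier does, not as the fix here:

libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "3.6.5")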
In your comment you indicate that you placed your spec in /Users/nabajeet/workspace/SpecsTest/src/main/specs/quickStartSpec.scala
Try moving your file to /Users/nabajeet/workspace/SpecsTest/src/test/scala/specs/quickStartSpec.scala
The incorrect location is why your spec is not picked up by sbt (and I feel confident saying that it doesn't compile either).
By default, sbt applies Maven's standard directory layout, adding src/main/scala/ and src/test/scala/ for Scala code. This is documented in the sbt tutorial.
I just created a project with the following layout
.
./build.sbt
./src
./src/test
./src/test/scala
./src/test/scala/QuickStartSpec.scala
build.sbt contains
name := "QuickStartSpec"
version := "1.0"
scalaVersion := "2.11.4"
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "3.6.5" % "test")
scalacOptions in Test ++= Seq("-Yrangepos")
and QuickStartSpec.scala contains
package main.specs

import org.specs2._

class QuickStartSpec extends Specification {
  def is = s2"""
  This is my first specification
    it is working $ok
    really working! $ok
  """
}
here is the sbt output I get
sbt
[info] Set current project to QuickStartSpec (in build file:/tmp/stack/)
> test:compile
[info] Updating {file:/tmp/stack/}stack...
[info] Resolving jline#jline;2.12 ...
[info] Done updating.
[info] Compiling 1 Scala source to /tmp/stack/target/scala-2.11/test-classes...
[info] 'compiler-interface' not yet compiled for Scala 2.11.4. Compiling...
[info] Compilation completed in 6.372 s
[success] Total time: 9 s, completed 27 nov. 2015 06:38:26
> test
[info] QuickStartSpec
[info] + This is my first specification
[info] it is working
[info] + really working!
[info]
[info] Total for specification QuickStartSpec
[info] Finished in 17 ms
[info] 2 examples, 0 failure, 0 error
[info]
[info] Passed: Total 2, Failed 0, Errors 0, Passed 2
[success] Total time: 1 s, completed 27 nov. 2015 06:38:31
>

Modifying and Building Spark core

I am trying to make a modification to the Apache Spark source code. I created a new method and added it to the RDD.scala file within the Spark source code I downloaded. After making the modification to RDD.scala, I built Spark using
mvn -Dhadoop.version=2.2.0 -DskipTests clean package
I then created a sample Scala Spark Application as mentioned here
I tried using the new function I created, and I got a compilation error when using sbt to create a jar for my application. How exactly do I compile Spark with my modification and attach the modified jar to my project? The file I modified is RDD.scala within the core project. I run sbt package from the root directory of my Spark application project.
Here is the sbt file:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.0"
Here is the error:
sbt package
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
[info] Set current project to Noah Spark (in build file:/Users/r/Downloads/spark-proj/n-spark/)
[info] Updating {file:/Users/r/Downloads/spark-proj/n-spark/}n-spark...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/r/Downloads/spark-proj/n-spark/target/scala-2.11/classes...
[error] /Users/r/Downloads/spark-proj/n-spark/src/main/scala/SimpleApp.scala:11: value reducePrime is not a member of org.apache.spark.rdd.RDD[Int]
[error] logData.reducePrime(_+_);
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 24 s, completed Apr 11, 2015 2:24:03 AM
UPDATE
Here is the updated sbt file
name := "N Spark"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.apache.spark" % "1.3.0"
I get the following error for this file:
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
/Users/Raggy/Downloads/spark-proj/noah-spark/simple.sbt:7: error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "org.apache.spark" % "1.3.0"
Delete libraryDependencies from build.sbt and just copy the custom-built Spark jar to the lib directory in your application project.
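A rough sketch of that setup, assuming a Spark 1.3.0 build (the assembly jar name and paths below are illustrative and depend on your Maven profile):

$ mkdir -p lib
$ cp /path/to/spark/assembly/target/scala-2.10/spark-assembly-1.3.0-hadoop2.2.0.jar lib/

and the trimmed build.sbt:

name := "N Spark"
version := "1.0"
// must match the Scala version your Spark was built with (2.10 by default for Spark 1.3.0)
scalaVersion := "2.10.4"

sbt treats jars in lib/ as unmanaged dependencies and puts them on the classpath automatically, so sbt package now compiles your application against the modified RDD instead of the published 1.3.0 artifact.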

What does % at the end of "version := "0.1-SNAPSHOT"%" mean?

I see in a build.sbt file:
organization := "wfdf23"
name := "default"
version := "0.1-SNAPSHOT"%
There is a % at the end of the version line. What does it mean?
Update: the file is generated by the sbt np plugin, and sbt loads it without complaint:
➜ np-test sbt np
[info] Loading global plugins from /Users/freewind/.sbt/0.13/plugins
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Set current project to np-test (in build file:/private/tmp/Wfdf23/np-test/)
[info] Generated build file
[info] Generated source directories
[success] Total time: 0 s, completed 2014-9-19 22:05:01
➜ np-test cat build.sbt
organization := "np-test"
name := "default"
version := "0.1-SNAPSHOT"%
➜ np-test sbt about
[info] Loading global plugins from /Users/freewind/.sbt/0.13/plugins
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Set current project to default (in build file:/private/tmp/Wfdf23/np-test/)
[info] This is sbt 0.13.5
[info] The current project is {file:/private/tmp/Wfdf23/np-test/}np-test 0.1-SNAPSHOT
[info] The current project is built against Scala 2.10.4
[info] Available Plugins: sbt.plugins.IvyPlugin, sbt.plugins.JvmPlugin, sbt.plugins.CorePlugin, sbt.plugins.JUnitXmlReportPlugin, np.Plugin, org.sbtidea.SbtIdeaPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.4
➜ np-test cat ~/.sbt/0.13/plugins/np.sbt
resolvers += Resolver.url("sbt-plugin-releases",
url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases/"))(
Resolver.ivyStylePatterns)
addSbtPlugin("me.lessis" % "np" % "0.2.0")
tl;dr The % at the end of those lines is zsh's way of informing you about so-called partial lines, i.e. lines with no \n at the end, for example when cat-ing a file.
It appears you work under the zsh or oh-my-zsh shell (the prompt looks familiar), so the % in your question is a zsh/oh-my-zsh artifact that doesn't actually appear in the files; when you open the files you'll see they simply have no newline at the end.
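You can reproduce the effect in any zsh session; printf (unlike echo) writes no trailing newline:

➜ printf 'version := "0.1-SNAPSHOT"' > build.sbt
➜ cat build.sbt
version := "0.1-SNAPSHOT"%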
p.s. You'd be surprised how my face looked when I noticed it in my terminal after a day wondering about a solution.