sbteclipse doesn't create a Scala project by default?

I just want to create a directory layout for my scala project with sbt and sbteclipse. Following is my sbt file.
import com.typesafe.sbteclipse.plugin.EclipsePlugin.EclipseKeys

name := "BGS"

organization := "com.example"

version := "1.0.0"

scalaVersion := "2.9.2"

scalacOptions ++= Seq("-deprecation")

EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource

EclipseKeys.projectFlavor := EclipseProjectFlavor.Scala

scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")

scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")

libraryDependencies += "org.scalatest" %% "scalatest" % "1.8" % "test"

libraryDependencies += "junit" % "junit" % "4.10" % "test"

unmanagedBase <<= baseDirectory { base => base / "lib" }

unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }
In this sbt file, I had to use the following two lines to force creation of the Scala directories:
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
Furthermore, after running "eclipse" from the sbt console, I imported the project into Eclipse, but I could not create a Scala class. The Eclipse project icon has a "J" attached to it, indicating it is a Java project :-?
Why do sbt and sbteclipse default to Java?
I am running sbt 0.12 (the latest version as of Nov 2012) and Scala 2.9.2.
For your information, what I am aiming for is to use sbt to create a working project with the following directory structure:
├── build.sbt
├── lib
│   ├── biojava3-core-3.0.4.jar
├── project
│   ├── plugins.sbt
│   ├── project
│   │   └── target
│   └── target
│       ├── config-classes
│       ├── scala-2.9.2
│       └── streams
├── src
│   ├── main
│   │   ├── java
│   │   ├── resources
│   │   └── scala
│   └── test
│       ├── java
│       ├── resources
│       └── scala
├── target
│   ├── scala-2.9.2
│   │   ├── cache
│   │   ├── classes
│   │   └── test-classes
│   └── streams
│       └── compile
└── test

Since I haven't received any satisfactory answers so far, I am leaning towards the Typesafe Stack. I removed the manually installed Scala and sbt and am using everything from the Typesafe Stack. I will post an update when I am done testing this approach.
Here is the final update as I promised:
I followed the instructions at this link and it worked perfectly:
http://typesafe.com/resources/typesafe-stack/downloading-installing.html
Detailed information in case somebody wants to follow along on Mac OS X:
Basically, I first switched from MacPorts to Homebrew following the instructions here:
http://bitboxer.de/2010/06/03/moving-from-macports-to-homebrew/
Then I did:
brew install scala sbt maven giter8
Then I created a Scala project from the command line:
g8 typesafehub/scala-sbt
Finally, I followed the instructions here to add sbteclipse and convert the project for Eclipse:
https://github.com/typesafehub/sbteclipse
Everything works as I expect.

Why do you want sbt to create those directories? Why not just create them yourself? It's a one-time event.
You shouldn't need any of the following lines:
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
EclipseKeys.projectFlavor := EclipseProjectFlavor.Scala
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
I normally just do mkdir -p src/main/scala whenever I start a new project.
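For reference, a minimal setup along those lines needs only the plugin in project/plugins.sbt plus an ordinary build.sbt. The sketch below is an assumption for the sbt 0.12 era (the sbteclipse version may need adjusting), not taken from the question; note that sbt 0.12 requires a blank line between settings in a .sbt file:
// project/plugins.sbt (plugin version is an assumption; adjust to the current release)
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.1.0")

// build.sbt -- nothing Eclipse-specific is needed here;
// sbteclipse generates a Scala-flavoured project by default
name := "BGS"

organization := "com.example"

version := "1.0.0"

scalaVersion := "2.9.2"

libraryDependencies += "org.scalatest" %% "scalatest" % "1.8" % "test"
After mkdir -p src/main/scala src/test/scala, running the eclipse task and importing the result should give Eclipse a Scala project, provided the Scala IDE plugin is installed in Eclipse.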

Related

Scala share resources folder to open a file

This is my folder structure, and I am trying to load the "grammar.txt" file from the resources folder, but I get a not-found error.
val source = Source.fromResource("grammar.txt")
Folder structure:
➜ cfg-tools tree -L 4
.
├── build.sbt
├── src
│   ├── main
│   │   └── scala
│   │       ├── Builer.scala
│   │       ├── Driver.scala
│   │       ├── tokens.scala
│   │       └── Tools.scala
│   ├── resources
│   │   └── grammar.txt
build.sbt
name := "cfg-tools"
version := "0.1"
scalaVersion := "3.0.2"
Compile / unmanagedResourceDirectories += sourceDirectory.value / "resources"
You don't need the custom sbt configuration: just use the standard place for resources, which is src/main/resources (note that it is inside the main subfolder, unlike your current structure).
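For illustration, a minimal sketch assuming grammar.txt is moved to src/main/resources (the GrammarLoader object and the use of scala.util.Using are my own additions, not from the question):
import scala.io.Source
import scala.util.Using

object GrammarLoader {
  // src/main/resources is on the classpath by default, so no extra
  // sbt configuration is needed for fromResource to find the file.
  def loadGrammar(): List[String] =
    Using.resource(Source.fromResource("grammar.txt"))(_.getLines().toList)
}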

How to publish Test Only Objects in a sbt project

I have been developing a common library for my team, where I need to provide mock data so that end users can write unit tests. Ideally, the mock object should only be available to the tests of packages that reference mine, but I am not sure how to do this.
My package structure is:
├── common
│   ├── src
│   │   ├── main
│   │   │   ├── resources
│   │   │   └── scala
│   │   └── test
│   │       ├── resources
│   │       └── scala
│   │           └── MockData.scala // <--- object defined here
├── build.sbt
In my build.sbt, I have
Test / publishArtifact := true
Test / publish := true
packageBin / publishArtifact := true
And I use sbt clean; sbt compile; sbt publishLocal to publish my library locally.
In the project referencing the above library, I added the following to the build.sbt:
ThisBuild / libraryDependencies ++= Seq(
  "org.my" %% "common" % "0.0.1",
  "org.my" %% "common" % "0.0.1" % Test,
)
but when writing tests, I cannot find the objects defined in MockData.scala.
Please provide some hints, much appreciated.
------------------ UPDATE ------------------
After googling around, I decided to write a separate module for publishing test data only (a sketch of the corresponding build definition follows the tree below). So my package structure becomes:
├── common
│   ├── src
│   │   ├── main
│   │   │   ├── resources
│   │   │   └── scala
│   │   └── test
│   │       ├── resources
│   │       └── scala
├── common-testkit
│   ├── src
│   │   └── main
│   │       ├── resources
│   │       └── scala
│   │           └── MockData.scala // <--- object defined here
├── build.sbt
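For completeness, a rough sketch of the build.sbt this layout implies (module and organization names are taken from the question; treat it as an assumption, not the exact build):
lazy val common = (project in file("common"))
  .settings(
    organization := "org.my",
    name := "common",
    version := "0.0.1"
  )

// The mock data lives in the *main* sources of this module, so
// publishLocal produces an ordinary artifact that downstream projects
// can pull in with a plain Test-scoped dependency, e.g.
//   "org.my" %% "common-testkit" % "0.0.1" % Test
lazy val commonTestkit = (project in file("common-testkit"))
  .dependsOn(common)
  .settings(
    organization := "org.my",
    name := "common-testkit",
    version := "0.0.1"
  )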
The issue is in the way you ask sbt to retrieve the test code in your other project.
"org.my" %% "common" % "0.0.1" % Test means depending on the "main" code of the common project when running the tests of your other project. That's what the Test scope (after the version) means.
What you want is to depend on the "test code" of the common project when running your tests. This is done by specifying what is called a "classifier" in sbt:
"org.my" %% "common" % "0.0.1" % Test classifier "tests"

how can I interop kotlin code in an existing SBT scala project

I have an existing Scala project using sbt which has several modules.
I'd like to start adding new modules in Kotlin - I don't require the ability to add Kotlin to existing modules (but it would be nice if possible).
I can create new dedicated modules for the new Kotlin code if that is a necessity, as long as it is possible for the existing Scala code to call out to the newly added Kotlin modules (vice versa would be nice to have, but I can live without "Kotlin calling Scala" if that is impossible).
Is this a feasible and practical thing to do? If so, how would it be done?
.
├── build.sbt
............
├── Module1ScalaWithJava (EXISTING)
│   ├── src
│   │   ├── main
│   │   │   ├── java
│   │   │   ├── resources
│   │   │   └── scala
├── Module2ScalaOnly (EXISTING)
│   ├── src
│   │   ├── main
│   │   │   └── scala
│   │   └── test
│   │       └── scala
├── NewModuleKotlinOnly (I WANT THIS)
│   ├── src
│   │   ├── main
│   │   │   └── ???KOTLIN????
As mentioned in the comments, you can add a Kotlin module using kotlin-plugin.
Add this line to your project/plugins.sbt file (or create it):
addSbtPlugin("com.hanhuy.sbt" % "kotlin-plugin" % "2.0.0")
You will then be able to add Kotlin modules to your sbt project. I would advise defining all modules in a single build.sbt file; I will demonstrate how to do that below.
I created a simple multi-module project with Scala and Kotlin modules that depend on each other.
Here is my build.sbt:
name := "kotlin-scala"
version := "0.1"
scalaVersion := "2.13.4"
lazy val scalaFirst =
project
.in(file("scala-first"))
lazy val kotlinFirst =
project
.in(file("kotlin-first"))
.settings(
libraryDependencies ++= Seq(
"org.junit.jupiter" % "junit-jupiter-api" % "5.7.0"
).map(_ % Test)
)
lazy val scalaSecond =
project
.in(file("scala-second"))
.dependsOn(kotlinFirst % "compile->compile;test->test")
.settings(
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.2.3"
).map(_ % Test)
)
lazy val kotlinSecond =
project
.in(file("kotlin-second"))
.dependsOn(scalaFirst % "compile->compile;test->test")
.settings(
libraryDependencies ++= Seq(
"org.junit.jupiter" % "junit-jupiter-api" % "5.7.0"
).map(_ % Test)
)
lazy val kotlinScalaSubmodule =
project
.in(file("kotlin-scala-submodule"))
.dependsOn(kotlinFirst % "compile->compile;test->test")
.dependsOn(scalaFirst % "compile->compile;test->test")
.settings(
libraryDependencies ++= Seq(
"org.junit.jupiter" % "junit-jupiter-api" % "5.7.0"
).map(_ % Test)
)
lazy val scalaKotlinSubmodule =
project
.in(file("scala-kotlin-submodule"))
.dependsOn(scalaFirst % "compile->compile;test->test")
.dependsOn(kotlinFirst % "compile->compile;test->test")
.settings(
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.2.3"
).map(_ % Test)
)
build.properties contains:
sbt.version = 1.3.2
My project structure:
Here I have some dependencies between modules:
scala-second depends on kotlin-first,
kotlin-second depends on scala-first,
scala-kotlin-submodule depends on scala-first and kotlin-first,
kotlin-scala-submodule depends on kotlin-first and scala-first.
You can find the full project on GitHub.
I also wrote some unit tests to demonstrate that it works properly.
Tested on:
Java (OpenJDK) 1.8,
Scala 2.13.4,
sbt 1.3.2.
IntelliJ IDEA build 2020.2.3 with the JetBrains Scala plugin works properly with this project.
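For illustration, calling Kotlin from Scala then looks like calling any other JVM class. A minimal sketch (Greeter is a hypothetical Kotlin class assumed to live in the kotlin-first module; it is not part of the linked project):
// scala-second/src/main/scala/ScalaCallsKotlin.scala
object ScalaCallsKotlin {
  def main(args: Array[String]): Unit = {
    // Greeter is compiled by the Kotlin plugin to ordinary JVM bytecode,
    // so Scala can instantiate and call it directly.
    val greeter = new Greeter()
    println(greeter.greet("from Scala"))
  }
}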

Cannot load grand-child routing from child module

Play Version
PlayScala#2.7.2
JDK (Oracle 1.8.0_72, OpenJDK 1.8.x, Azul Zing)
$ java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (Zulu 8.38.0.13-CA-linux64) (build 1.8.0_212-b04)
OpenJDK 64-Bit Server VM (Zulu 8.38.0.13-CA-linux64) (build 25.212-b04, mixed mode)
Expected Behavior
Make your sbt project the 'root'.
Add a sub-project 'buildA'.
Add a sub-project 'appA'.
buildA includes appA.routes.
-> Yay, you can have several builds in one root project!!
Actual Behavior
My child project (buildA) can NOT load the grand-child project's (appA) routing.
$ tree -L 2
.
├── build.sbt
├── build-app
│   ├── app
│   ├── build.sbt
│   ├── conf
│   │   ├── application.conf
│   │   ├── logback.xml
│   │   └── routes
├── core
│   ├── app
│   ├── build.sbt
│   ├── conf
│   │   └── core.routes
│   ├── src
├── project
│   └── plugins.sbt
build.sbt
lazy val buildApp = project.in(file("build-app"))
lazy val root = project.in(file("."))
build-app/build.sbt
lazy val core = project.in(file("../core")).enablePlugins(PlayScala)
lazy val buildApp = (project in file("."))
  .enablePlugins(PlayScala)
  .dependsOn(core)
build-app/conf/routes
GET / controllers.app.HomeController.index
-> /core core.Routes
core/conf/core.routes
GET / controllers.core.HomeController.index
$ sbt "project buildApp" compile
[error] /home/sizer/go/src/github.com/sizer/hello-sbt-multiproject/build-app/conf/routes:3:1: not found: value core
[error] -> /core core.Routes
Can NOT load core.routes :sob:
Am I wrong, or is this the correct behaviour?
My project is below:
https://github.com/sizer/hello-sbt-multiproject/tree/playframework_failedExample

Accumulate subprojects' `discoveredMainClasses`

How can I accumulate all the discoveredMainClasses of a project, along with those of its dependent subprojects, in sbt?
For example, I have a project that looks like
├── bar
│   └── src
│       └── main
│           └── scala
│               └── BarMain.scala
├── build.sbt
├── foo
│   └── src
│       └── main
│           └── scala
│               └── FooMain.scala
├── project
│   └── build.properties
└── root
With one root project that aggregates the others via aggregate(foo, bar), I get the following for discoveredMainClasses:
[info] foo/compile:discoveredMainClasses
[info] List(MainFoo)
[info] bar/compile:discoveredMainClasses
[info] List(MainBar)
[info] root/compile:discoveredMainClasses
[info] List()
With one root that only uses dependsOn(foo, bar), I get
> show discoveredMainClasses
[info] *
How can I have show root/discoveredMainClasses contain both MainFoo and MainBar?
For context, I have other tasks that depend on the output of discoveredMainClasses, namely makeBashScripts in native-packager.
The core idea is to create a module that depends on all the sub-modules you want to include and to configure all settings on this module.
This results in a build.sbt like this:
lazy val root = project.in(file("."))
  // package the root module, but not the sub modules
  .enablePlugins(JavaAppPackaging)
  .settings(
    name := "application",
    // add the discoveredMainClasses to this project
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (client, Compile)).value,
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (server, Compile)).value
  )
  // include these modules in the resulting package
  .dependsOn(client, server)

lazy val client = project.in(file("client"))
  .settings(
    name := "client"
  )

lazy val server = project.in(file("server"))
  .settings(
    name := "server"
  )
The (discoveredMainClasses in (client, Compile)).value accesses the discoveredMainClasses from the client project in the Compile scope.
You can build and run your applications with
$ sbt universal:stage
$ ./target/universal/stage/bin/client-app
$ ./target/universal/stage/bin/server-app
A running example can be found here.
cheers,
Muki
An alternative to @Muki's answer would be to define a ScopeFilter that includes everything but root, and accumulate the main classes that way. This has the advantage of not having to repeat client, server everywhere.
The resulting build.sbt:
lazy val allCompileButRootFilter =
  ScopeFilter(inAggregates(ThisProject, includeRoot = false), inConfigurations(Compile))

lazy val root = project.in(file("."))
  .aggregate(client, server)
  .enablePlugins(JavaAppPackaging)
  .settings(
    discoveredMainClasses in Compile ++=
      discoveredMainClasses.all(allCompileButRootFilter).value.flatten,
    ...
  )