I have created a project foo_proj with IntelliJ (using the SBT template) and added a module test_mod to it. The abbreviated directory layout looks like this:
foo_proj
├── src
│   └── main
│       └── scala-2.11
│           └── proj_obj.scala
└── test_mod
    └── src
        └── tmod.scala
The contents of proj_obj.scala are:
package com.base.proj
object proj_obj {
}
I would like to be able to import this object (proj_obj) into the module file tmod.scala, but when I try import com.base.proj, it can't be found.
I am new to Scala, so if I want to use code from the project src directory in other project modules, how else should I be structuring things? Or is there an IntelliJ IDEA configuration setting I need to change?
Edit
The contents of the generated build.sbt are
name := "test_proj"
version := "1.0"
scalaVersion := "2.11.6"
To enable "submodules" (aka a multi-project build), all you need to do is add the following to your build.sbt file (or use a Scala file under the project dir):
lazy val root = project in file(".")
lazy val testModule = project in file("test_mod") dependsOn(root)
Also, you should change the test_mod directory structure: either drop the src dir and put all your sources directly under test_mod, or use the sbt convention: src/main/scala or src/test/scala.
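Putting that together with the generated settings, the whole build.sbt might look like this (a sketch; the val name testModule is arbitrary):

```scala
name := "test_proj"

version := "1.0"

scalaVersion := "2.11.6"

// the root project: its sources live under src/main/...
lazy val root = project in file(".")

// the submodule: dependsOn(root) puts root's compiled classes
// (e.g. com.base.proj.proj_obj) on test_mod's classpath
lazy val testModule = project in file("test_mod") dependsOn root
```

After re-importing the sbt project in IntelliJ, `import com.base.proj.proj_obj` should then resolve inside tmod.scala.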
Related
I'm trying to define a multi-project build with a considerable number of subprojects:
.
├── build.sbt
├── project/
│   ├── dependencies.scala
│   ├── tasks.scala
│   └── settings.scala
├── lib_1/
│   └── src/
├── ...
└── lib_n/
    └── src/
Those subprojects are currently defined in build.sbt:
val outputJarFolder = "/some/path/"
lazy val commonSettings = /* ... */

lazy val lib_1 = (project in file("lib1")).settings(
  name := "LibOne",
  commonSettings,
  libraryDependencies ++= Seq(scalaTest, jsonLib, scalaXML, commonsIo),
  Compile / packageBin / artifactPath := file(outputJarFolder + "lib1.jar")
)

// ... more libs ...

lazy val lib_n = (project in file("libn"))
  .settings(
    name := "LibLast",
    commonSettings,
    Compile / packageBin / artifactPath := file(outputJarFolder + "libn.jar")
  )
  .dependsOn(lib_2, lib_12)
How can I define those subprojects in a file other than build.sbt in order to "unclog" that file? I still want to be able to define them in their lexicographic order (so lazy is a must). I'm working with sbt 1.2.8 and Scala 2.10.
I've tried:
Putting the declarations of those lib_k variables in a Scala file and importing it --> sbt says: "classes cannot be lazy".
Putting those declarations in an object (or in a class instantiated in build.sbt) --> sbt projects doesn't list any subprojects.
The sbt documentation mentions this, but doesn't emphasize it much (perhaps to avoid encouraging too much variation in how builds are defined, in the absence of a common convention):
The build definition is described in build.sbt (actually any files named *.sbt) in the project’s base directory.
So you can split your build.sbt file into several separate .sbt files in the root of the project with different names.
I also recommend reading documentation on Organizing the build.
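For example, the subproject definitions could move into a sibling file such as projects.sbt (a hypothetical name; only the .sbt extension matters), since sbt compiles every *.sbt file in the base directory into one build definition, so lazy vals still allow lexicographic ordering:

```scala
// projects.sbt — lives next to build.sbt in the project root.
// Definitions here are compiled together with build.sbt, so
// vals such as commonSettings stay visible across .sbt files.
lazy val commonSettings = Seq(
  organization := "com.example",  // hypothetical value
  scalaVersion := "2.12.8"
)

lazy val lib_1 = (project in file("lib1"))
  .settings(commonSettings, name := "LibOne")

// project references are lazy, so declaration order is flexible
lazy val lib_2 = (project in file("lib2"))
  .dependsOn(lib_1)
  .settings(commonSettings, name := "LibTwo")
```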
I have a repo with multiple sub-projects, that looks like this:
my-project
├── .idea
├── backend
│   ├── build.sbt
│   ├── src
│   └── ... other Scala subproject files
├── client
│   ├── package.json
│   ├── webpack.config.js
│   └── ... other JS subproject files
├── worker
│   └── ... other Python subproject files
├── Makefile
└── docker-compose.yml
Using IntelliJ IDEA Ultimate, I want to have the whole repo opened in the same window, import ./backend as a "subproject", and be able to install all sbt dependencies. How can I do that?
If I open ./backend as a separate project, IntelliJ imports all correctly, defines a ton of libraries and modules from the build.sbt file, and also re-imports them if I change build.sbt.
But for the shared project it won't import anything. If I manually import an sbt project from ./backend in the Project Structure -> Modules, it switches to the root dir anyway and doesn't import libraries. I can get syntax highlighting and autocompletion for the main library and my own files, but the packages from build.sbt are missing.
You can create a dummy root sbt project in the root folder by adding a separate build.sbt there that uses backend as a subproject:
lazy val backend = (project in file("backend"))
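A minimal root build.sbt might look like this (a sketch; the aggregate call is optional and simply makes sbt commands run from the root also run on backend):

```scala
// build.sbt in the repository root
lazy val backend = (project in file("backend"))

// dummy root project so IntelliJ can import the whole repo
lazy val root = (project in file("."))
  .aggregate(backend)
```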
I am learning Scala with this Coursera course task here that provides an SBT file. I download its objsets.zip here. Then I unzip it, enter the directory, and type sbt and then console. I try to load the file src/main/scala/objsets/TweetSet.scala on the command line, but I get a lot of errors.
scala> :load src/main/scala/objsets/TweetSet.scala
Loading src/main/scala/objsets/TweetSet.scala...
<console>:1: error: illegal start of definition
package objsets
^
<console>:10: error: not found: value TweetReader
import TweetReader._
^
import common._
defined class Tweet
<console>:2: error: illegal start of statement (no modifiers allowed here)
override def toString: String =
^
The course uses the Eclipse Scala IDE, but I would like to learn to use Vim, my favorite editor, for Scala development. I find Eclipse hard to use. So:
How can I load the Scala files into the Scala interpreter on the command line under SBT? Are there suitable tools for developing a Scala project in a text editor such as Vim without having to leave the editor or the command line?
The SBT files and the directory look like this:
$ tree src/
src/
├── main
│   └── scala
│       ├── common
│       │   └── package.scala
│       └── objsets
│           ├── TweetData.scala
│           ├── TweetReader.scala
│           ├── TweetSet.scala
│           └── testing.sc
└── test
    └── scala
        └── objsets
            └── TweetSetSuite.scala

7 directories, 6 files
$ cat build.sbt assignment.sbt
name := course.value + "-" + assignment.value
scalaVersion := "2.11.7"
scalacOptions ++= Seq("-deprecation")
// grading libraries
libraryDependencies += "junit" % "junit" % "4.10" % Test
// for funsets
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4"
// include the common dir
commonSourcePackages += "common"
courseId := "bRPXgjY9EeW6RApRXdjJPw"
course := "progfun1"
assignment := "objsets"
assignmentInfo := AssignmentInfo(
  key = "6PTXvD99EeWAiCIAC7Pj9w",
  itemId = "d1FGp",
  premiumItemId = Some("Ogg05"),
  partId = "7hlkb",
  styleSheet = Some((_: File) / "scalastyle" / "scalastyle_config.xml")
)
:load copies the contents of a file into the REPL line by line. That means that you end up trying to define a package (which is not allowed in the REPL), and then you try to import things that aren't visible, etc. If you use :load on a file that has a format useable by the REPL, it will work. In most cases, this means replacing the package line(s) with imports.
There's no need to use :load anyway. sbt console will place you in a REPL that has the project on its classpath. sbt consoleQuick will place you in a REPL that only has the dependencies on the classpath.
For your second question, you are meant to use sbt as a background process. In your terminal emulator, you'll have one tab running vim on your files, and in the other tab, you'll have sbt. In the tab with sbt, you can run ~compile, which recompiles your code every time you save a file in Vim. This replicates how IDEs show compiler errors/warnings as you type.
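For instance, instead of :load-ing TweetSet.scala, you can rely on sbt console having compiled it already (a sketch using the package and import from the question):

```scala
// inside `sbt console` — the compiled project classes are
// already on the classpath, so no :load is needed
import objsets._
import TweetReader._  // resolves now, because TweetReader.scala was compiled by sbt
```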
How can I accumulate all the discoveredMainClasses of a project, along with its dependent sub projects in SBT?
For example, I have a project that looks like
├── bar
│   └── src
│       └── main
│           └── scala
│               └── BarMain.scala
├── build.sbt
├── foo
│   └── src
│       └── main
│           └── scala
│               └── FooMain.scala
├── project
│   └── build.properties
└── root
With one root project that aggregates foo and bar, I get the following for discoveredMainClasses:
[info] foo/compile:discoveredMainClasses
[info] List(MainFoo)
[info] bar/compile:discoveredMainClasses
[info] List(MainBar)
[info] root/compile:discoveredMainClasses
[info] List()
With one root that only dependsOn(foo, bar) I get
> show discoveredMainClasses
[info] *
How can I have show root/discoveredMainClasses contain both MainFoo and MainBar?
For context, I have other tasks that depend on the output from discoveredMainClasses namely the makeBashScripts in native-packager
The core idea is to create a module that depends on all the sub modules you want to include and to configure all settings on this module.
This results in a build.sbt like this
lazy val root = project.in(file("."))
  // package the root module, but not the sub modules
  .enablePlugins(JavaAppPackaging)
  .settings(
    name := "application",
    // add the discoveredMainClasses to this project
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (client, Compile)).value,
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (server, Compile)).value
  )
  // include these modules in the resulting package
  .dependsOn(client, server)

lazy val client = project.in(file("client"))
  .settings(
    name := "client"
  )

lazy val server = project.in(file("server"))
  .settings(
    name := "server"
  )
The (discoveredMainClasses in (client, Compile)).value accesses the discoveredMainClasses from the client project in the Compile scope.
You can build and run your applications with
$ sbt universal:stage
$ ./target/universal/stage/bin/client-app
$ ./target/universal/stage/bin/server-app
A running example can be found here.
cheers,
Muki
An alternative way to #Muki's answer would be to define a ScopeFilter that includes everything but root and accumulate main classes that way. This has the advantage of not having to repeat client, server everywhere.
The resulting build.sbt:
lazy val allCompileButRootFilter =
ScopeFilter(inAggregates(ThisProject, includeRoot = false), inConfigurations(Compile))
lazy val root = project.in(file("."))
.aggregate(client, server)
.enablePlugins(JavaAppPackaging)
.settings(
discoveredMainClasses in Compile ++=
discoveredMainClasses.all(allCompileButRootFilter).value.flatten,
...
)
I have followed the instructions on SBT's documentation for setting up test configurations. I have three test configurations Test, IntegrationTest, and AcceptanceTest. So my src directory looks like this:
src/
  acceptance/
    scala/
  it/
    scala/
  test/
    scala/
My question is, how can I configure SBT to allow sharing of classes between these configurations? Example: I have a class in the "it" configuration for simplifying database setup and tear down. One of my acceptance tests in the "acceptance" configuration could make use of this class. How do I make that "it" class available to the tests in "acceptance"?
Many thanks in advance.
A configuration can extend another configuration to use that configuration's dependencies and classes. For example, the custom test configuration section shows this definition for the custom configuration:
lazy val FunTest = config("fun") extend(Test)
The extend part means that the compiled normal test sources will be on the classpath for fun sources. In your case, declare the acceptance configuration to extend the it configuration:
lazy val AcceptanceTest = config("acceptance") extend(IntegrationTest)
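A sketch of the relevant build.sbt pieces, assuming the built-in IntegrationTest configuration and a custom acceptance configuration wired to src/acceptance/scala:

```scala
// acceptance extends it, so it-classes are visible to acceptance sources
lazy val AcceptanceTest = config("acceptance") extend(IntegrationTest)

lazy val root = (project in file("."))
  .configs(IntegrationTest, AcceptanceTest)
  .settings(
    Defaults.itSettings,                              // enables src/it/scala
    inConfig(AcceptanceTest)(Defaults.testSettings)   // enables src/acceptance/scala
  )
```

With the extend in place, the classes compiled in the it configuration are on the classpath when compiling and running the acceptance sources.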
In case you want to stick with predefined configurations instead of defining new ones, and since both Test and IntegrationTest extend Runtime (one would expect IntegrationTest to extend Test…), you could use the following:
dependencyClasspath in IntegrationTest := (dependencyClasspath in IntegrationTest).value ++ (exportedProducts in Test).value
This should put all the classes you define in Test on the IntegrationTest classpath.
Edit
I just became aware of a much better solution, thanks to #mjhoy:
lazy val DeepIntegrationTest = IntegrationTest.extend(Test)
An approach is documented here: http://www.scala-sbt.org/release/docs/Detailed-Topics/Testing#additional-test-configurations-with-shared-sources
SBT uses the Maven default directory layout.
It will recognize folders under src/test/scala and compile them along with src/main/scala.
So, if you move the other folders under src/test/scala, SBT will compile them and you can share code between them, e.g.:
└── scala
    ├── acceptance
    │   └── scala
    │       └── Acceptance.scala
    ├── it
    │   └── scala
    │       └── IT.scala
    └── Test.scala
Running sbt test will compile all three files in the directory. So, with this layout, Acceptance can refer to and create a new IT class, for example.