How does IntelliJ (or sbt-structure) identify test sources? - scala

Background
I've created template-scala-project which, among other amenities, defines configurations FunctionalTest and AcceptanceTest (in addition to IntegrationTest, which is provided by SBT).
This way, the relevant bits of the directory structure are:
src/main/scala - compile sources - Compile
src/test/scala - test sources - Test
src/it/scala - it sources - IntegrationTest
src/ft/scala - ft sources - FunctionalTest
src/at/scala - at sources - AcceptanceTest
Behavior in SBT
I can run only functional tests, for example, like this:
ft:test
Everything works as I've planned. I can even share test sources with it sources, ft sources, or at sources... which is a common practical requirement.
Behavior in IntelliJ
IntelliJ recognizes that src/test/scala and src/it/scala are test sources. IntelliJ does not make any distinction between them, I mean: no distinction between test sources and integration test sources... but that's OK. All I need is for src/it/scala to be recognized as test sources, and it works like that.
However, IntelliJ does not recognize src/ft/scala as test sources; IntelliJ does not recognize src/at/scala as test sources.
I have inspected the XML file produced by sbt-structure but I was unable to understand the pattern or logic behind it. Apparently, though, src/ft/scala and src/at/scala need to appear under <configuration id="test"> in order to be considered test sources.
Question
In order to test my hypothesis above, I would like to force src/ft/scala to appear under <configuration id="test">, employing "something" in the build.sbt file. How could I accomplish that?

If you want a directory in IntelliJ to be considered a Test Source directory, you can configure it as such by simply right-clicking it and selecting Mark Directory As > Test Sources Root.
That changes your project structure configuration to let IntelliJ know that tests reside there. The same is true of your resources if those need to be marked appropriately.
With THAT said, I suppose I'm not 100% sure whether IntelliJ's sbt integration will recognize that and run them appropriately, but I would expect it to.

After some experimentation, I apparently found something which works well enough. I cannot claim that I've found a solution, but at least it seems to work as I would expect: IntelliJ recognizes FunctionalTest (test) sources and AcceptanceTest (test) sources, the same way it recognizes Test sources and IntegrationTest sources.
Answer:
Create a configuration which extends Runtime and Test. Done this way, src/ft/scala appears under <configuration id="test"> in the XML file produced by sbt-structure. See the example below for configuration FunctionalTest:
lazy val FunctionalTest = Configuration.of("FunctionalTest", "ft") extend (Runtime, Test)
As a bonus, I show below some other settings which I find useful to associate with configuration FunctionalTest:
lazy val forkSettings: Seq[Setting[_]] =
  Seq(
    fork in ThisScope := false,
    parallelExecution in ThisScope := false
  )

lazy val ftSettings: Seq[Setting[_]] =
  inConfig(FunctionalTest)(Defaults.testSettings ++ forkSettings ++
    Seq(
      // share Test sources and resources with FunctionalTest
      unmanagedSourceDirectories in FunctionalTest ++= (sourceDirectories in Test).value,
      unmanagedResourceDirectories in FunctionalTest ++= (resourceDirectories in Test).value
    ))
I also did something similar for configuration AcceptanceTest.
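For completeness, here is a sketch of how these configurations might be wired onto the project in build.sbt (atSettings is assumed to mirror ftSettings):

lazy val AcceptanceTest = Configuration.of("AcceptanceTest", "at") extend (Runtime, Test)

lazy val root = (project in file("."))
  .configs(IntegrationTest, FunctionalTest, AcceptanceTest)
  .settings(Defaults.itSettings: _*)
  .settings(ftSettings: _*)
  .settings(atSettings: _*)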
You can find a project which exercises this kind of configuration at:
http://github.com/frgomes/template-scala-project


Scala "not found: object com" - should i really add entry in build.sbt if there are no other dependencies?

I have created a basic Scala Play application with https://www.playframework.com/getting-started play-scala-seed. This project compiles and runs with sbt run. But I have another Scala project that compiles and runs, and which I have published to my local Ivy repository with the command sbt publishLocal. This other project was saved at C:\Users\tomr\.ivy2\local\com.agiintelligence\scala-isabelle_2.13\master-SNAPSHOT as a result of this command.
Then I imported (exactly so - imported, not just opened) my Play project in IntelliJ and used Project - Open Module Settings - Project Settings - Libraries to add the com.agiintelligence jar from my ivy2 location. After these operations the IntelliJ editor recognizes com.agiintelligence classes. That is fine.
But when I try to run my Play application with sbt run, I get the error message not found: object com, precisely when compiling the import com.agiintelligence line in the Scala controller file of my Play application.
Of course, such an error has been reported and resolved before, e.g. in object play not found in scala application.
But that solution suggests appending to the build.sbt file, and my build.sbt file is pretty bare:
name := """agiintelligence"""
organization := "com.agiintelligence"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.13.5"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "5.0.0" % Test
// Adds additional packages into Twirl
//TwirlKeys.templateImports += "com.skaraintelligence.controllers._"
// Adds additional packages into conf/routes
// play.sbt.routes.RoutesKeys.routesImport += "com.skaraintelligence.binders._"
My Play application contains (as can be seen from the IntelliJ project pane) some tens of 'external libraries' (it shows my com.agiintelligence jar as well), but why should I add my own ivy2 library to the build.sbt file if no other libraries are listed there? What is different about my library? It is already on my computer, in the repository, as expected.
Of course, I can try adding it to build.sbt, issuing sbt update, and seeing what happens, but I cannot understand the logic. Can someone explain it and provide some clue toward an intelligible solution to my error message?
My Play application contains (as can be seen from the IntelliJ project pane) some tens of 'external libraries'
Those are probably just transitive dependencies of your Play dependency; that is why sbt downloaded all of them and put them on your classpath so you could use them without needing to tell it about them: the pom of Play already did.
It is not that the build tool or the IDE magically added all those dependencies because they read your mind and understood you wanted them, and the magic then stopped working for your own library for some reason.
Why it is not sufficient to list it Project-Setting--External Libraries in IntelliJ only?
That is sufficient for the IDE to work, but not for the build tool. The build tool is independent of the IDE; it doesn't know about it. sbt only knows about the dependencies you configured in your build definition file.
Even more, you should always configure your dependencies in your build tool and then import that into the IDE, rather than the opposite. IDEs are graphical tools, so their state cannot be committed, shared, or tracked for changes, and cannot be used in CI/CD environments; additionally, different teammates may want to use different IDEs.
I resolved the error message by adding this line to the build.sbt file
libraryDependencies += "de.unruh" %% "scala-isabelle" % "master-SNAPSHOT"
and by subsequently running sbt update.
The error is solved, but the main question still stands: why did I have to do this? Why are there tens of dependencies that are not listed in build.sbt, and why should I list my dependency in build.sbt? Why is it not sufficient to list it under Project Settings - Libraries in IntelliJ only?
OK, the comment by @Luis_Miguel_Mejía_Suárez gave the explanation; that comment is the actual and expected answer to my question.

How to make an sbt task run under multiple scopes

I have written an sbt plugin that generates some sources and resources. It is hard coded to work in the Compile scope.
How can I make it work in the Test scope too, so I can use the plugin when running tests and it will look in and output to the correct folder?
For example, at various points in the code I refer to resourceManaged in Compile, which relates to src/main/resources, but when test is run I would like it to be resourceManaged in Test, which relates to src/test/resources.
How do I abstract away the scope?
This is a topic discussed in Plugins Best Practices, specifically in the Configuration advices section.
Provide raw settings and configured settings
If your plugin is ObfuscatePlugin, provide baseObfuscateSettings that's not scoped in any configuration:
lazy val baseObfuscateSettings: Seq[Def.Setting[_]] = Seq(
  obfuscate := Obfuscate((sources in obfuscate).value),
  sources in obfuscate := sources.value
)
As you can see, the above accesses the sources key, but it does not specify which configuration's sources.
inConfig
override lazy val projectSettings = inConfig(Compile)(baseObfuscateSettings)
inConfig scopes the passed in sequence of settings into a particular configuration. If you want to support both Compile and Test out of the box, you can say:
override lazy val projectSettings =
  inConfig(Compile)(baseObfuscateSettings) ++
  inConfig(Test)(baseObfuscateSettings)
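A build user can apply the same pattern to further configurations from build.sbt; a sketch (assuming the plugin exposes baseObfuscateSettings) for IntegrationTest:

lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(inConfig(IntegrationTest)(baseObfuscateSettings): _*)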

Real SBT Classpath at Runtime

I have some test cases that need to look at the classpath to extract the paths of some files/directories in there. This works fine in the IDE.
The problem is that, when running SBT test, Properties.javaClassPath gives me /usr/share/sbt-launcher-packaging/bin/sbt-launch.jar.
The classpath is fine when I run show test:dependency-classpath. Is there a way to obtain that information from inside the running Scala/Java program? Or is there a way to toss it into a system property or environment variable?
By default the tests are run inside the SBT process, so the classpath will look like it did when you started sbt (I guess sbt does some trickery to dynamically load the classes for the tests, not sure). One way to do what you want is to run your tests in a forked JVM; that way sbt will start a new JVM to run the test suite, and that should have the expected classpath:
fork in Test := true
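If you also want the classpath available to the test code itself, one option is to pass it to the forked JVM as a system property. A sketch (the property name test.classpath is an arbitrary choice):

javaOptions in Test += {
  // build a platform-specific classpath string from the resolved test classpath
  val cp = (fullClasspath in Test).value.files
    .map(_.getAbsolutePath)
    .mkString(java.io.File.pathSeparator)
  s"-Dtest.classpath=$cp"
}

The tests can then read it with sys.props("test.classpath").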
I have been working on understanding how EmbeddedCassandra works in the spark-cassandra-connector project, which uses the classpath to start up and control a Cassandra instance. Here is a line from their configuration that gets the correct classpath.
(compile in IntegrationTest) <<= (compile in Test, compile in IntegrationTest) map { (_, c) => c }
The entire source can be found here: https://github.com/datastax/spark-cassandra-connector/blob/master/project/Settings.scala
Information on the <<= operator can be found here: http://www.scala-sbt.org/0.12.2/docs/Getting-Started/More-About-Settings.html#computing-a-value-based-on-other-keys-values. I'm aware that this is not the current version of sbt, but the definition still holds.
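For reference, a rough sbt 1.x equivalent of that line might be the following sketch; both compilations become dependencies, and the IntegrationTest analysis is what gets returned:

compile in IntegrationTest := {
  (compile in Test).value            // make Test compilation a dependency
  (compile in IntegrationTest).value // return the IntegrationTest analysis
}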

How to prevent SBT from recompiling modified .class files?

In our project, we have an enhance post-processing step for the .class files generated by compile. This enhance step actually modifies the generated .class files and then overwrites them.
enhance <<= enhance triggeredBy (compile in Compile)
The problem is that sbt has a mechanism called incremental recompilation. It monitors the generated .class files: every time the enhancer overwrites a generated .class file, sbt recognises the modification and recompiles the related sources on the next compile command.
To us, recompilation is very time-consuming work. We want to stop sbt from recompiling modified .class files. That may mean making sbt monitor only source changes, not output changes.
I did some searching on this but found very little. I now know that a trait called Analysis is likely responsible for the mapping from sources to output .class files. So I'm asking for your help.
PS: we could solve this problem by sending the output of enhance to another folder, but that is not preferred.
sbt strongly discourages mutations to files. You should generate different files instead. By doing so, you will solve your problem, since the incremental compiler of sbt will still be looking at the unaltered .class files. You will have some rewiring to do:
Send the outputs of the compile task somewhere else:
classDirectory in Compile := crossTarget.value / "classes-orig"
Process these .class files with your tool, and send them to crossTarget.value / "classes" (the original classDirectory):
enhance <<= enhance triggeredBy (compile in Compile)
enhance := {
  val fromDir = (classDirectory in Compile).value
  val toDir = crossTarget.value / "classes"
  ...
}
Rewire productDirectories to use crossTarget.value / "classes" anyway (otherwise it'll go looking in your modified classDirectory):
productDirectories in Compile := Seq(crossTarget.value / "classes")
Make sure that products depends on your enhance task:
products in Compile <<= (products in Compile) dependsOn enhance
You may need some more rewiring if you have resources (see copyResources). But basically you should be able to get there.
As I said, sbt monitors the output .class files: when a .class file is modified, it recompiles that file's source.
After some research, we found that sbt notices a file's modification by its last-modified time. That is to say, we can fool sbt by rolling back the last-modified time after the modification, so that sbt won't notice any changes.
So, our solution is simple but effective:
find all .class files
note down their last modified time
do the enhance
put back the former last modified time
This is a small trick. We still expect more robust solutions.
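In build.sbt terms, the trick might look roughly like this (a sketch; runEnhancer is a placeholder for the actual bytecode enhancement):

enhance := {
  val classDir = (classDirectory in Compile).value
  val classFiles = (classDir ** "*.class").get           // 1. find all .class files
  val stamps = classFiles.map(f => f -> f.lastModified)  // 2. note their last-modified times
  runEnhancer(classFiles)                                // 3. do the enhance (placeholder)
  stamps.foreach { case (f, t) => f.setLastModified(t) } // 4. put back the former times
}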
Description:
Much like Chenyu, I had to write a plugin for SBT 1.x that would enhance compiled classes, and I later wanted to make sure that those enhanced classes were used for building the jar.
I did not want to hack this solution, so Chenyu's answer was not acceptable to me, and sjrd's answer was very helpful but tailored to SBT 0.13.
So here is my working solution, with a few comments:
Code:
object autoImport {
  val enhancedDest = settingKey[File]("Output dir for enhanced sources")
}

def enhanceTask: Def.Initialize[Task[Unit]] = Def.task {
  val inputDir = (classDirectory in Compile).value
  val outputDir = enhancedDest.value
  enhance(inputDir, outputDir)
  ...
}

override def projectSettings: Seq[Def.Setting[_]] = Seq(
  enhancedDest := crossTarget.value / "classes-enhanced",
  products in Compile := Seq(enhancedDest.value), // mark enhanced folder to use for packaging
  // https://www.scala-sbt.org/1.0/docs/Howto-Dynamic-Task.html#build.sbt+v2
  compile in Compile := Def.taskDyn {
    val c = (compile in Compile).value // compile first
    Def.task {
      (copyResources in Compile).value // copy resources before enhance
      enhanceTask.value                // enhance
      c
    }
  }.value
)

What is the issue with this sbt file?

When I import an SBT project into IntelliJ, the build.sbt file shows a lot of errors, as in the following screenshot. Wondering what the issue might be.
IDEA Version 13.1.4
I also see the following
The following source roots are outside of the corresponding base directories:
C:\Users\p0c\Downloads\recfun\src\main\resources
C:\Users\p0c\Downloads\recfun\src\test\java
C:\Users\p0c\Downloads\recfun\src\test\scala
C:\Users\p0c\Downloads\recfun\src\test\resources
These source roots cannot be included in the IDEA project model. Please consider using shared SBT projects instead of shared source roots.
I think the question perhaps does not provide all the required information to answer conclusively, but I'll give it a spin anyway -
Since sbt runs correctly when invoked from the shell, we know the sbt file is fine. I use Idea for my Scala and sbt projects and I am sure the Idea sbt support works very well, but! Idea is far more restrictive than sbt when it comes to project structure. It is really easy to create a valid sbt project structure that Idea can't support very well.
Given that the source roots error indicates that the recfun/src folder is not in the project folder, it seems clear that this error is not emitted during the processing of recfun/build.sbt. The screenshot shows you have at least three different sbt files: progfun-recfun, submission and scala-recfun. Since Idea will also create projects like submission-build, and you have an assignment-build project there, something is probably broken in the project structure - not from the sbt viewpoint, where you're fine and can build, but from the Idea viewpoint, which is more restrictive.
My suggestion to resolve this would be to change your project structure as follows. First, have a top level project with a build.sbt. Then create a sub-project for each src folder you want. Do not put a src folder in your top level project. Each sub-project needs a build.sbt as well.
Second, make sure the sub-projects build correctly with sbt when run from the shell. Arrange the sub-project build.sbt files with the proper dependencies, using this syntax:
lazy val a = ProjectRef(file("../a"), "a")
lazy val b = ProjectRef(file("../b"), "b")
lazy val root = Project(id = "c", base = file(".")) dependsOn (a, b)
(This example has three sister projects a, b and c, where c depends on a and b. The three projects are placed in directories with the same name, within the root directory. The code snippet is from the build file for c.)
Third, arrange your top level build.sbt to aggregate the sub-projects, using this syntax in the top level build.sbt:
lazy val a = ProjectRef(file("a"), "a")
lazy val b = ProjectRef(file("b"), "b")
lazy val c = ProjectRef(file("c"), "c")
lazy val root = (project in file(".")).
  aggregate(a, b, c)
Building this top level project will build each of the sub-projects a, b and c, and the dependencies established in the sub-project build files will ensure they are built in the right order.
Fourth, import the top level project into Idea, and all should be well.
You can get fancy with the file structure if you want, because the project references use relative paths, but it's usually nice to at least start simple.
I had a lot of frustration with sbt and Idea at the start, I hope this helps :)