JaCoCo is analysing a class twice and failing - Scala

I currently have this configuration for JaCoCo in my pom.xml:
<plugin>
  <!-- Maven JaCoCo Plugin configuration for Code Coverage Report -->
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <inherited>true</inherited>
  <configuration>
    <excludes>
      <exclude>**/*Test.class</exclude>
    </excludes>
  </configuration>
</plugin>
When I run mvn clean verify site, the build fails with this warning (and, importantly, I have only one class, SayHello.scala):
Analyzed bundle 'dummy-project' with 2 classes
[WARNING] Rule violated for bundle dummy-project: classes missed count is 1, but expected maximum is 0
Failed to execute goal org.jacoco:jacoco-maven-plugin:0.8.5:check (default-check) on project dummy-project: Coverage checks have not been met. See log for details.
And finally, when I check the report, it lists the same class twice (the only difference is the extra "." at the end of the second entry), and coverage fails for one of them:
Update
SayHello.scala
package com.dummy
object SayHello {
  def sayIt: String = "hello, world"
}
SayHelloTest.scala
package com.dummy
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.matchers.should.Matchers
class SayHelloTest extends AnyFunSuite with Matchers {
  test("SayHello says hello") {
    SayHello.sayIt shouldBe "hello, world"
  }
}
Anyone had a similar issue? Thank you.

JaCoCo analyzes .class files, not source files. The Scala compiler may produce multiple .class files from a single source file. Your SayHello.scala most likely defines an object (or a companion object). An object is always compiled to a class of the same name with $ at the end, which implements the singleton at the bytecode level. If you go to your target/classes directory, you'll most likely see those two files there - SayHello.class and SayHello$.class.
The two records in the JaCoCo report correspond to those two class files. The dot at the end instead of a $ is most likely a JaCoCo report rendering issue.
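A minimal sketch (reusing the question's SayHello object, with a hypothetical CompanionDemo entry point) showing the two-class split at runtime:

```scala
// An `object` compiles to a separate JVM class whose name ends in `$`,
// holding the singleton in a static MODULE$ field; a same-named class
// with static forwarders is emitted alongside it.
object SayHello {
  def sayIt: String = "hello, world"
}

object CompanionDemo {
  def main(args: Array[String]): Unit = {
    // The singleton's own class is SayHello$ at the bytecode level,
    // which is why JaCoCo reports two classes for one source file.
    println(SayHello.getClass.getName) // SayHello$
    // The singleton instance is held in a public static MODULE$ field.
    val cls = Class.forName("SayHello$")
    println(cls.getField("MODULE$").get(null) eq SayHello) // true
  }
}
```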
To exclude the companion object class from analysis, just add it to your exclusion list:
<configuration>
  <excludes>
    <exclude>**/*Test.class</exclude>
    <exclude>**/*$.class</exclude>
  </excludes>
</configuration>
However, coverage of the methods in the companion object is attributed to SayHello$.class, not SayHello.class, so by excluding the $ class you essentially lose that coverage.
I'm not aware of any workarounds for the Maven+JaCoCo setup, apparently the jacoco-maven-plugin is not really Scala-aware. However, the sbt plugin appears to have worked around these issues, see e.g. https://blog.developer.atlassian.com/using-jacoco-a-code-coverage-tool-for-scala/
If switching to sbt is not an option, you could take a look at some Scala-specific code coverage tools, like scoverage, which also has a maven plugin.

Related

Duplicated and superposed class hierarchy seamlessly builds

So I've got this Scala + multi-module Maven project hierarchy:
- pom.xml
- nested1
  | - pom.xml
  | - src
    | - main
      | - scala
        | - ...
        | - MyClass.scala
        | - ...
- nested2
  | - pom.xml
  | - src
    | - main
      | - scala
        | - ...
        | - MyClass.scala
        | - ...
- app
  | - pom.xml
  | - Main.scala
Basically, the projects nested1 and nested2 have exactly the same class hierarchy: they declare the same classes, the same traits, all of them have the same content, etc.
app/pom.xml has these dependencies:
<dependency>
  <groupId>${project-package}</groupId>
  <artifactId>nested1</artifactId>
  <version>1.0</version>
</dependency>
<dependency>
  <groupId>${project-package}</groupId>
  <artifactId>nested2</artifactId>
  <version>1.0</version>
</dependency>
The Main class actually imports MyClass at line 1, but (I would guess) it has no way to tell which version to take: both nested1/src/main/scala/${project-package}/MyClass.scala and nested2/src/main/scala/${project-package}/MyClass.scala have the same ${project-package}.
I actually tried this scenario, and maven seems to choose at random either of the two classes without even issuing a single warning or error.
What's happening behind the scenes? Why am I not getting an error like "ambiguous import statement: MyClass at line 1 in Main"?
You don't see the errors you describe because this situation is explicitly not supported by the JVM. It is not even related to Scala directly: at the JVM level there is no such thing as a "library"; there is only the flat classpath (until JPMS, in any case), where class names are supposed to be unique.
In general, such a situation should never happen - you just can't have different classes with the same fully-qualified name (package + class name, basically) within one class loader (or, more often, within a single branch of the tree of class loaders). If you do, what happens is undefined. This is similar to the concept of "undefined behavior" in C/C++: the runtime simply assumes there can only be one class for a given name and is free to behave based on that assumption; depending on the classloader configuration, you can get a random result, some fixed result, an error during class loading, or any combination of these. Things become even funnier when you have dependencies which in turn depend on different versions of the same class, resulting in a whole host of potential runtime exceptions.
This is part of a very well-known problem of the JVM world, so-called classpath/JAR hell. Basically, if your project is complex enough to have transitive or direct dependencies on different versions of the same library or, more specifically, the same set of class names, you will suffer. The amount of suffering depends on the complexity of your situation: in some cases, ensuring that you only have one version of a library on the classpath is enough (which might require some tweaks to the build configuration); in other cases, you'll have to perform shading (which is exactly a way to solve the problem you encountered) for a certain subset of your dependencies. In even more difficult cases, shading won't work, and you'll have to rethink your architecture. Depending on your environment, you might need tools like OSGi or indeed the new Java Platform Module System to solve classpath issues.
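As an illustration, shading with the maven-shade-plugin looks roughly like the sketch below; the coordinates and relocation pattern are hypothetical, and you would relocate whichever package actually clashes:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Hypothetical: move one copy of the clashing classes
               to a unique package so both can coexist on the classpath -->
          <relocation>
            <pattern>com.example.shared</pattern>
            <shadedPattern>com.example.nested1.shaded</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```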
Note that with JPMS, this particular situation becomes a bit better: Java modules by design cannot have the same package defined in multiple modules loaded by the same JVM, so if you compile your projects as proper modules and try to use them in a third project, you'll get an exception at startup about conflicting modules. That being said, I don't have much experience with JPMS, so I can't say how exactly it will work, especially with Scala in the mix.

GWT module xml source element to specify single class

I have a GWT application (FooGwtApp) and a library module (FooLib) used as a dependency in FooGwtApp. The package structure of FooLib looks like this:
packageFoo.ImportantClass
packageFoo.UnimportantClass
packageBar.OtherClass
I want ImportantClass (and only ImportantClass) to be compiled to JS by the GWT compiler. Moving ImportantClass to another package is not an option.
I created ImportantClass.gwt.xml within packageFoo with the following content:
<module>
  <inherits name="com.google.gwt.user.User"/>
  <source path="" includes="**/ImportantClass*"/>
</module>
Next I put an inherited reference to the ImportantClass module definition in FooGwtApp.gwt.xml (this seems to work: the IDE recognizes it, and is able to parse the reference to ImportantClass.gwt.xml).
Now If I put references to ImportantClass into FooGwtApp's client code, the GWT compiler fails, because it does not find ImportantClass on the source path:
No source code is available for type packageFoo.ImportantClass; did you forget to inherit a required module?
I likely messed up something in the source path / includes attribute in ImportantClass.gwt.xml - either defining the current package as the root package with path="" is not valid notation, or something's wrong with the includes attribute. Or both. Or neither.
Can you give me a clue about where it all went wrong?
It turns out the problem was not in ImportantClass.gwt.xml, but in other Maven related stuff:
ImportantClass.gwt.xml should be placed under src/main/resources/packageFoo, not src/main/java/packageFoo, otherwise it won't be packaged into the binary jar.
The GWT compiler compiles from Java source to JavaScript source. This means we don't just need ImportantClass.class in FooLib.jar, but also its source. The best solution is to use the maven-source-plugin in FooLib's pom.xml and to import the FooLib dependency into FooGwtApp with the sources classifier.
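The two pieces can be sketched as below; the groupId, artifactId, and version are hypothetical placeholders for your actual coordinates:

```xml
<!-- In FooLib/pom.xml: attach a sources jar to the build -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-source-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-sources</id>
      <goals>
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>

<!-- In FooGwtApp/pom.xml: also depend on FooLib's sources jar -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>FooLib</artifactId>
  <version>1.0</version>
  <classifier>sources</classifier>
</dependency>
```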
On the latter topic, see the following SO answers:
Maven: Distribute source code with with jar-with-dependencies
How to properly include Java sources in Maven?
After fixing the above problems, the source path declaration present in the question works.

antlr4 Jar has duplicate classes in different packages - don't know which are referred to by internal code

I am using antlr4.3 (complete) jar.
It has many duplicates in org.antlr.runtime and org.antlr.v4.runtime packages.
In code, when I explicitly use 'v4.runtime', the classpath picks up 'runtime' at runtime.
So I extracted the jar and recreated it without org.antlr.runtime.
But apparently some classes, like RecognitionException, are now not found.
How should I resolve this other than:
Exploding the latest Jar and specifying org.antlr.v4.runtime BEFORE org.antlr.runtime so that a duplicate class will be picked up from v4.runtime, and if there isn't one in it, it will look at org.antlr.runtime...??
To add to the above, here's the code snippet which gives a problem: the jars are in the classpath.
import java.io.FileInputStream;
import java.io.InputStream;

import org.antlr.v4.runtime.ANTLRInputStream;
import org.antlr.v4.runtime.CharStream;

public class AntlrMain {
    public static void main(String[] args) {
        System.out.println("Start Hello World");
        try {
            InputStream is = new FileInputStream(
                    "/home/ecworkspace/antlrCentral/DSL/mydev.dsl");
            org.antlr.v4.runtime.ANTLRInputStream input = new ANTLRInputStream(is);
            org.antlr.v4.runtime.CharStream cs = (org.antlr.v4.runtime.CharStream) input;
            VCFGLexer lexer = new VCFGLexer(cs);
Initially, in the AntlrMain class, I wasn't using the explicit org.antlr.v4.runtime prefix, but that failed at runtime with 'CharStream not found'.
Then I changed the code to include the full path of the class.
Then I changed the ANTLR4 jar to 'exclude' org.antlr.runtime (it has org.antlr.v4.runtime). That's when the 'RecognitionException not found' error occurred.
The grammar by the way, compiles OK, generating all my VCFG*.java and tokens classes, where VCFG is the grammar name.
UPDATE 1
In line with everyone's suggestions, I removed the answer I had posted to my own question and am adding that information to this original question.
In antlr-4.2-complete.jar, I see:
/tmp/x/ $ jar -xf antlr-4.2-complete.jar
/tmp/x/ $ ls org/antlr
runtime stringtemplate v4
/tmp/x/ $ ls org/antlr/v4
analysis codegen parse semantics Tool$1UndefChecker.class Tool$OptionArgType.class
automata misc runtime tool Tool.class Tool$Option.class
/tmp/x/ $ ## The 2 runtimes above: org.antlr.runtime and org.antlr.v4.runtime
/tmp/x/ $ ## which to use where, along with same-name classes in
/tmp/x/ $ ## org.antlr and org.antlr.v4
So, in build.xml, I use the above jar to:
1. java -jar antlr-4.2-complete.jar grammar.g4 => compiles and gives me VCFG*.java and VCFG*.tokens
2. javac -cp "antlr-4.2-complete.jar" VCFG*.java => succeeds. I have the VCFG*.class collection.
3. Then I compile my own code, AntlrMain.java (which uses ANTLRInputStream etc.), again with the above antlr jar and some 3rd-party jars (logging, commons) => successfully.
4. Finally, the RUN of java -cp "antlr-4.2-complete.jar:log4j.jar" -jar myJar => FAILS on 'CharStream' not found.
UPDATE 2
Adding, based on your response.
I have only recently started posting questions on Stackoverflow. So pointers about whether to respond to my question to provide more info, or to comment to a reply etc. are welcome.
-cp <3rd-party> is -cp "log4j.jar:commonsLang.jar".
By -cp "above-jar" I meant -cp "antlr-4.2-complete.jar".
And if I have not mentioned it, it is an oversight - I have included antlr-4.2-complete.jar for every 'java' and 'javac' command.
BUT I see you indicating antlr-runtime-4.2.jar. So there ARE separate antlr-runtime and antlr-complete jars.
In the 4 steps below (I am leaving out -cp for convenience, but am including antlr-4.2-complete.jar for every step):
I believe I should be using the antlr-runtime and antlr-complete jars at different steps:
1. java MyGrammar.java
2. javac MyGrammar*.java
3. javac MyOwnCode.java
4. java MyCode (run my code)
Which of the two antlr jars (runtime and complete, and their versions) should I use at each of the above 4 steps?
The jar file does not contain duplicate classes. The code generation portion of the ANTLR 4.3 Tool relies on the ANTLR 3.5.2 Runtime library, which is included in the "complete" jar file. While some of the classes have the same name as classes in ANTLR 4, they are not duplicates and cannot be used interchangeably.
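The point that same-named classes in different packages are unrelated types can be illustrated in a few lines of Scala (the v3/v4 package names below are hypothetical stand-ins for the two ANTLR runtimes):

```scala
// Two classes with the same simple name in different packages are
// entirely unrelated JVM types and cannot be used interchangeably.
package v3 { class RecognitionException extends RuntimeException }
package v4 { class RecognitionException extends RuntimeException }

object SameNameDemo {
  def main(args: Array[String]): Unit = {
    val a = new v3.RecognitionException
    val b = new v4.RecognitionException
    println(a.getClass.getName) // v3.RecognitionException
    println(b.getClass.getName) // v4.RecognitionException
    // Neither is assignable to the other:
    println(a.getClass.isAssignableFrom(b.getClass)) // false
  }
}
```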
#280Z28 / Sam:
I am mortified, but have to admit the simplest answer is most often the correct one.
I spent time exploding the jar, making multiple jar files out of it, including one for compile, one for run, and so on.
The answer is succinctly explained in the ANT build.xml snippet below, where I produce the 'final' production jar, which is the only jar then included while executing my Main program:
<jar destfile="${p_productionJar}">
  <fileset dir="${p_buildDir}" casesensitive="yes">
    <include name="**/*.class"/>
  </fileset>
  <zipfileset includes="**/*.class" src="${p_genCodeJar}"/>
  <!-- How did I miss including p_antlrJar earlier?? -->
  <zipfileset includes="**/*.class" src="${p_antlrJar}"/>
  <zipfileset includes="**/*.class" src="${p_jschJar}"/>
  <zipfileset includes="**/*.class" src="${p_log4jJar}"/>
  <zipfileset includes="**/*.class" src="${p_commonslangJar}"/>
  <manifest>
    <attribute name="Main-Class" value="AntlrMain"/>
    .....
The production jar was missing ${p_antlrJar} => which is antlr-4.3-complete.jar!
You did mention this in your answer... but it was a pretty silly mistake to make, and I didn't think I had made it...
Thank you.

Intellij Idea 10.5 and Maven+GWT - Cannot resolve directory

I created maven project with command line:
mvn archetype:generate
-DarchetypeRepository=repo1.maven.org
-DarchetypeGroupId=org.codehaus.mojo
-DarchetypeArtifactId=gwt-maven-plugin
-DarchetypeVersion=2.3.0-1
Then I opened it in Intellij Idea 10.5, and received some errors from Maven Model Inspector:
Cannot resolve directory ''${webappDirectory}'' (at line 59)
Cannot resolve directory 'WEB-INF' (at line 59)
Cannot resolve file 'classes' (at line 59)
In spite of this, the project compiles and runs normally. What is this?
Part of my pom.xml:
<build>
  <!-- Generate compiled stuff in the folder used for developing mode -->
  <outputDirectory>${webappDirectory}/WEB-INF/classes</outputDirectory>
  . . .
</build>
At the beginning of your pom.xml file, in the properties section, there should be this tag:
<webappDirectory>${project.build.directory}/${project.build.finalName}</webappDirectory>
That's where the ${webappDirectory} "variable" is defined.
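In context, that declaration sits inside a properties block like this (a sketch; the rest of the pom is assumed):

```xml
<properties>
  <webappDirectory>${project.build.directory}/${project.build.finalName}</webappDirectory>
</properties>
```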
When you place the cursor over the text inside the brackets and press Ctrl+Q, you will see a small explanation of the variable (IntelliJ IDEA specific):
${project.build.directory} = Model Property project.build.directory: ${project.basedir}/target
${project.build.finalName} = Model Property project.build.finalName: ${project.artifactId}-${project.version}
<outputDirectory>...</outputDirectory> configures where the compiled classes are put (As the comment above your snippet already says).
Have a look into your projectDir/target directory.
But I can't tell you what this means for development mode.
Btw, I've put <!--suppress MavenModelInspection --> above <outputDirectory> to get rid of the red mark in the editor.
Whoops, that seems to be a pretty old question :)
Guess you already know!
Regards

FSC recompiles every time

FSC recompiles my .scala files every time, even when there is no need - I can compile twice without editing anything between attempts, and it still recompiles them!
For example, I have 2 files
Hello.scala
class Hello {
  print("hello")
}
And Tokens.scala:
abstract class Token(val str: String, val start: Int, val end: Int) {
  override def toString = getClass.getSimpleName + "(" + "[" + start + "-" + end + "]" + str + ")"
}

class InputToken(str: String, start: Int, end: Int)
  extends Token(str, start, end)

class ParsedToken(str: String, start: Int, end: Int, val invisible: Boolean)
  extends Token(str, start, end)
When I ask ant to compile project from scratch I see following output:
ant compile
init:
[mkdir] Created dir: D:\projects\Test\build\classes
[mkdir] Created dir: D:\projects\Test\build\test\classes
compile:
[fsc] Base directory is `D:\projects\Test`
[fsc] Compiling source files: somepackage\Hello.scala, somepackage\Tokens.scala to D:\projects\Test\build\classes
BUILD SUCCESSFUL
Then I don't edit anything and ask ant to compile again:
ant compile
init:
[mkdir] Created dir: D:\projects\Test\build\classes
[mkdir] Created dir: D:\projects\Test\build\test\classes
compile:
[fsc] Base directory is `D:\projects\Test`
[fsc] Compiling source files: somepackage\Tokens.scala to D:\projects\Test\build\classes
BUILD SUCCESSFUL
As you can see, fsc acts smart in the case of Hello.scala (no recompilation) and dumb in the case of Tokens.scala. I suspect the problem is somehow related to inheritance, but that is all.
So what is wrong?
Tokens.scala is recompiled because there isn't a class file matching its basename; that is, it doesn't produce a Tokens.class file. When deciding whether a source file should be compiled, fsc looks for a class file with the same basename; if that class file does not exist, or the modification time of the source file is later than that of the class file, the source file is rebuilt. If you can, I suggest you look into Simple Build Tool (sbt): its continuous compile mode accurately tracks the source-to-classfile mapping and won't recompile Tokens.scala.
For extra laughs, think about what the compiler might do if you have a different source file that has class Tokens in it.
Although scala allows arbitrary public classes/objects in any source file, there's still quite a bit of tooling that assumes you will somewhat follow the java convention and at least have one class/object in the file with the same name as the source file basename.
I don't much like posting stuff written by others, but I think this question merits a more complete answer than what was strictly asked.
So, first of all, fsc recompiles everything by default, period. It is ant, not fsc, which is leaving Hello.scala out, because the file name matches the class name. It is not leaving Tokens.scala out because there is no compiled class called Tokens -- so, in the absence of a Tokens.class, it recompiles Tokens.scala.
That is the wrong thing to do with Scala. Scala differs from Java in one fundamental aspect: because of technical limitations of the JVM, a change in a trait requires recompilation of every class, object or instantiation that uses it.
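A small sketch (hypothetical Greeter/EnglishGreeter names) of why this is so: a trait's concrete method becomes part of the bytecode contract of every class that mixes it in (historically via forwarder methods), so EnglishGreeter's compiled form depends on Greeter's body, not just its signature:

```scala
// Changing the body of greet in Greeter invalidates the compiled form
// of every class mixing the trait in, hence the need for transitive
// recompilation that plain timestamp checks can't detect.
trait Greeter {
  def greet(name: String): String = "hello, " + name
}

class EnglishGreeter extends Greeter // picks up greet from the trait

object TraitDemo {
  def main(args: Array[String]): Unit = {
    println(new EnglishGreeter().greet("world")) // hello, world
  }
}
```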
Now, one can fix the ant task to do a smarter thing starting with Scala 2.8. I'm taking this information from blogtrader.net by Caoyuan, of Scala plugin for Netbeans fame. You define the Scala task on the build target like below:
<scalac srcdir="${src.dir}"
destdir="${build.classes.dir}"
classpathref="build.classpath"
force="yes"
addparams="-make:transitive -dependencyfile ${build.dir}/.scala_dependencies"
>
<src path="${basedir}/src1"/>
<!--include name="compile/**/*.scala"/-->
<!--exclude name="forget/**/*.scala"/-->
</scalac>
It tells ant to recompile everything, as ant simply isn't smart enough to figure out what needs to be recompiled. It also tells Scala to build a file containing compilation dependencies, and to use a transitive dependency algorithm to figure out what actually needs recompiling.
You also need to change the init target to include the build directory in the build classpath, as Scala will need that to recompile other classes. It should look like this:
<path id="build.classpath">
<pathelement location="${scala-library.jar}"/>
<pathelement location="${scala-compiler.jar}"/>
<pathelement location="${build.classes.dir}"/>
</path>
For more details, please refer to Caoyuan's blog.