I would like to know what's the easiest way to plug jacoco4sbt's data into Sonar.
In Jenkins I have installed the Sonar and JaCoCo plugins. I have also installed the JaCoCo plugin in Sonar.
My sonar-project.properties file contains:
sonar.jacoco.reportPath=target/jacoco/jacoco.exec
And the Jenkins job executes these commands:
sbt jacoco:cover
/opt/sonar-runner/bin/sonar-runner
SBT_OPTS="-Dsbt.log.noformat=true"
sbt clean update compile test doc
So far, no code coverage data shows up in Sonar.
Do you want to report code coverage on Scala code using the Scala plugin for Sonar (http://docs.codehaus.org/display/SONAR/Scala+Plugin)?
Unfortunately it does not yet provide a sensor for code coverage.
It's on the roadmap for future versions.
At least jacoco4sbt successfully generates the jacoco.exec file; it is just not picked up by the Scala plugin.
You'll need the following properties:
sonar.dynamicAnalysis=reuseReports
sonar.java.coveragePlugin=jacoco
sonar.jacoco.reportPath=${build.dir}/jacoco.exec
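Putting it together, a minimal sonar-project.properties sketch (the project key, name, and source path are placeholders; point the report path at wherever jacoco4sbt actually writes jacoco.exec):
# Placeholder project identity -- replace with your own values
sonar.projectKey=org.example:myproject
sonar.projectName=My Project
sonar.projectVersion=1.0
sonar.sources=src/main/scala
# Reuse the report generated by jacoco4sbt instead of re-instrumenting
sonar.dynamicAnalysis=reuseReports
sonar.java.coveragePlugin=jacoco
sonar.jacoco.reportPath=target/jacoco/jacoco.exec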
I don't use sbt, but the following is an Ant example:
Add ant plugins dynamically at buildtime?
Check the properties file at the end for all the Sonar-related stuff.
I have an sbt project that I have imported into IntelliJ. Sometimes I build the project at the command line using sbt, and when I need to debug I build it from within IntelliJ. However, each time I alternate, a full rebuild is triggered even though there should be no need for one. Both build procedures output to the same class folder, namely .../target/scala-2.11/classes, so I don't understand why a full rebuild keeps happening.
As stated by CrazyCoder, IntelliJ and the sbt build each keep their own tracking of changed files for incremental compilation. Thus each time one recompiles a file, the other treats it as a changed file and recompiles it too.
While CrazyCoder's answer describes how to make them work on separate directories by changing the sbt compiled-classes dir, this answer explains how you can configure IntelliJ to use sbt for all builds, so that only sbt does the compilation. This is a relatively new feature.
Just check the option: File > Settings > Build, Execution, Deployment > Build Tools > SBT > Use SBT shell for build and import
It works at least since IntelliJ version 2017.2.3, and most probably it is an option provided by the SBT plugin.
For details about this feature, see the JetBrains ticket: https://youtrack.jetbrains.com/issue/SCL-10984
IntelliJ IDEA cannot reuse the classes produced by other build systems because it has its own incremental compiler, which tracks dependencies and builds caches during compilation so that it can compile only modified and dependent files when you make a change in the code. When you build with SBT/Maven/Gradle or command-line javac, the IntelliJ IDEA compiler cache doesn't know what has changed or which files it should compile, so it performs a full rebuild.
A solution is to use different output directories for the IDE and SBT; this way IntelliJ IDEA will rebuild only files modified since the last build in the IDE, and your command-line SBT build will not trigger a rebuild in the IDE.
This configuration is performed using the sbt-ide-settings plugin. Add the following to plugins.sbt (or whatever file you configure your plugins in):
resolvers += Resolver.url("jetbrains-bintray", url("http://dl.bintray.com/jetbrains/sbt-plugins/"))(Resolver.ivyStylePatterns)
addSbtPlugin("org.jetbrains" % "sbt-ide-settings" % "0.1.2")
And here is how to customize the IDE output directory in build.sbt:
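// Send IDE-compiled classes to their own directories so the sbt and IDE builds don't invalidate each other's output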
ideOutputDirectory in Compile := Some(new File("target/idea/classes"))
ideOutputDirectory in Test := Some(new File("target/idea/test-classes"))
Feel free to change the paths according to your needs.
I have been successfully using sbt on Windows with a custom build.sbt script, together with an import Chisel._ in the top-level file, to generate Verilog from my Chisel source.
I'm trying to get an IDE working on Windows to expedite Chisel development. I've gone with the Eclipse-based Scala IDE: http://scala-ide.org/download/sdk.html/
I want to compile the Chisel library so that the import Chisel._ can be resolved locally, without having to go off and download the source from the repository each time and recompile it. When I download the Chisel-master repo from Git and include the src\main folder in my Scala project in the Scala IDE, I get lots of syntax errors in the Chisel Scala files that prevent me from building the project.
Has anyone done anything like this before on Windows, or does anyone have experience working with the Scala IDE? It may just be a case of undefined symbols in the project configuration.
I'm not sure exactly what you did in build.sbt with respect to recompiling (I think it downloads the source only the first time, then caches it for future builds). But I'm using the Scala IDE for Chisel on Linux with the default build.sbt files; maybe you can try to get that working out of the box first, to help narrow down the issue.
Here are the steps I took to get the Scala IDE working with Chisel:
The latest Scala IDE uses Scala 2.11.8, while the current Chisel repository defaults to 2.11.7, so I had to change every scalaVersion reference in the build.sbt files from 2.11.7 to 2.11.8.
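For example, in each affected build.sbt this is the only line that changes (a minimal sketch):
scalaVersion := "2.11.8"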
I used sbteclipse (https://github.com/typesafehub/sbteclipse) to create an importable workspace and set up the compilation dependencies.
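If it helps, sbteclipse is wired in with a single line in project/plugins.sbt (a sketch; check the sbteclipse repo for the current version):
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4") // example version
Then run sbt eclipse from the project root to generate the Eclipse project files.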
Except for chiselFrontend: for some reason this package is not added to the dependencies, so I had to add chiselFrontend as a Java build path dependency manually (Properties > Java Build Path, under Projects) for my own projects.
To resolve undefined symbols, you can also add a JAR onto the project build path using Project Properties > Java Build Path > Libraries > Add External JARs...
If you are getting your JARs through Maven/sbt, they should be cached in:
C:\Users\<name>\.ivy2\cache\edu.berkeley.cs\chisel3_2.11\jars
If you are using publishLocal with chisel3, your JARs should be in:
C:\Users\<name>\.ivy2\local\edu.berkeley.cs\chisel3_2.11\jars
Note that chisel3 is compiled into one JAR, including the coreMacros and chiselFrontend sub-projects.
Of course, this is a quick-and-dirty solution compared to something that can parse the SBT files.
I have a Play project in which I am trying to publish the coverage results with the scoverage plugin and sbt. Everything works locally, but when I try to run the same commands with Jenkins it shows the following error.
My Jenkins configuration for sbt is like this:
And here is the configuration of build.sbt and plugins.sbt:
Are you using the instructions from the GitHub repo?
sbt clean coverage test
and
sbt coverageReport
should do the work. It seems you are missing the camel case on coverageReport.
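For reference, the plugin itself is enabled by a single line in project/plugins.sbt (a sketch; the version shown is only an example):
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.5.1") // example version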
You can also have a look at the example they provide.
Also, is it an open-source project we can have a look at? If not, please paste the build.sbt and project/plugins.sbt files.
We are creating a customized version of Spark, since we are changing some lines of code in ALS.scala. We build the customized Spark version using this command:
./make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn
However, upon using the customized version of Spark, we run into this error:
Do you have any idea what causes the error and how we might solve the issue?
I am actually using a JAR file on the local machine, built with sbt (sbt compile, then sbt clean package) and placed here: /Users/user/local/kernel/kernel-0.1.5-SNAPSHOT/lib.
However, in the Hadoop environment the installation is different, so I use Maven to build Spark, and that's where the error comes in. I am thinking that this error might be related to using Maven to build Spark, as there are some reports like this:
https://issues.apache.org/jira/browse/SPARK-2075
or probably to building the Spark assembly files.
I'm writing a Gradle plugin to generate Java code from WSDL. The problem is, my task does not find the Java class I'm trying to execute and blows up at runtime with a ClassNotFoundException, even though the necessary JAR is listed as a compile dependency. I'm using project.sourceSets.main.runtimeClasspath, but have tried compileClasspath, adding a buildscript section to the build file, and using configurations.runtime, all to no avail. Note that my project has no Java source code, just Groovy.
Any ideas? The task, a unit test and the build file can be found here:
https://gist.github.com/abhijitsarkar/8432347
Cf. the cross-post on the Gradle forum.
It turns out that because my plugin uses project.sourceSets.main.runtimeClasspath, the client needs to declare the dependencies with runtime scope; it is not enough to have the dependencies declared in the plugin project only.