Exclude Play routes file generated code from WartRemover - Scala

I am using WartRemover in a Play project. I want to exclude the code generated from the routes file from WartRemover scanning. I added the following, but it still scans the generated routes code:
wartremoverExcluded ++= Seq(
  "com.xxx.controllers.ReverseMyController",
  "com.xxx.controllers.javascript.ReverseMyController",
  "com.xxx.controllers.ref.ReverseMyController"
)
It still shows wart errors from the code generated for the Play routes file, e.g.
[warn] /xxx/conf/routes:23: Inferred type containing Nothing
[warn] PUT /service/myendpoint com.xxx.controllers.MyController.postMyData
and the same for many more routes defined in the routes file.
How do I exclude the routes from the WartRemover scan?

Have you tried putting -Xprint:typer in scalacOptions to see which package is the issue? It seems to work for me when I ignore the following:
wartremoverExcluded ++= Seq("Routes", "controllers.ref")

It looks like this question was asked in the context of wartremover 0.11, but if anyone finds themselves here looking for a solution for 0.12, this works for me:
wartremoverExcluded += sourceManaged.value / "main" / "routes_reverseRouting.scala"
wartremoverExcluded += sourceManaged.value / "main" / "routes_routing.scala"

Related

Set file-level option to scalapb project

I'm using ScalaPB (version 0.11.1) and the sbt-protoc plugin (version 1.0.3) to try to compile an old project with Protocol Buffers in Scala 2.12. Reading the documentation, I want to set the file-level property preserve_unknown_fields to false. But my question is: where do I need to set this flag? In the .proto file?
I've also tried to include the flag as a package-scoped option by creating a package.proto file next to my other .proto file, with the following content (as it is specified here):
import "scalapb/scalapb.proto";
package eur.astrata.eu.bigdata.tpms.protobuf;
option (scalapb.options) = {
  preserve_unknown_fields: false
};
But when trying to compile, I get the following error:
[libprotobuf WARNING T:\src\github\protobuf\src\google\protobuf\compiler\parser.cc:648] No syntax specified for the proto file: package.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
scalapb/scalapb.proto: File not found.
package.proto:1:1: Import "scalapb/scalapb.proto" was not found or had errors.
I've also tried with syntax = "proto3"; at the beginning but it doesn't work.
Any help would be greatly appreciated.
From the docs:
If you are using sbt-protoc and importing protos like
scalapb/scalapb.proto, or common protocol buffers like
google/protobuf/wrappers.proto:
Add the following to your build.sbt:
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
This tells sbt-protoc to extract protos from this jar (and all its dependencies,
which includes Google's common protos), and make them available in the
include path that is passed to protoc.
It is important to add that by setting preserve_unknown_fields to false you are turning off a protobuf feature that could prevent data loss when different parts of a distributed system are not running the same version of the schema.
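To put the doc snippet in context, here is a minimal build.sbt sketch with that dependency in place (the PB.targets line is the standard ScalaPB/sbt-protoc wiring, shown as an assumed baseline rather than something from the question):
// Standard ScalaPB code generation target
Compile / PB.targets := Seq(
  scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
)

// Extracts scalapb/scalapb.proto (and Google's common protos) so the
// import in package.proto resolves against protoc's include path
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"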

Generate file descriptor set (.desc) with scalapb

I am using ScalaPB in a project that needs to have access to the FileDescriptorSet. Is there a way to have ScalaPB generate the .desc file in addition to all the other class files? Or is there some programmatic way of obtaining a FileDescriptorSet from what is already generated?
Yes, to both questions.
If you are using sbt-protoc, you can have the following definition in your SBT file:
PB.protocOptions in Compile := Seq(
  "--descriptor_set_out=" +
    ((baseDirectory in Compile).value.getParentFile / "src" / "main" / "resources" / "out.desc")
)
One caveat is that you have to create src/main/resources yourself, otherwise you get an error. It would probably be better to generate into resourceManaged (that also requires creating the directory ahead of time, since protoc doesn't do that); see the sketch below.
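For illustration, the same setting pointed at resourceManaged might look like this (a sketch, assuming the output directory is created before protoc runs):
PB.protocOptions in Compile := Seq(
  "--descriptor_set_out=" +
    ((resourceManaged in Compile).value / "out.desc").getAbsolutePath
)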
You can also build a FileDescriptorSet at run time. For each proto file, ScalaPB generates a Scala object with a scalaDescriptor (and also a javaDescriptor, if that's more convenient). The descriptors contain a list of their dependencies, which are also FileDescriptors - you can traverse that tree structure and build a list of FileDescriptors, which is essentially a FileDescriptorSet.
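Here is a hedged sketch of that run-time traversal using the Java descriptors (MyProto stands in for an object ScalaPB generated from one of your .proto files; the other names are from the protobuf-java API):
import com.google.protobuf.Descriptors.FileDescriptor
import com.google.protobuf.DescriptorProtos.FileDescriptorSet
import scala.collection.JavaConverters._ // scala.jdk.CollectionConverters._ on 2.13+

// Walk the dependency tree, keeping each file descriptor exactly once
def collectFiles(root: FileDescriptor): List[FileDescriptor] = {
  val seen = scala.collection.mutable.LinkedHashMap.empty[String, FileDescriptor]
  def loop(fd: FileDescriptor): Unit =
    if (!seen.contains(fd.getName)) {
      seen += fd.getName -> fd
      fd.getDependencies.asScala.foreach(loop)
    }
  loop(root)
  seen.values.toList
}

// Assemble the set from the collected descriptors' protos
val descriptorSet: FileDescriptorSet = FileDescriptorSet.newBuilder()
  .addAllFile(collectFiles(MyProto.javaDescriptor).map(_.toProto).asJava)
  .build()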

How to access library dependencies at build-time in SBT/Scala build?

For example, suppose I have a file project/CodeGeneration.scala that generates "managed" source-code, and suppose that object (CodeGeneration) needs to leverage a third-party library -- say jsoup...
import org.jsoup._
object CodeGeneration {
  def generateCode = /* Generate code using jsoup... */
}
Simply adding a line for jsoup to your libraryDependencies in build.sbt doesn't do the trick; it leads to a compilation error complaining about the missing jsoup object/namespace.
So, (how) can one access this dependency from "meta" code -- code that generates other code?
It seems the solution is to leverage sbt's "recursive" nature, and put an additional build.sbt file in the project directory. So, for example, project/build.sbt might look like this:
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.2"
There's more detail in sbt's official documentation.
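To round this out, here is a hedged sketch of how the generator might be wired into the main build.sbt, assuming generateCode returns the generated source text as a String (the task body and file name are illustrative, not from the answer):
// build.sbt: run the generator and hand the resulting file to the compiler
Compile / sourceGenerators += Def.task {
  val out = (Compile / sourceManaged).value / "Generated.scala"
  IO.write(out, CodeGeneration.generateCode)
  Seq(out)
}.taskValue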

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only the subset of classes conforming to an API, which we need to be able to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so it does not change the original definition of packageBin. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
This generates the complete jar in the same way compile:packageBin does. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However, I would have to rewrite all the packageBin functionality instead of reusing existing code. Also, if rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you need the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala you'll have to override unmanagedSourceDirectories in the newly created configuration, as the last line above does. Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses Ant - there you can do it, you just end up repeating yourself a lot.
However, I have come to the conclusion that this scenario (2 JARs for one project) actually can be simplified by splitting the project - i.e. making 2 modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart in the JAR step, you run the risk of an invalid dependency in your setup that you would only notice when testing.
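For concreteness, a minimal sketch of the two-module layout in build.sbt (module and directory names are illustrative):
// The shared API lives in its own module and yields its own jar
lazy val api = project in file("api")

// The application depends on the API; an accidental dependency from
// the API on the application's internals now fails at compile time
lazy val app = (project in file("app")).dependsOn(api)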

Scala SBT CoffeeScripted, correctly override compile target

Groping in the dark, I just resorted to a pathetic hack (note the path backtracking):
(resourceManaged in (Compile, CoffeeKeys.coffee)) <<=
  (crossTarget in Compile)(_ / "../../../apache/static" / "js")
Is there any way to specify the absolute target write path with coffeescripted-sbt? The intro/overview states
You can override this behavior by overriding the resourceManaged
setting scoped to your configuration and the CoffeeKeys.coffee task.
Below is an example you can append to your build definition which will
copy generated javascript to target/:scala-version/your_preference/js
That's great, but I'd like to write directly to the Apache static directory, not 4 levels deep in my sbt-eclipse project.
I should note: I'm getting the "Unicorn is Angry" error quite often on GitHub these days, so the issue tracker isn't much help.
Thanks for any clues; what I have works, but I'd like to know how to set the absolute path properly.
(resourceManaged in (Compile, CoffeeKeys.coffee)) <<=
  (crossTarget in Compile)(_ / "pref" / "js")
sets the compile target relative to the default, which is "project_root/target/scala-version/".
The solution is refreshingly simple:
resourceManaged in (Compile, CoffeeKeys.coffee) :=
  file("/absolute/path/to/apache/static/js")
SBT user group thread