I'm using ScalaPB (version 0.11.1) and the sbt-protoc plugin (version 1.0.3) to compile an old Protocol Buffers project in Scala 2.12. Reading the documentation, I want to set the file-level option preserve_unknown_fields to false. But my question is: where do I need to set this flag? In the .proto file?
I've also tried to include the flag as a package-scoped option by creating a package.proto file next to my other .proto file, with the following content (as specified here):
import "scalapb/scalapb.proto";
package eur.astrata.eu.bigdata.tpms.protobuf;
option (scalapb.options) = {
  preserve_unknown_fields: false
};
But when trying to compile, I get the following error:
[libprotobuf WARNING T:\src\github\protobuf\src\google\protobuf\compiler\parser.cc:648] No syntax specified for the proto file: package.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
scalapb/scalapb.proto: File not found.
package.proto:1:1: Import "scalapb/scalapb.proto" was not found or had errors.
I've also tried adding syntax = "proto3"; at the beginning, but it doesn't work.
Any help would be greatly appreciated.
From the docs:
If you are using sbt-protoc and importing protos like
scalapb/scalapb.proto, or common protocol buffers like
google/protobuf/wrappers.proto:
Add the following to your build.sbt:
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
This tells sbt-protoc to extract protos from this jar (and all its dependencies,
which includes Google's common protos), and make them available in the
include path that is passed to protoc.
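With that dependency in place the import resolves, and a package.proto along the following lines should compile. This is a sketch: note the explicit syntax line, and note that, as far as I know, ScalaPB 0.11.x also requires scope: PACKAGE for package-scoped options:
syntax = "proto2";

package eur.astrata.eu.bigdata.tpms.protobuf;

import "scalapb/scalapb.proto";

option (scalapb.options) = {
  scope: PACKAGE
  preserve_unknown_fields: false
};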
It is important to add that by setting preserve_unknown_fields to false you are turning off a protobuf feature that could prevent data loss when different parts of a distributed system are not running the same version of the schema.
Related
Is it somehow possible to use an external library inside the build.sbt file?
E.g. I want to write something like this:
import scala.io.Source
import io.circe._ // not possible

version := myTask

lazy val myTask: String = {
  val filename = "version.txt"
  Source.fromFile(filename).getLines.mkString(", ")
  // do some json parsing using the circe library
  // ...
}
One of the things I actually like about sbt is that the build project is (in most ways) just another project (which is also potentially configured by a meta-build project configured by a meta-meta-build project, etc.). This means you can just drop the following line into a project/build.sbt file:
libraryDependencies += "io.circe" %% "circe-jawn" % "0.11.1"
You could also add this to plugins.sbt if you wanted, or any other .sbt file in the project directory, since the filenames (excluding the extension) have no meaning beyond human convention, but I'd suggest following convention and going with build.sbt.
Note though that sbt implicitly imports sbt.io in .sbt files, so the circe import in your build.sbt (at the root level—i.e. the build config, not the build build config) will need to look like this:
import _root_.io.circe.jawn.decode
scalaVersion := decode[String]("\"2.12.8\"").right.get
(For anyone who hasn't seen it before, the _root_ here just means "start the package hierarchy here instead of assuming io is the imported one".)
I am using ScalaPB in a project that needs to have access to the FileDescriptorSet. Is there a way to have ScalaPB generate the .desc file in addition to all other class files? Or is there some programmatic way of obtaining a FileDescriptorSet from what is already generated?
Yes, to both questions.
If you are using sbt-protoc, you can have the following definition in your SBT file:
PB.protocOptions in Compile := Seq(
  "--descriptor_set_out=" +
    (baseDirectory in Compile).value.getParentFile / "src" / "main" / "resources" / "out.desc"
)
One caveat is that you have to create src/main/resources yourself, otherwise you get an error. It would probably be better to generate into resourceManaged (that would also require creating the directory ahead of time, since protoc doesn't do that).
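For example, a minimal sketch of the resourceManaged variant, reusing the PB.protocOptions key from above (the same caveat about creating the directory applies):
PB.protocOptions in Compile := Seq(
  // protoc won't create this directory; make sure it exists before compiling
  "--descriptor_set_out=" + ((resourceManaged in Compile).value / "out.desc")
)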
You can also build a FileDescriptorSet at run time. For each proto file, ScalaPB generates a Scala object with a scalaDescriptor (and also a javaDescriptor, if that's more convenient). Each descriptor contains a list of its dependencies, which are also FileDescriptors; you can traverse that tree structure and collect a list of FileDescriptors, which is essentially a FileDescriptorSet.
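A minimal sketch of that traversal, using the generated javaDescriptors (since FileDescriptorSet is a Java protobuf type; MyProto below stands in for whatever file object ScalaPB generated for you):
import scala.collection.JavaConverters._
import com.google.protobuf.Descriptors.FileDescriptor
import com.google.protobuf.DescriptorProtos.FileDescriptorSet

// Depth-first walk over a file descriptor and its imports,
// keyed by file name so each file is visited only once.
def collect(fd: FileDescriptor, seen: Map[String, FileDescriptor] = Map.empty): Map[String, FileDescriptor] =
  if (seen.contains(fd.getName)) seen
  else fd.getDependencies.asScala.foldLeft(seen + (fd.getName -> fd))((acc, dep) => collect(dep, acc))

def fileDescriptorSet(roots: FileDescriptor*): FileDescriptorSet = {
  val all = roots.foldLeft(Map.empty[String, FileDescriptor])((acc, fd) => collect(fd, acc))
  val builder = FileDescriptorSet.newBuilder()
  all.values.foreach(fd => builder.addFile(fd.toProto))
  builder.build()
}

// Usage, where MyProto is a hypothetical ScalaPB-generated file object:
// val set = fileDescriptorSet(MyProto.javaDescriptor)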
For example, suppose I have a file project/CodeGeneration.scala that generates "managed" source-code, and suppose that object (CodeGeneration) needs to leverage a third-party library -- say jsoup...
import org.jsoup._

object CodeGeneration {
  def generateCode = /* Generate code using jsoup... */
}
Simply adding a line for jsoup to your libraryDependencies in build.sbt doesn't do the trick; it leads to a compilation error complaining about the missing jsoup object/namespace.
So, (how) can one access this dependency from "meta" code -- code that generates other code?
It seems the solution is to leverage sbt's "recursive" nature, and put an additional build.sbt file in the project directory. So, for example, project/build.sbt might look like this:
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.2"
There's more detail in sbt's official documentation.
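For completeness, here is a minimal sketch of what project/CodeGeneration.scala could then look like (the HTML snippet and the generated constant are made up for illustration):
import org.jsoup.Jsoup

object CodeGeneration {
  // Parse an HTML snippet with jsoup and render it as Scala source text.
  def generateCode: String = {
    val doc = Jsoup.parse("<html><head><title>Demo</title></head><body></body></html>")
    s"""object Generated { val pageTitle: String = "${doc.title()}" }"""
  }
}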
I am using WartRemover in a Play project. I want to exclude the routes file (the code generated from it) from WartRemover scanning. I added the following, but it still scans the code generated for Play routes:
wartremoverExcluded ++= Seq(
  "com.xxx.controllers.ReverseMyController",
  "com.xxx.controllers.javascript.ReverseMyController",
  "com.xxx.controllers.ref.ReverseMyController"
)
It still shows wart errors from the generated routes code, e.g.:
[warn] /xxx/conf/routes:23: Inferred type containing Nothing
[warn] PUT /service/myendpoint com.xxx.controllers.MyController.postMyData
and the same for many more routes defined in the routes file.
How can I exclude routes from the WartRemover scan?
Have you tried putting -Xprint:typer in scalacOptions to see which package is the issue? It seems to work for me when I ignore the following:
wartremoverExcluded ++= Seq("Routes", "controllers.ref")
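For reference, turning on that compiler output is a one-liner in build.sbt:
scalacOptions += "-Xprint:typer"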
It looks like this question was asked in the context of wartremover 0.11, but if anyone finds themselves here looking for a solution for 0.12, this works for me:
wartremoverExcluded += sourceManaged.value / "main" / "routes_reverseRouting.scala"
wartremoverExcluded += sourceManaged.value / "main" / "routes_routing.scala"
I am writing a simple app in Scala that uses a leveldb database through the leveldbjni library. My build.sbt file looks like this:
name := "Whatever"
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies ++= Seq(
  "org.iq80.leveldb" % "leveldb-api" % "0.6",
  "org.fusesource.leveldbjni" % "leveldbjni-all" % "1.7"
)
An object is then responsible for creating a database. Unfortunately, when I run the program I get a java.lang.UnsatisfiedLinkError, raised by the hawtjni library that leveldbjni uses under the hood.
The error can also be triggered easily from the Scala console:
scala> import java.io.File
scala> import org.iq80.leveldb._
scala> import org.fusesource.leveldbjni.JniDBFactory._
scala> factory.open(new File("test"), new Options().createIfMissing(true))
java.lang.UnsatisfiedLinkError: org.fusesource.leveldbjni.internal.NativeOptions.init()V
at org.fusesource.leveldbjni.internal.NativeOptions.init(Native Method)
at org.fusesource.leveldbjni.internal.NativeOptions.<clinit>(NativeOptions.java:54)
at org.fusesource.leveldbjni.JniDBFactory$OptionsResourceHolder.init(JniDBFactory.java:98)
at org.fusesource.leveldbjni.JniDBFactory.open(JniDBFactory.java:167)
at .<init>(<console>:15)
...
scala> System getProperty "java.io.tmpdir"
res2: String = /var/folders/1l/wj6yg_wd15sg_gcql001wchm0000gn/T/
I can't understand what is going on, since the library is correctly extracted from the jar file but for some reason it is not loaded.
$ file /var/folders/1l/wj6yg_wd15sg_gcql001wchm0000gn/T/lib*
/var/folders/1l/wj6yg_wd15sg_gcql001wchm0000gn/T/libleveldbjni-1.7.jnilib: Mach-O universal binary with 2 architectures
/var/folders/1l/wj6yg_wd15sg_gcql001wchm0000gn/T/libleveldbjni-1.7.jnilib (for architecture x86_64): Mach-O 64-bit dynamically linked shared library x86_64
/var/folders/1l/wj6yg_wd15sg_gcql001wchm0000gn/T/libleveldbjni-1.7.jnilib (for architecture i386): Mach-O dynamically linked shared library i386
I think the problem is probably related to the classloader that sbt employs, but I am not sure, since I am relatively new to Scala.
UPDATE
I still haven't found the culprit. The library is actually found and correctly loaded, since I can execute the following commands:
scala> import org.fusesource.leveldbjni.internal.NativeDB
scala> NativeDB.LIBRARY.load()
The error is somehow due to the init() function which, according to the hawtjni documentation, is responsible for setting all the static fields annotated as constant fields to their constant values. The exception can still be triggered by typing:
scala> import org.fusesource.leveldbjni.internal.NativeOptions
scala> new NativeOptions()
java.lang.UnsatisfiedLinkError: org.fusesource.leveldbjni.internal.NativeOptions.init()V
at org.fusesource.leveldbjni.internal.NativeOptions.init(Native Method)
at org.fusesource.leveldbjni.internal.NativeOptions.<clinit>(NativeOptions.java:54)
at .<init>(<console>:9)
Apparently this is a known problem, as documented in this sbt issue page. Following the eventsourced documentation, I have implemented a custom run-nobootcp command that executes the code without adding the Scala library to the boot classpath.
This should solve the problem.
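For anyone hitting the same UnsatisfiedLinkError, a simpler workaround that is often suggested (an assumption on my part; I have only verified the custom-command route) is to fork the run in build.sbt, since a forked JVM is started with a plain classpath instead of sbt's boot classpath:
// assumption: forking avoids the boot classpath issue described above
fork in run := true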