I'm new to ScalaPB and protobuf.
I'm trying to create unit tests for my ScalaPB generators. I've generated code from my proto files and am trying to use it in tests.
I've got this proto file:
syntax = "proto3";

package hellogrpc.calc;

import "scalapb/scalapb.proto";
import "google/api/annotations.proto";

option (scalapb.options) = {
  flat_package: true
};
service CalcService {
  rpc CalcSum (SumRequest) returns (CalcResponse) {
    option (google.api.http) = {
      post: "/calcService/sum"
      body: "*"
    };
  }
}
There is a method CalcSum which is annotated.
And here is the corresponding generated Scala file:
// Generated by the Scala Plugin for the Protocol Buffer Compiler.
// Do not edit!
//
// Protofile syntax: PROTO3
package hellogrpc.calc
object CalcServiceProto extends _root_.com.trueaccord.scalapb.GeneratedFileObject {
  lazy val dependencies: Seq[_root_.com.trueaccord.scalapb.GeneratedFileObject] = Seq(
    com.trueaccord.scalapb.scalapb.ScalapbProto,
    com.google.api.annotations.AnnotationsProto
  )
  lazy val messagesCompanions: Seq[_root_.com.trueaccord.scalapb.GeneratedMessageCompanion[_]] = Seq(
    hellogrpc.calc.SumRequest,
    hellogrpc.calc.CalcResponse
  )
  private lazy val ProtoBytes: Array[Byte] =
    com.trueaccord.scalapb.Encoding.fromBase64(scala.collection.Seq(
"""ChtoZWxsb2dycGMvQ2FsY1NlcnZpY2UucHJvdG8SDmhlbGxvZ3JwYy5jYWxjGhVzY2FsYXBiL3NjYWxhcGIucHJvdG8aHGdvb
2dsZS9hcGkvYW5ub3RhdGlvbnMucHJvdG8iKAoKU3VtUmVxdWVzdBIMCgFhGAEgASgFUgFhEgwKAWIYAiABKAVSAWIiJgoMQ2FsY
1Jlc3BvbnNlEhYKBnJlc3VsdBgBIAEoBVIGcmVzdWx0Mm8KC0NhbGNTZXJ2aWNlEmAKB0NhbGNTdW0SGi5oZWxsb2dycGMuY2FsY
y5TdW1SZXF1ZXN0GhwuaGVsbG9ncnBjLmNhbGMuQ2FsY1Jlc3BvbnNlIhuC0+STAhUiEC9jYWxjU2VydmljZS9zdW06ASpCBeI/A
hABYgZwcm90bzM="""
    ).mkString)
  lazy val scalaDescriptor: _root_.scalapb.descriptors.FileDescriptor = {
    val scalaProto = com.google.protobuf.descriptor.FileDescriptorProto.parseFrom(ProtoBytes)
    _root_.scalapb.descriptors.FileDescriptor.buildFrom(scalaProto, dependencies.map(_.scalaDescriptor))
  }
  lazy val javaDescriptor: com.google.protobuf.Descriptors.FileDescriptor = {
    val javaProto = com.google.protobuf.DescriptorProtos.FileDescriptorProto.parseFrom(ProtoBytes)
    com.google.protobuf.Descriptors.FileDescriptor.buildFrom(javaProto, Array(
      com.trueaccord.scalapb.scalapb.ScalapbProto.javaDescriptor,
      com.google.api.annotations.AnnotationsProto.javaDescriptor
    ))
  }
  @deprecated("Use javaDescriptor instead. In a future version this will refer to scalaDescriptor.", "ScalaPB 0.5.47")
  def descriptor: com.google.protobuf.Descriptors.FileDescriptor = javaDescriptor
}
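As a side note, ProtoBytes above is just the serialized FileDescriptorProto of the original .proto file, base64-encoded; this is what both scalaDescriptor and javaDescriptor parse. Decoding only the first chunk of the payload with the plain JDK (no ScalaPB needed) already shows the file name, which is field 1 of FileDescriptorProto:

```scala
import java.util.Base64

object InspectProtoBytes {
  // First chunk of the base64 payload above (with '=' appended for padding).
  val firstChunk = "ChtoZWxsb2dycGMvQ2FsY1NlcnZpY2UucHJvdG8="

  // Field 1 of FileDescriptorProto is the file name: tag byte 0x0a,
  // then a length byte, then the UTF-8 name itself.
  def fileName: String = {
    val bytes = Base64.getDecoder.decode(firstChunk)
    require(bytes(0) == 0x0a, "expected the wire tag of field 1 (name)")
    new String(bytes, 2, bytes(1), "UTF-8")
  }

  def main(args: Array[String]): Unit =
    println(fileName) // prints: hellogrpc/CalcService.proto
}
```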
I inspected CalcServiceProto.javaDescriptor in IntelliJ IDEA.
The method descriptor has this proto definition:
name: "CalcSum"
input_type: ".hellogrpc.calc.SumRequest"
output_type: ".hellogrpc.calc.CalcResponse"
options {
  72295728: "\"\020/calcService/sum:\001*"
}
But the generator works just fine! I debugged the generator, toggled a breakpoint on it, and there the method CalcSum has this proto definition:
name: "CalcSum"
input_type: ".hellogrpc.calc.SumRequest"
output_type: ".hellogrpc.calc.CalcResponse"
options {
  [google.api.http] {
    post: "/calcService/sum"
    body: "*"
  }
}
Maybe this works differently because I didn't register extensions the way the generator does.
Anyway, I want this test to pass:
val s = CalcServiceProto.javaDescriptor.getServices.get(0)
val m = s.getMethods.get(0)
m.getOptions.getExtension(AnnotationsProto.http).getPost shouldBe "/calcService/sum"
If you need the Java extensions to be available, you need to generate your code with Java conversions enabled. This will make javaDescriptor rely on the official Java implementation, and your test will pass.
When Java conversions are disabled, ScalaPB parses the descriptor itself, but it can't guarantee that the Java extensions were even compiled, so it doesn't attempt to register them.
What I'd like is for the Scala descriptors to work in this case; however, they don't support services and methods yet. I filed https://github.com/scalapb/ScalaPB/issues/382 to track progress on this.
In the meantime, as I wrote above, you can use Java conversions to force ScalaPB to provide you the Java descriptor. In your build.sbt, have:
PB.targets in Compile := Seq(
  scalapb.gen(grpc = true, javaConversions = true) -> (sourceManaged in Compile).value,
  PB.gens.java -> (sourceManaged in Compile).value
)
Related
I'm trying to defer ScalaJS Linking to runtime, which allows multi-stage compilation to be more flexible and less dependent on sbt.
The setup looks like this:
Instead of using the scalajs-sbt plugin, I chose to invoke the Scala.js compiler directly as a Scala compiler plugin:
scalaCompilerPlugins("org.scala-js:scalajs-compiler_${vs.scalaV}:${vs.scalaJSV}")
This successfully generates the .sjsir files under the project output directory, but goes no further.
I then used the solution in this post:
Build / Compile latest ScalaJS (1.3+) using gradle on a windows machine?
("Linking scala.js yourself") to invoke the linker on all the compiled .sjsir files and produce the .js files. This is my implementation:
In both compile-time and runtime dependencies, add the Scala.js basics and scalajs-linker:
bothImpl("org.scala-js:scalajs-library_${vs.scalaBinaryV}:${vs.scalaJSV}")
bothImpl("org.scala-js:scalajs-linker_${vs.scalaBinaryV}:${vs.scalaJSV}")
bothImpl("org.scala-js:scalajs-dom_${vs.scalaJSSuffix}:2.1.0")
Write the following code:
import org.scalajs.linker.interface.{Report, StandardConfig}
import org.scalajs.linker.{PathIRContainer, PathOutputDirectory, StandardImpl}
import org.scalajs.logging.{Level, ScalaConsoleLogger}
import java.nio.file.{Path, Paths}
import java.util.Collections
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, ExecutionContext}
object JSLinker {
  implicit def gec = ExecutionContext.global

  def link(classpath: Seq[Path], outputDir: Path): Report = {
    val logger = new ScalaConsoleLogger(Level.Warn)
    val linkerConfig = StandardConfig() // look at the API of this, lots of options.
    val linker = StandardImpl.linker(linkerConfig)

    // Same as scalaJSModuleInitializers in sbt, add if needed.
    val moduleInitializers = Seq()

    val cache = StandardImpl.irFileCache().newCache
    val result = PathIRContainer
      .fromClasspath(classpath)
      .map(_._1)
      .flatMap(cache.cached _)
      .flatMap(linker.link(_, moduleInitializers, PathOutputDirectory(outputDir), logger))
    Await.result(result, Duration.Inf)
  }

  def linkClasses(outputDir: Path = Paths.get("./")): Report = {
    import scala.jdk.CollectionConverters._

    val cl = Thread.currentThread().getContextClassLoader
    val resources = cl.getResources("")
    val rList = Collections.list(resources).asScala.toSeq.map { v =>
      Paths.get(v.toURI)
    }
    link(rList, outputDir)
  }

  lazy val linkOnce = {
    linkClasses()
  }
}
The resource detection was successful; all roots containing .sjsir files were detected:
rList = {$colon$colon#1629} "::" size = 4
0 = {UnixPath#1917} "/home/peng/git-scaffold/scaffold-gradle-kts/build/classes/scala/test"
1 = {UnixPath#1918} "/home/peng/git-scaffold/scaffold-gradle-kts/build/classes/scala/testFixtures"
2 = {UnixPath#1919} "/home/peng/git-scaffold/scaffold-gradle-kts/build/classes/scala/main"
3 = {UnixPath#1920} "/home/peng/git-scaffold/scaffold-gradle-kts/build/resources/main"
But linking still fails:
Fatal error: java.lang.Object is missing
called from core module analyzer
There were linking errors
org.scalajs.linker.interface.LinkingException: There were linking errors
at org.scalajs.linker.frontend.BaseLinker.reportErrors$1(BaseLinker.scala:91)
at org.scalajs.linker.frontend.BaseLinker.$anonfun$analyze$5(BaseLinker.scala:100)
at scala.concurrent.impl.Promise$Transformation.run$$$capture(Promise.scala:467)
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala)
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
at java.util.concurrent.ForkJoinTask.doExec$$$capture(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
I wonder what this error message entails. Clearly java.lang.Object is not compiled into sjsir. Does this error message make sense? How do I fix it?
Thanks to @sjrd I now have the correct runtime compilation stack. There were 2 problems in my old settings:

1. It turns out that cl.getResources("") is indeed not able to discover the full classpath, so I switched to the system property java.class.path, which contains the classpath entries of all dependencies.
2. moduleInitializers has to be set manually to point to a main method, which will be invoked when the js function is called.

After correcting them, the compilation class becomes:
import org.scalajs.linker.interface.{ModuleInitializer, Report, StandardConfig}
import org.scalajs.linker.{PathIRContainer, PathOutputDirectory, StandardImpl}
import org.scalajs.logging.{Level, ScalaConsoleLogger}
import java.nio.file.{Files, Path, Paths}
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, ExecutionContext, ExecutionContextExecutor}
object JSLinker {
  implicit def gec: ExecutionContextExecutor = ExecutionContext.global

  val logger = new ScalaConsoleLogger(Level.Info) // TODO: cannot be lazy val, why?

  lazy val linkerConf: StandardConfig = {
    StandardConfig()
  } // look at the API of this, lots of options.

  def link(classpath: Seq[Path], outputDir: Path): Report = {
    val linker = StandardImpl.linker(linkerConf)

    // Same as scalaJSModuleInitializers in sbt, add if needed.
    val moduleInitializers = Seq(
      ModuleInitializer.mainMethodWithArgs(SlinkyHelloWorld.getClass.getName.stripSuffix("$"), "main")
    )

    Files.createDirectories(outputDir)

    val cache = StandardImpl.irFileCache().newCache
    val result = PathIRContainer
      .fromClasspath(classpath)
      .map(_._1)
      .flatMap(cache.cached _)
      .flatMap { v =>
        linker.link(v, moduleInitializers, PathOutputDirectory(outputDir), logger)
      }
    Await.result(result, Duration.Inf)
  }

  def linkClasses(outputDir: Path = Paths.get("./ui/build/js")): Report = {
    val rList = getClassPaths
    link(rList, outputDir)
  }

  def getClassPaths: Seq[Path] = {
    val str = System.getProperty("java.class.path")
    val paths = str.split(':').map { v =>
      Paths.get(v)
    }
    paths
  }

  lazy val linkOnce: Report = {
    val report = linkClasses()
    logger.info(
      s"""
         |=== [Linked] ===
         |${report.toString()}
         |""".stripMargin
    )
    report
  }
}
This is all it takes to convert sjsir artefacts to a single main.js file.
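One caveat in getClassPaths above: splitting java.class.path on ':' only works on Unix-like systems, since Windows uses ';'. A more portable sketch of the same helper uses java.io.File.pathSeparatorChar, which picks the right separator for the current JVM:

```scala
import java.io.File
import java.nio.file.{Path, Paths}

object ClassPathUtil {
  // java.class.path uses ':' on Unix but ';' on Windows;
  // File.pathSeparatorChar resolves to the correct one at runtime.
  def systemClassPath: Seq[Path] =
    System.getProperty("java.class.path")
      .split(File.pathSeparatorChar)
      .filter(_.nonEmpty)
      .map(p => Paths.get(p))
      .toSeq
}
```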
I have a localization resource file I need to access from Scala.js. It needs to be local to the script execution environment (i.e., not loaded asynchronously from a server, as recommended at How to read a resource file in Scala.js?).
Is there any mechanism for embedding the contents of a small resource file directly into the generated javascript compiled from a scala.js file? Basically, I want something like:
object MyResource {
  @EmbeddedResource(URL("/my/package/localized.txt"))
  val resourceString: String = ???
}
This would obviously bloat the generated .js file somewhat, but that is an acceptable tradeoff for my application. It seems like this wouldn't be an uncommon need and that this macro ought to already exist somewhere, but my initial searches haven't turned anything up.
If you are using sbt, you can use a source generator that reads your resource file and serializes it in a val inside an object:
sourceGenerators in Compile += Def.task {
  val baseDir = baseDirectory.value / "custom-resources" // or whatever
  val resourceFile = baseDir / "my/package/localized.txt"
  val sourceDir = (sourceManaged in Compile).value
  val sourceFile = sourceDir / "Localized.scala"
  if (!sourceFile.exists() ||
      sourceFile.lastModified() < resourceFile.lastModified()) {
    val content = IO.read(resourceFile).replaceAllLiterally("$", "$$")
    val scalaCode =
      s"""
      package my.package.localized
      object Localized {
        final val content = raw\"\"\"$content\"\"\"
      }
      """
    IO.write(sourceFile, scalaCode)
  }
  Seq(sourceFile)
}.taskValue
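The replaceAllLiterally("$", "$$") step matters because even a raw interpolator still processes $-splices, so any literal dollar sign in the resource must be doubled in the generated source. A quick check of that escape:

```scala
object DollarEscapeDemo {
  // This is what the generated literal contains once "$" has been doubled
  // to "$$": the interpolator decodes $$ back to a single dollar sign.
  val content: String = raw"""price: $$5"""

  def main(args: Array[String]): Unit =
    println(content) // prints: price: $5
}
```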
If you are using another build tool, I am sure there is a similar concept of source generators that you can use.
I have a class in my Play Framework application that has its dependency injected automatically.
How can I create this class "manually" in my test code:
class AppConfog @Inject()(c: Configuration) {
  val supportEmail = c.getString("app.email").get
  ...
}
I'm not sure how to get a Configuration class to pass into it.
I know I can create an inline config like:
val config =
  """
  akka {
    loglevel = "WARNING"
  }
  """
ConfigFactory.parseString(config)
How do I get a Configuration from a config?
I think Play's Configuration just wraps the Typesafe Config that you get from ConfigFactory.parseString(config). See here.
So you should be able to do this:
val underlying = ConfigFactory.parseString(config)
val configuration = Configuration(underlying)
val mockAppConfog = new AppConfog(configuration)
I'm new to Scala (2.10) and currently working on a POC to store some data inside HBase. To store the data I'm trying to use the scala-pickling library to serialise my case classes into a binary format:
"org.scala-lang.modules" %% "scala-pickling" % "0.10.1"
I have these two simple classes:
case class Location(source: Source,
                    country: String,
                    region: String,
                    metro: String,
                    postalcode: String)
And
case class Source(name: String,
                  trust: Float,
                  created: String) {

  /** compares this Source with the other source and returns the difference in their trust levels */
  def compare(other: Source): Float = {
    trust - other.trust
  }

  /** returns whether you should prefer this source (true) or the other source (false) */
  def prefer(other: Source): Boolean = {
    trust >= other.trust
  }
}

object Source {
  def apply(name: String, trust: Float) = new Source(name, trust, DateTime.now.toString)

  def apply(row: Row) = {
    val name = row.getAs[String](0)
    val trust = row.getAs[Float](1)
    val created = row.getAs[String](2)
    new Source(name, trust, created)
  }
}
And I'm testing out the serialisation using a ScalaTest class:
import scala.pickling._
import binary._

class DebuggingSpec extends UnitSpec {
  "debugging" should "Allow the serialisation and deserialisation of a Link class" in {
    val loc = new Location(Source("Source1", 1), "UK", "Wales", "Cardiff", "CF99 1PP")
    val bytes = loc.pickle
    bytes.value.length should not be(0)
  }

  it should "Allow the serialisation and deserialisation of a Location class" in {
    val link = Link(Source("Source1", 1), "MyId1", 3)
    val bytes = link.pickle
    bytes.value.length should not be(0)
  }
}
But when I compile this inside IntelliJ or on the command line via sbt package, I get the following error message:
Error:(12, 9) macro implementation not found: pickle (the most common reason for that is that you cannot use macro implementations in the same compilation run that defines them)
val bytes = loc.pickle
^
EDIT: I've run this code successfully in the spark-shell (1.3.1) and it will happily pickle and unpickle these classes... but identical code and imports produce an error when compiling.
I'm building an application in Scala that uses a plugin architecture, and I'm trying to load plugins in at runtime. Currently, my plugin loader code is:
import java.io.File
import java.net.{URL, URLClassLoader}

import scala.collection.mutable

import org.clapper.classutil.ClassFinder

object PluginManager extends PluginManager {
  val plugins = new mutable.HashMap[String, Plugin]()
  val pluginFolder = new File("plugins")

  def init(): Unit = {
    val pluginJars = pluginFolder.listFiles.filter(_.getName.endsWith(".jar"))
    val classpath = List(new File(".")) ++ pluginJars
    val finder = ClassFinder(classpath)
    val classes = finder.getClasses()
    val classMap = ClassFinder.classInfoMap(classes.iterator)
    val pluginsToLoad = ClassFinder.concreteSubclasses("org.narrativeandplay.hypedyn.plugins.Plugin", classMap)

    val loader = new URLClassLoader(pluginJars.map({ f => new URL(s"file:${f.getAbsolutePath}") }), ClassLoader.getSystemClassLoader)
    pluginsToLoad.foreach { pluginString =>
      val plugin = loader.loadClass(pluginString.name).newInstance().asInstanceOf[Plugin]
      plugins += plugin.name -> plugin
    }
  }
}
(based on https://vikashazrati.wordpress.com/2011/09/15/building-a-plugin-based-architecture-in-scala/).
I had to use the URLClassLoader because my plugin JARs weren't on the classpath when the application started.
I was wondering if it's possible to use the Scala Reflection API to replace my use of the URLClassLoader, and if so, how should I do so?
In order for the runtime reflection part of the Scala reflection API to work, you will need to instantiate a classloader anyway (and then do scala.reflect.runtime.universe.runtimeMirror(classloader)), so I think it's not going to simplify matters in this situation.
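For completeness, here is a sketch of what instantiating a class through the Scala reflection API looks like once you do have a classloader. In the plugin manager above, loader would be the URLClassLoader built over the plugin jars; this helper is hypothetical and only handles no-arg constructors:

```scala
import scala.reflect.runtime.{universe => ru}

object ReflectiveLoader {
  // Instantiate `className` via its no-arg constructor using runtime
  // reflection. Any ClassLoader that can see the class works here.
  def instantiate(loader: ClassLoader, className: String): Any = {
    val mirror   = ru.runtimeMirror(loader)
    val classSym = mirror.staticClass(className)
    val noArgCtor = classSym.toType
      .decl(ru.termNames.CONSTRUCTOR)
      .alternatives
      .collectFirst { case m: ru.MethodSymbol if m.paramLists.flatten.isEmpty => m }
      .getOrElse(sys.error(s"$className has no no-arg constructor"))
    mirror.reflectClass(classSym).reflectConstructor(noArgCtor).apply()
  }
}
```

In the plugin loop this would replace loader.loadClass(pluginString.name).newInstance() with ReflectiveLoader.instantiate(loader, pluginString.name), but as noted above it doesn't remove the need for the URLClassLoader itself.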