How to solve a problem compiling protobuf in a Scala project on Windows 10?

I have a simple Scala project with an example protobuf.
The proto file:
syntax = "proto3";

package grpc.example;

message HelloRequest {
  string msg = 1;
  int32 code = 2;
}

message HelloResponse {
  string msg = 1;
}

service HelloWorld {
  rpc hello (HelloRequest) returns (HelloResponse);
}
plugins.sbt in the project folder:
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.0")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.10.10"
build.sbt
name := "custom-grpc"
version := "0.1"
scalaVersion := "2.13.4"
lazy val protoExample = (project in file("proto-example"))
  .settings(
    PB.targets in Compile := Seq(
      scalapb.gen() -> (sourceManaged in Compile).value / "scalapb"
    ),
    libraryDependencies ++= Seq(
      "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
    ),
    scalaVersion := "2.13.4"
  )

lazy val root = (project in file("."))
  .aggregate(protoExample)
When I run the compile command in the sbt shell, I get this error:
[info] Compiling schema C:\Users\<.....>\IdeaProjects\LearnScala\custom-grpc\proto-example\src\main\protobuf\hello.proto
Error: Could not find or load main class protocbridge.frontend.BridgeApp
--jvm_0_out: protoc-gen-jvm_0: Plugin failed with status code 1.
What is causing this error?
I am using Windows 10; I tried JDK 1.8 and then JDK 11.
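I am not sure of the root cause, but early sbt-protoc 1.0.x releases reportedly had Windows-specific trouble launching the protocbridge frontend, which would match the missing protocbridge.frontend.BridgeApp class. A first thing to try, offered only as a sketch (the version number below is an assumption; the point is to pick the most recent sbt-protoc release available), is bumping the plugin in project/plugins.sbt:

// project/plugins.sbt -- sketch, not a confirmed fix; "1.0.2" is an example version
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.2")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.10.10"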

Related

Scala Flink gets java.lang.NoClassDefFoundError: scala/Product$class after using a case class for a customized DeserializationSchema

It works fine when using a plain class, but I get a java.lang.NoClassDefFoundError: scala/Product$class error after changing the class to a case class.
I am not sure whether it is an sbt packaging problem or a code problem.
I'm using:
sbt
Scala: 2.11.12
Java: 8
sbt assembly to package
package example

import java.util.Properties
import java.nio.charset.StandardCharsets

import org.apache.flink.api.scala._
import org.apache.flink.streaming.util.serialization.{DeserializationSchema, SerializationSchema}
import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, FlinkKafkaProducer}
import org.apache.flink.streaming.api.watermark.Watermark
import org.apache.flink.streaming.api.functions.AssignerWithPunctuatedWatermarks
import org.apache.flink.api.common.typeinfo.TypeInformation

import Config._

case class Record(
  id: String,
  startTime: Long
)

class RecordDeSerializer extends DeserializationSchema[Record] with SerializationSchema[Record] {
  override def serialize(record: Record): Array[Byte] = {
    "123".getBytes(StandardCharsets.UTF_8)
  }

  override def deserialize(b: Array[Byte]): Record = {
    Record("1", 123)
  }

  override def isEndOfStream(record: Record): Boolean = false

  override def getProducedType: TypeInformation[Record] = {
    createTypeInformation[Record]
  }
}

object RecordConsumer {
  def main(args: Array[String]): Unit = {
    val config: Properties = {
      var p = new Properties()
      p.setProperty("zookeeper.connect", Config.KafkaZookeeperServers)
      p.setProperty("bootstrap.servers", Config.KafkaBootstrapServers)
      p.setProperty("group.id", Config.KafkaGroupID)
      p
    }

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.enableCheckpointing(1000)

    var consumer = new FlinkKafkaConsumer[Record](
      Config.KafkaTopic,
      new RecordDeSerializer(),
      config
    )
    consumer.setStartFromEarliest()

    val stream = env.addSource(consumer).print
    env.execute("record consumer")
  }
}
Error
2020-08-05 04:07:33,963 INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Discarding checkpoint 1670 of job 4de8831901fa72790d0a9a973cc17dde.
java.lang.NoClassDefFoundError: scala/Product$class
...
build.sbt
My first idea was that maybe a version is not right, but everything works fine when using a normal class.
Here is the build.sbt:
ThisBuild / resolvers ++= Seq(
  "Apache Development Snapshot Repository" at "https://repository.apache.org/content/repositories/snapshots/",
  Resolver.mavenLocal
)

name := "deedee"
version := "0.1-SNAPSHOT"
organization := "dexterlab"

ThisBuild / scalaVersion := "2.11.8"

val flinkVersion = "1.8.2"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-java" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion,
)

val thirdPartyDependencies = Seq(
  "com.github.nscala-time" %% "nscala-time" % "2.24.0",
  "com.typesafe.play" %% "play-json" % "2.6.14",
)

lazy val root = (project in file(".")).
  settings(
    libraryDependencies ++= flinkDependencies,
    libraryDependencies ++= thirdPartyDependencies,
    libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value,
  )

assembly / mainClass := Some("dexterlab.TelecoDataConsumer")

// make the run command include the provided dependencies
Compile / run := Defaults.runTask(Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated

// stay inside the sbt console when we press "ctrl-c" while a Flink programme executes with "run" or "runMain"
Compile / run / fork := true
Global / cancelable := true

// exclude Scala library from assembly
assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = false)

autoCompilerPlugins := true
I finally had success after adding this line to build.sbt:
assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = true)
It includes the Scala library in the fat jar when running sbt assembly.
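For context (my understanding, not something verified on the cluster): scala/Product$class belongs to the pre-2.12 trait encoding and exists in the Scala 2.11 (and earlier) library but not in 2.12+, so the error usually means the case class bytecode could not find a matching scala-library at runtime. A small standalone sketch to confirm the fat jar now bundles it; the jar path is an assumption derived from the name, version, and Scala version in the build.sbt above:

import java.util.jar.JarFile
import scala.collection.JavaConverters._

object CheckAssembly extends App {
  // The default path is an assumption based on sbt-assembly's usual naming scheme.
  val jarPath = args.headOption.getOrElse("target/scala-2.11/deedee-assembly-0.1-SNAPSHOT.jar")
  val entries = new JarFile(jarPath).entries.asScala.map(_.getName).toSet
  val present = entries.contains("scala/Product$class.class")
  println(s"scala/Product$$class present in $jarPath: $present")
}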

Understanding build.sbt with sbt-spark-package plugin

I am new to Scala and sbt build files. From the introductory tutorials, adding Spark dependencies to a Scala project should be straightforward via the sbt-spark-package plugin, but I am getting the following error:
[error] (run-main-0) java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
Please provide resources to learn more about what could be driving the error, as I want to understand the process more thoroughly.
CODE:
trait SparkSessionWrapper {
lazy val spark: SparkSession = {
SparkSession
.builder()
.master("local")
.appName("spark citation graph")
.getOrCreate()
}
val sc = spark.sparkContext
}
import org.apache.spark.graphx.GraphLoader
object Test extends SparkSessionWrapper {
def main(args: Array[String]) {
println("Testing, testing, testing, testing...")
var filePath = "Desktop/citations.txt"
val citeGraph = GraphLoader.edgeListFile(sc, filepath)
println(citeGraph.vertices.take(1))
}
}
plugins.sbt
resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
build.sbt -- WORKING. Why does adding libraryDependencies make run work?
spName := "yewno/citation_graph"
version := "0.1"
scalaVersion := "2.11.12"
sparkVersion := "2.2.0"
sparkComponents ++= Seq("core", "sql", "graphx")
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  "org.apache.spark" %% "spark-graphx" % "2.2.0"
)
build.sbt -- NOT WORKING. I would expect this to compile and run correctly:
spName := "yewno/citation_graph"
version := "0.1"
scalaVersion := "2.11.12"
sparkVersion := "2.2.0"
sparkComponents ++= Seq("core", "sql", "graphx")
Bonus for explanation + links to resources to learn more about SBT build process, jar files, and anything else that can help me get up to speed!
The sbt-spark-package plugin provides the Spark dependencies in provided scope:
sparkComponentSet.map { component =>
  "org.apache.spark" %% s"spark-$component" % sparkVersion.value % "provided"
}.toSeq
We can confirm this by running show libraryDependencies from sbt:
[info] * org.scala-lang:scala-library:2.11.12
[info] * org.apache.spark:spark-core:2.2.0:provided
[info] * org.apache.spark:spark-sql:2.2.0:provided
[info] * org.apache.spark:spark-graphx:2.2.0:provided
provided scope means:
The dependency will be part of compilation and test, but excluded from
the runtime.
Thus sbt run throws java.lang.NoClassDefFoundError: org/apache/spark/SparkContext.
If we really want to include the provided dependencies on the run classpath, then, as #douglaz suggests:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated
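For sbt 1.x, the same setting in slash syntax looks like the variant already used in the Flink build.sbt earlier on this page:

Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated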

SBT published jar corrupt

I am having a really strange problem:
I have a multi-module sbt project containing libraries that I publish locally.
Every once in a while the published jar is really a POM file!
Meaning I can open it with a text editor and see the POM for the library in question instead of the class files.
Cleaning and rebuilding/republishing sometimes fixes it temporarily.
Also note that the jar in 'target/scala-2.11' is also a POM file.
Here are the Build.scala and build.sbt files for the libraries (names are changed, due to company policies, to protect the guilty :)).
//////////////////Build.scala
import java.io._
import java.nio.file.{Paths, Files}
import sbt._
import xerial.sbt.Pack._

object Library extends Build {

  lazy val rootProj = Project(id = "library", base = file(".")) aggregate(
    projA,
    projB,
    projC,
    projD
  )

  lazy val projA = Project(id = "projectA", base = file("ProjectA"))
  lazy val projB = Project(id = "projectB", base = file("ProjectB"))
  lazy val projC = Project(id = "projectC", base = file("ProjectC"))
    .dependsOn(spatialMathProj)
  lazy val projD = Project(id = "projectD", base = file("ProjectD"))
}
//////////////////build.sbt for projectD
name := "ProjectD"
version := "2.0-SNAPSHOT"
scalaVersion := "2.11.8"
publishArtifact in (Compile, packageDoc) := false
publishArtifact in (Compile, packageSrc) := false
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.4.2"
libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.4.2"
libraryDependencies += "net.sf.jung" % "jung2" % "2.0.1"
libraryDependencies += "net.sf.jung" % "jung-graph-impl" % "2.0.1"
libraryDependencies += "net.sf.jung" % "jung-visualization" % "2.0.1"
libraryDependencies += "com.plexsys" % "api_2.11" % "2.0-SNAPSHOT"
Has anyone else seen this?
Why is this happening?
I am using sbt version 0.13.7.
Thanks for any help you can offer.
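Not an answer, but one way to narrow it down, under the assumption that the POM and the jar are being written to colliding output paths, is to compare the two artifact paths from the sbt shell (sbt 0.13 syntax):

> show projectD/compile:packageBin::artifactPath
> show projectD/makePom::artifactPath

If both point at the same file, whichever task runs last wins, which would explain a "jar" that is really a POM.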

Resolving ScalaJSPlugin from Build.scala and plugins.sbt

I'm trying to make "ScalaJSPlugin" work, i.e. have it resolve in project/Build.scala:
object BuildProject extends Build {
  ..
  lazy val scalaRx = Project(id = "ScalaRX", base = file("scalarx")).enablePlugins(ScalaJSPlugin).settings(
    version := "0.1",
    scalaVersion := "2.11.7",
    libraryDependencies ++= scalaRxDependencies
  ) ...
In my project/plugins.sbt file, I put:
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.8")
It fails with a compilation error in Build.scala when trying to resolve "ScalaJSPlugin".
There are links to the repo: 1, 2
For now I keep those changes commented out.
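In a project/*.scala build file the plugin's members are not auto-imported the way they are in .sbt files, so Build.scala needs explicit imports. A minimal sketch, assuming the missing piece is just the import (the settings are abbreviated from the snippet above, and scalaRxDependencies is omitted to keep it self-contained):

import sbt._
import sbt.Keys._
import org.scalajs.sbtplugin.ScalaJSPlugin
import org.scalajs.sbtplugin.ScalaJSPlugin.autoImport._

object BuildProject extends Build {
  lazy val scalaRx = Project(id = "ScalaRX", base = file("scalarx"))
    .enablePlugins(ScalaJSPlugin)
    .settings(
      version := "0.1",
      scalaVersion := "2.11.7"
    )
}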

Adding module dependency information in sbt's build.sbt file

I have a multi-module project in IntelliJ; as the screen capture shows, the contextProcessor module depends on the contextSummary module.
IntelliJ takes care of everything once I set up the dependencies in Project Structure.
However, when I run sbt test with the following setup in build.sbt, I get an error complaining that it can't find the packages in the contextSummary module.
name := "contextProcessor"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
How can I teach sbt where to find the missing modules?
I could use a build.sbt file in the main root directory:
lazy val root = (project in file(".")).aggregate(contextSummary, contextProcessor)
lazy val contextSummary = project
lazy val contextProcessor = project.dependsOn(contextSummary)
Reference: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html
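If the processor module's tests also need test helpers from contextSummary, a commonly used variant (an assumption about your layout, not something the question requires) also maps the test configurations:

lazy val contextProcessor = project.dependsOn(contextSummary % "compile->compile;test->test")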
For testing only one project, I can use the project command in sbt:
> sbt
[info] Set current project to root (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> project contextProcessor
[info] Set current project to contextProcessor (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> test
For batch mode (as in "How to pass command line args to program in SBT 0.13.1?"):
sbt "project contextProcessor" test
I think a simple build.sbt might not be enough for that.
You would need to create a more sophisticated project/Build.scala, like this:
import sbt._
import sbt.Keys._

object Build extends Build {

  lazy val root = Project(
    id = "root",
    base = file("."),
    aggregate = Seq(module1, module2)
  )

  lazy val module1 = Project(
    id = "module1",
    base = file("module1-folder"),
    settings = Seq(
      name := "Module 1",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )

  lazy val module2 = Project(
    id = "module2",
    base = file("module2-folder"),
    dependencies = Seq(module1),
    settings = Seq(
      name := "Module 2",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )
}