sbt scala pb "Protobufs files found, but PB.targets is empty."

I have a project that converts proto files into Scala case classes.
My build.sbt is -
ThisBuild / version := "0.1.0-SNAPSHOT"

lazy val root = (project in file("."))
  .settings(
    name := "proto-path-error"
  )

val openRtbCoreVersion = "1.5.5"
val googleCommonProtosVersion = "1.12.0"

val commonSettings: Seq[Def.Setting[_]] = Seq[Def.Setting[_]](
  scalaVersion := "2.12.14",
  organization := "com.td",
)

def scalaGrpcSettings: Seq[Def.Setting[_]] = Seq[Def.Setting[_]](
  libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapb.compiler.Version.scalapbVersion,
  libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf",
  libraryDependencies += "com.google.api.grpc" % "proto-google-common-protos" % googleCommonProtosVersion % "protobuf",
  libraryDependencies += "com.google.openrtb" % "openrtb-core" % openRtbCoreVersion,
  libraryDependencies += "com.google.openrtb" % "openrtb-core" % openRtbCoreVersion % "protobuf",
  PB.targets in Compile := Seq(
    PB.gens.java -> (sourceManaged in Compile).value,
    scalapb.gen(javaConversions = true) -> (sourceManaged in Compile).value
  ),
  PB.includePaths in Compile := Seq(
    target.value / "protobuf_external",
    new File("definitions/common"),
  ),
  PB.protoSources in Compile := Seq(
    PB.externalIncludePath.value / "google" / "api",
    new File("definitions/common"),
    target.value / "protobuf_external"
  ),
  excludeFilter in PB.generate := new SimpleFileFilter(file => file.getCanonicalPath.contains("google/protobuf")),
  PB.additionalDependencies := Nil
)
project/plugins.sbt file is -
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.6")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.10.11"
project/build.properties is -
sbt.version = 1.3.13
The src/main/protobuf/definitions/common/common.proto -
syntax = "proto2";

option java_outer_classname = "CommonUtils";
package com.td.protobuf;

message CMap {
  map<int32, Assets> cs = 1;
}

message Assets {
  repeated int32 assets = 1;
}
The whole project is on github here
When I compile the project, it does not produce any case classes. I get the output below -
$ sbt clean compile
[info] welcome to sbt 1.3.13 (Azul Systems, Inc. Java 11.0.15)
[info] loading settings for project global-plugins from build.sbt ...
[info] loading global plugins from ~/.sbt/1.0/plugins
[info] loading settings for project proto-path-error-build from plugins.sbt ...
[info] loading project definition from ~/Documents/Coding/other/proto-path-error/project
[info] loading settings for project root from build.sbt ...
[info] set current project to proto-path-error (in build file:~/Documents/Coding/other/proto-path-error/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 0 s, completed Jul 5, 2022, 9:17:03 PM
[info] Protobufs files found, but PB.targets is empty.
[success] Total time: 0 s, completed Jul 5, 2022, 9:17:04 PM
I have 2 issues:
1. PB.protoSources in Compile := new File("definitions/common") in build.sbt was not being recognized, so I moved the files from proto-path-error/definitions to proto-path-error/src/main/protobuf/definitions.
2. After moving the definitions they are recognized, but the Scala case classes are still not generated. I get the log [info] Protobufs files found, but PB.targets is empty.
How can I fix these 2 issues?
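One detail that may explain the second issue (an observation about the build shown, not a confirmed fix): commonSettings and scalaGrpcSettings are defined in build.sbt but never applied to any project, so the root project never receives the PB.targets setting and sbt-protoc sees it as empty. A minimal sketch of wiring them in:

```scala
// Sketch: pass the existing setting sequences to root's .settings(...)
// so that PB.targets (defined inside scalaGrpcSettings) actually takes effect.
lazy val root = (project in file("."))
  .settings(
    name := "proto-path-error",
    commonSettings,
    scalaGrpcSettings
  )
```

This works because .settings accepts whole Seq[Def.Setting[_]] values alongside individual settings.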

Related

How to set the Docker Registry with sbt-native-packager

I'm trying to build a Docker image using sbt-native-packager with the following build.sbt (trying to publish the image to a local repository)
val sparkVersion = "2.4.5"

scalaVersion in ThisBuild := "2.12.0"

val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion
)

// JAR build settings
lazy val commonSettings = Seq(
  organization := "dzlab",
  version := "0.1",
  scalaSource in Compile := baseDirectory.value / "src",
  scalaSource in Test := baseDirectory.value / "test",
  resourceDirectory in Test := baseDirectory.value / "test" / "resources",
  javacOptions ++= Seq(),
  scalacOptions ++= Seq(
    "-deprecation",
    "-feature",
    "-language:implicitConversions",
    "-language:postfixOps"
  ),
  libraryDependencies ++= sparkLibs
)

// Docker Image build settings
dockerBaseImage := "gcr.io/spark-operator/spark:v" + sparkVersion

lazy val root = (project in file("."))
  .enablePlugins(
    DockerPlugin,
    JavaAppPackaging
  )
  .settings(
    name := "spark-k8s",
    commonSettings,
    dockerAliases ++= Seq(
      dockerAlias.value.withRegistryHost(Some("localhost:5000"))
    ),
    mainClass in (Compile, run) := Some("dzlab.SparkJob")
  )
SBT and the packager versions
$ cat project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
$ cat project/build.properties
sbt.version=0.13.18
When I try to run the packager
$ sbt docker:publish
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
/Users/dzlab/Projects/spark-k8s/build.sbt:39: error: not found: value dockerAliases
dockerAliases ++= Seq(
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
It does not recognize dockerAliases; I'm not sure why, as it is part of the publishing settings.
What is the proper way to set the Docker registry?
Your sbt-native-packager version is hopelessly outdated, as is your sbt version. That SettingKey doesn't exist in that version.
Compare: sbt-native-packager 1.0 vs. sbt-native-packager 1.7.4
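A sketch of the upgrade, using the 1.7.4 version compared above (current at the time of the answer, not a pinned recommendation):

```scala
// project/plugins.sbt: a sbt-native-packager release in which dockerAliases exists
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.4")

// project/build.properties should also move off 0.13.18,
// e.g. sbt.version=1.3.13
```

With both updated, the dockerAliases setting resolves and docker:publish can push to the configured registry host.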

SBT sync throws an error "Cannot add dependency" on intellij

I get the following error in IntelliJ when I run sbt sync:
[error] stack trace is suppressed; run 'last ProjectRef(uri("file:/Users/tushar/Documents/Projects/zio-rocksdb/"), "zio-rocksdb") / updateSbtClassifiers' for the full output
[error] (ProjectRef(uri("file:/Users/tushar/Documents/Projects/zio-rocksdb/"), "zio-rocksdb") / updateSbtClassifiers) java.lang.IllegalArgumentException: Cannot add dependency 'com.typesafe#ssl-config-core_2.12;0.4.0' to configuration 'default' of module dev.zio#zio-rocksdb$sbt_2.12;0.2.0+3-114b41b9+20200418-1131 because this configuration doesn't exist!
[error] Total time: 2 s, completed 18-Apr-2020, 11:31:05 AM
build.sbt
name := "scala-interview-scheduler"
version := "0.1"
scalaVersion := "2.13.1"

// ZIO Core
lazy val zioVersion = "1.0.0-RC18"

// Project Scheduler
lazy val scheduler =
  (project in file("scheduler"))

// Project Storage
lazy val storage = (project in file("storage"))
  .settings(
    libraryDependencies ++= Seq(
      "io.suzaku" %% "boopickle" % "1.3.1"
    )
  )
  .dependsOn(ProjectRef(file("../zio-rocksdb"), "zio-rocksdb"))

// Project Program
lazy val program = (project in file("program"))
  .dependsOn(scheduler)

// Testing
ThisBuild / libraryDependencies ++= Seq(
  "dev.zio" %% "zio" % zioVersion,
  "dev.zio" %% "zio-test" % zioVersion % "test",
  "dev.zio" %% "zio-test-sbt" % zioVersion % "test"
)
ThisBuild / testFrameworks += new TestFramework("zio.test.sbt.ZTestFramework")

// Global Options
ThisBuild / scalacOptions ++= Seq(
  "-language:postfixOps",
  "-language:implicitConversions",
  "-feature"
)
What does it mean?

Cross Publishing of SBT Plugins not working

My objective is to write an SBT plugin which can be used by both the 0.13.x and 1.x versions of SBT. Based on this thread and this documentation, I wrote the following build.sbt for my plugin project:
lazy val foo = (project in file(".")).settings(
  name := "foo",
  sbtPlugin := true,
  organization := "com.bar",
  version := "1.0.0",
  scalaVersion := "2.12.4",
  sbtVersion in Global := "1.0.0",
  crossSbtVersions := Seq("0.13.17", "1.0.0"),
  libraryDependencies ++= Seq(
    "com.typesafe" % "config" % "1.3.3"
  ),
  scalaCompilerBridgeSource := {
    val sv = appConfiguration.value.provider.id.version
    ("org.scala-sbt" % "compiler-interface" % sv % "component").sources
  }
)
When I do sbt +publishLocal I see
[info] Packaging /Users/user1/IdeaProjects/fulfillment-sbt/target/scala-2.12/sbt-0.13/foo-1.0.0-javadoc.jar ...
[info] Done packaging.
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/poms/foo.pom
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/jars/foo.jar
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/srcs/foo-sources.jar
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/docs/foo-javadoc.jar
[info] published ivy to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/ivys/ivy.xml
[success] Total time: 9 s, completed Apr 4, 2018 11:12:38 AM
But it didn't publish for the 1.x version of SBT. What can I do so that it publishes for both versions of SBT?
I went to the Gitter channel of SBT and had a conversation there with the creators of SBT. Based on that conversation I created a working example. I am listing it here so that it helps someone cross-publish sbt plugins in the future.
project/build.properties
sbt.version=0.13.17
build.sbt
lazy val p = (project in file(".")).settings(
  name := "sbt-crosspublish",
  sbtPlugin := true,
  organization := "com.abhi",
  version := "1.0.0",
  crossScalaVersions := Seq("2.10.6", "2.12.0"),
  crossSbtVersions := Seq("0.13.17", "1.0.0"),
  libraryDependencies ++= Seq(
    "com.typesafe" % "config" % "1.3.3"
  ),
  scalaCompilerBridgeSource := {
    val sv = appConfiguration.value.provider.id.version
    ("org.scala-sbt" % "compiler-interface" % sv % "component").sources
  }
)
And finally, in order to cross-publish SBT plugins one has to do
sbt ^publishLocal
Wow, I didn't know about the ^. sbt +publishLocal is for cross publishing normal binaries, not plugins; for cross publishing of sbt plugins, we must do sbt ^publishLocal.
One thing to note is that the scalaCompilerBridgeSource setting is only needed if you are working on SBT 0.13.17. If you upgrade the plugin project to SBT 1.1.0, the code is simplified.
project/build.properties
sbt.version=1.1.2
build.sbt
lazy val p = (project in file(".")).settings(
  name := "sbt-crosspublish",
  sbtPlugin := true,
  organization := "com.abhi",
  version := "1.0.0",
  crossScalaVersions := Seq("2.10.6", "2.12.0"),
  crossSbtVersions := Seq("0.13.17", "1.0.0"),
  libraryDependencies ++= Seq(
    "com.typesafe" % "config" % "1.3.3"
  )
)

How to add input files in sbt Scala

I am new to Scala. I am using sbt-assembly to create a fat jar. My program reads input files, which I keep under the src/main/resources folder, but I am getting java.io.FileNotFoundException.
I don't know how to specify the path. I will be deploying the jar on the server.
Here is my sbt build file
lazy val commonSettings = Seq(
  organization := "com.insnapinc",
  version := "0.1.0",
  scalaVersion := "2.11.4"
)

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "memcache-client"
  )

libraryDependencies ++= Seq(
  "org.scalaj" %% "scalaj-http" % "1.1.4",
  "org.json4s" %% "json4s-native" % "3.2.10",
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)

/* assembly plugin */
mainClass in AssemblyKeys.assembly := Some("com.insnap.memcache.MemcacheTest")
assemblySettings
test in AssemblyKeys.assembly := {}
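Files under src/main/resources are packaged inside the jar, so they cannot be opened with java.io.File paths; they have to be read as classpath resources. A minimal sketch (ResourceReader and the resource name /input.txt are hypothetical, not from the original project):

```scala
import scala.io.Source

object ResourceReader {
  // Resources packaged in a jar are classpath entries, not files on disk,
  // so open them with getResourceAsStream instead of java.io.File.
  def readResource(name: String): String = {
    // e.g. "/input.txt" resolves src/main/resources/input.txt
    val stream = getClass.getResourceAsStream(name)
    if (stream == null) sys.error(s"resource $name not found on classpath")
    try Source.fromInputStream(stream).mkString
    finally stream.close()
  }
}
```

With src/main/resources/input.txt in place, ResourceReader.readResource("/input.txt") works the same from sbt run and from inside the assembled fat jar.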

sbt sometimes takes long to compile (super long..)

Currently I'm running a fairly middle-sized project with the Play Framework and sbt/scala. However, sometimes it takes a long time to compile: something like 120 seconds for 4 classes, while most of the time the whole project compiles in just a few seconds.
It mostly seems to take longer for a handful of files after refactoring things inside my database access and services, creating private/public functions of Slick code without explicit return types.
Is there anything I can do about this?
It just looks like sbt is hanging while doing:
[info] Loading project definition from /Users/schmitch/projects/envisia/envisia-erp-loki/project
[info] Set current project to envisia-erp-loki (in build file:/Users/schmitch/projects/envisia/envisia-erp-loki/)
[info] Compiling 4 Scala sources to /Users/schmitch/projects/envisia/envisia-erp-loki/modules/utils/target/scala-2.11/classes...
while utils is an sbt subproject. Any ideas why it takes so long? And why does that only happen sometimes?
Here is my sbt file:
name := "envisia-erp-loki"
version := "1.0.1-SNAPSHOT"

lazy val utils = project in file("modules/utils")

lazy val migrator = (project in file("modules/migrator"))
  .enablePlugins(PlayScala).disablePlugins(PlayLayoutPlugin)
  .dependsOn(utils).aggregate(utils)

lazy val codegen = (project in file("modules/codegen"))
  .dependsOn(utils).aggregate(utils)

lazy val auth = (project in file("modules/auth"))
  .enablePlugins(PlayScala).disablePlugins(PlayLayoutPlugin)
  .dependsOn(utils).aggregate(utils)

lazy val pdfgen = project in file("modules/pdfgen")

lazy val root = (project in file("."))
  .enablePlugins(PlayScala, DebianPlugin)
  .dependsOn(utils).aggregate(utils)
  .dependsOn(auth).aggregate(auth)
  .dependsOn(pdfgen).aggregate(pdfgen)

scalaVersion in ThisBuild := "2.11.6"

libraryDependencies ++= Seq(
  filters,
  cache,
  ws,
  specs2 % Test,
  "com.typesafe.play" %% "play-slick" % "1.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "1.0.0",
  "com.typesafe.play" %% "play-mailer" % "3.0.1",
  "org.elasticsearch" % "elasticsearch" % "1.5.2",
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
  "com.chuusai" %% "shapeless" % "2.2.0",
  "com.nulab-inc" %% "play2-oauth2-provider" % "0.15.0",
  "io.github.nremond" %% "pbkdf2-scala" % "0.4",
  "com.google.code.findbugs" % "jsr305" % "3.0.0"
)

resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers ++= Seq(
  Resolver.sonatypeRepo("releases"),
  Resolver.sonatypeRepo("snapshots")
)

// Faster compilations:
sources in (Compile, doc) := Seq.empty
publishArtifact in (Compile, packageDoc) := false

// "com.googlecode.java-diff-utils" % "diffutils" % "1.3.0"
//pipelineStages := Seq(rjs)

// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator

JsEngineKeys.engineType := JsEngineKeys.EngineType.Node

// Do not add API Doc
sources in (Compile, doc) := Seq.empty
publishArtifact in (Compile, packageDoc) := false

// RpmPlugin
maintainer in Linux := "Christian Schmitt <c.schmitt#envisia.de>"
packageSummary in Linux := "Envisia ERP Server 3.0"
packageDescription := "This is the new Envisia ERP Server it will be as standalone as possible"

import com.typesafe.sbt.packager.archetypes.ServerLoader.Systemd
serverLoading in Debian := Systemd

scalacOptions in ThisBuild ++= Seq(
  "-target:jvm-1.8",
  "-encoding", "UTF-8",
  "-deprecation",        // warning and location for usages of deprecated APIs
  "-feature",            // warning and location for usages of features that should be imported explicitly
  "-unchecked",          // additional warnings where generated code depends on assumptions
  "-Xlint",              // recommended additional warnings
  "-Ywarn-adapted-args", // warn if an argument list is modified to match the receiver
  "-Ywarn-value-discard", // warn when non-Unit expression results are unused
  "-Ywarn-inaccessible",
  "-Ywarn-dead-code"
)

updateOptions in ThisBuild := updateOptions.value.withCachedResolution(true)