My objective is to write an SBT plugin which can be used by both the 0.13.x and 1.x versions of SBT. Based on this thread and this documentation, I wrote the following build.sbt for my plugin project:
lazy val foo = (project in file(".")).settings(
  name := "foo",
  sbtPlugin := true,
  organization := "com.bar",
  version := "1.0.0",
  scalaVersion := "2.12.4",
  sbtVersion in Global := "1.0.0",
  crossSbtVersions := Seq("0.13.17", "1.0.0"),
  libraryDependencies ++= Seq(
    "com.typesafe" % "config" % "1.3.3"
  ),
  scalaCompilerBridgeSource := {
    val sv = appConfiguration.value.provider.id.version
    ("org.scala-sbt" % "compiler-interface" % sv % "component").sources
  }
)
When I run sbt +publishLocal I see:
[info] Packaging /Users/user1/IdeaProjects/fulfillment-sbt/target/scala-2.12/sbt-0.13/foo-1.0.0-javadoc.jar ...
[info] Done packaging.
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/poms/foo.pom
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/jars/foo.jar
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/srcs/foo-sources.jar
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/docs/foo-javadoc.jar
[info] published ivy to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/ivys/ivy.xml
[success] Total time: 9 s, completed Apr 4, 2018 11:12:38 AM
But it didn't publish for the 1.x version of SBT. What can I do so that it publishes for both versions of SBT?
I went to the Gitter channel of SBT and had a conversation there with the creators of SBT. Based on that conversation I created a working example. I am listing it here so that it helps someone cross-publish sbt plugins in the future.
project/build.properties
sbt.version=0.13.17
build.sbt
lazy val p = (project in file(".")).settings(
  name := "sbt-crosspublish",
  sbtPlugin := true,
  organization := "com.abhi",
  version := "1.0.0",
  crossScalaVersions := Seq("2.10.6", "2.12.0"),
  crossSbtVersions := Seq("0.13.17", "1.0.0"),
  libraryDependencies ++= Seq(
    "com.typesafe" % "config" % "1.3.3"
  ),
  scalaCompilerBridgeSource := {
    val sv = appConfiguration.value.provider.id.version
    ("org.scala-sbt" % "compiler-interface" % sv % "component").sources
  }
)
And finally, in order to cross-publish SBT plugins, one has to run:
sbt ^publishLocal
Wow, I didn't know about the ^. sbt +publishLocal cross-publishes normal binaries against crossScalaVersions, not plugins; for cross-publishing sbt plugins, we must run sbt ^publishLocal.
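For reference, the ^ prefix runs the task for every version listed in crossSbtVersions; if you only want one of them, the ^^ command switches the target sbt version for the session. A minimal sketch, using the version numbers configured above:

sbt "^^ 0.13.17" publishLocal
sbt "^^ 1.0.0" publishLocal

The first command publishes only the sbt 0.13 artifact, the second only the sbt 1.x artifact.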
One thing to note is that the scalaCompilerBridgeSource setting is only needed if the plugin project itself builds with SBT 0.13.17. If you upgrade the plugin project to SBT 1.1.x, the build is simplified.
project/build.properties
sbt.version=1.1.2
build.sbt
lazy val p = (project in file(".")).settings(
  name := "sbt-crosspublish",
  sbtPlugin := true,
  organization := "com.abhi",
  version := "1.0.0",
  crossScalaVersions := Seq("2.10.6", "2.12.0"),
  crossSbtVersions := Seq("0.13.17", "1.0.0"),
  libraryDependencies ++= Seq(
    "com.typesafe" % "config" % "1.3.3"
  )
)
Related
I have a project which converts proto files into Scala case classes.
My build.sbt is -
ThisBuild / version := "0.1.0-SNAPSHOT"

lazy val root = (project in file("."))
  .settings(
    name := "proto-path-error"
  )

val openRtbCoreVersion = "1.5.5"
val googleCommonProtosVersion = "1.12.0"

val commonSettings: Seq[Def.Setting[_]] = Seq[Def.Setting[_]](
  scalaVersion := "2.12.14",
  organization := "com.td",
)

def scalaGrpcSettings: Seq[Def.Setting[_]] = Seq[Def.Setting[_]](
  libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapb.compiler.Version.scalapbVersion,
  libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf",
  libraryDependencies += "com.google.api.grpc" % "proto-google-common-protos" % googleCommonProtosVersion % "protobuf",
  libraryDependencies += "com.google.openrtb" % "openrtb-core" % openRtbCoreVersion,
  libraryDependencies += "com.google.openrtb" % "openrtb-core" % openRtbCoreVersion % "protobuf",
  PB.targets in Compile := Seq(
    PB.gens.java -> (sourceManaged in Compile).value,
    scalapb.gen(javaConversions = true) -> (sourceManaged in Compile).value
  ),
  PB.includePaths in Compile := Seq(
    target.value / "protobuf_external",
    new File("definitions/common"),
  ),
  PB.protoSources in Compile := Seq(
    PB.externalIncludePath.value / "google" / "api",
    new File("definitions/common"),
    target.value / "protobuf_external"
  ),
  excludeFilter in PB.generate := new SimpleFileFilter(file => file.getCanonicalPath.contains("google/protobuf")),
  PB.additionalDependencies := Nil
)
project/plugins.sbt file is -
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.6")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.10.11"
project/build.properties is -
sbt.version = 1.3.13
The src/main/protobuf/definitions/common/common.proto -
syntax = "proto2";
option java_outer_classname = "CommonUtils";
package com.td.protobuf;
message CMap {
map<int32, Assets> cs = 1;
}
message Assets {
repeated int32 assets = 1;
}
The whole project is on GitHub here.
When I compile the project it is not producing any case classes. I'm getting the below output -
$ sbt clean compile
[info] welcome to sbt 1.3.13 (Azul Systems, Inc. Java 11.0.15)
[info] loading settings for project global-plugins from build.sbt ...
[info] loading global plugins from ~/.sbt/1.0/plugins
[info] loading settings for project proto-path-error-build from plugins.sbt ...
[info] loading project definition from ~/Documents/Coding/other/proto-path-error/project
[info] loading settings for project root from build.sbt ...
[info] set current project to proto-path-error (in build file:~/Documents/Coding/other/proto-path-error/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 0 s, completed Jul 5, 2022, 9:17:03 PM
[info] Protobufs files found, but PB.targets is empty.
[success] Total time: 0 s, completed Jul 5, 2022, 9:17:04 PM
I have 2 issues:
1. PB.protoSources in Compile := new File("definitions/common") in build.sbt is not getting recognized, so I moved from proto-path-error/definitions to proto-path-error/src/main/protobuf/definitions.
2. After moving the definitions they are recognized, but the Scala case classes are not getting generated. I get the log [info] Protobufs files found, but PB.targets is empty.
How can I fix these 2 issues?
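A likely culprit, judging from the build.sbt above: commonSettings and scalaGrpcSettings are defined but never applied to the root project, so the PB.targets in Compile setting never reaches it, which matches the "PB.targets is empty" message. A minimal sketch of wiring them in (same vals as above):

lazy val root = (project in file("."))
  .settings(
    name := "proto-path-error",
    commonSettings,
    scalaGrpcSettings
  )

Moving the two settings vals above the project definition also keeps the evaluation order obvious.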
I'm trying to build a Docker image using sbt-native-packager with the following build.sbt (trying to publish the image to a local repository)
val sparkVersion = "2.4.5"
scalaVersion in ThisBuild := "2.12.0"
val sparkLibs = Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion
)
// JAR build settings
lazy val commonSettings = Seq(
organization := "dzlab",
version := "0.1",
scalaSource in Compile := baseDirectory.value / "src",
scalaSource in Test := baseDirectory.value / "test",
resourceDirectory in Test := baseDirectory.value / "test" / "resources",
javacOptions ++= Seq(),
scalacOptions ++= Seq(
"-deprecation",
"-feature",
"-language:implicitConversions",
"-language:postfixOps"
),
libraryDependencies ++= sparkLibs
)
// Docker Image build settings
dockerBaseImage := "gcr.io/spark-operator/spark:v" + sparkVersion
lazy val root = (project in file("."))
.enablePlugins(
DockerPlugin,
JavaAppPackaging
)
.settings(
name := "spark-k8s",
commonSettings,
dockerAliases ++= Seq(
dockerAlias.value.withRegistryHost(Some("localhost:5000"))
),
mainClass in (Compile, run) := Some("dzlab.SparkJob")
)
SBT and the packager versions
$ cat project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
$ cat project/build.properties
sbt.version=0.13.18
When I try to run the packager
$ sbt docker:publish
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
/Users/dzlab/Projects/spark-k8s/build.sbt:39: error: not found: value dockerAliases
dockerAliases ++= Seq(
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
It does not recognize dockerAliases, and I'm not sure why, as it is part of the publishing settings.
What is the proper way to set the Docker registry?
Your sbt-native-packager version is hopelessly outdated, as is your sbt version. That SettingKey doesn't exist in that version.
Compare: sbt-native-packager 1.0 vs. sbt-native-packager 1.7.4
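As a rough sketch of the corresponding fix (the exact versions below are only examples, but dockerAliases does exist in the 1.7.x line mentioned above):

project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.4")

project/build.properties
sbt.version=1.3.13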
I am using SBT 1.8.0 for my Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project and also its child project modules. So far this is what I have in my build.sbt:
lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided")
  )

lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
Now I am able to import any class from the spark-streaming or spark-hive library dependencies in the parent module, but I am not able to import and use them in any of the child modules. Only if I explicitly specify them as a library dependency in a child module am I able to use them.
I am looking for something similar to the dependencies tag in pom.xml with a Maven build.
Will it make a difference if I use a separate build.sbt for each of the child modules?
Also, if I do .aggregate(etl) in the parent config, it shows an error because etl is declared later. But if I define etl before parent, I am not able to do .dependsOn(parent) in the etl config.
Please help me with a solution to fix these.
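One common, Maven-parent-POM-like approach is to put the shared library dependencies into a settings Seq and apply it to every module, since a plain dependsOn(parent) does not make the parent's provided-scoped Spark dependencies visible to the children (that is what the provided->provided answer further below addresses). A sketch only; the val names here are illustrative and sparkVersion stands in for whatever you already define:

// Hypothetical shared settings applied to every module
lazy val sparkVersion = "2.4.0" // placeholder; use your actual Spark version
lazy val sharedDeps = Seq(
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
  )
)

lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(sharedDeps)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )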
My multi-module project uses the parent project only for building everything and delegates run to the 'server' project:
lazy val petstoreRoot = project.in(file(".")).
  aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {}
    , publishLocal := {}
    , publishArtifact := false
    , isSnapshot := true
    , run := {
      (run in server in Compile).evaluated
    }
  )
I grouped the settings (e.g. dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  "org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
  ...
  , "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds whatever is needed, e.g.:
lazy val server = (project in file("server"))
  .settings(scalaJSProjects := Seq(client))
  .settings(sharedSettings(Some("server"))) // shared dependencies used by all
  .settings(serverSettings)
  .settings(serverDependencies)
  .settings(jvmSettings)
  .enablePlugins(PlayScala, BuildInfoPlugin)
  .dependsOn(sharedJvm)
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
Using provided->provided in the dependsOn helped me solve a similar problem.
So something like:
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent % "compile->compile;test->test;provided->provided")
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
I am new to Scala. I am using sbt-assembly to create a fat jar. My program reads input files, which I kept under the src/main/resources folder, but I am getting a java.io.FileNotFoundException.
I don't know how to specify the path. I will be deploying the jar on the server.
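For context, files under src/main/resources end up inside the assembled jar on the classpath, so once deployed they cannot be opened with a plain file path; they have to be read as classpath resources. A minimal sketch (input.txt is a hypothetical resource name):

import scala.io.Source

object ResourceExample {
  def main(args: Array[String]): Unit = {
    // "/input.txt" is resolved from the classpath, i.e. from inside the fat jar
    val stream = getClass.getResourceAsStream("/input.txt")
    val lines = Source.fromInputStream(stream).getLines().toList
    lines.foreach(println)
  }
}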
Here is my sbt build file
lazy val commonSettings = Seq(
  organization := "com.insnapinc",
  version := "0.1.0",
  scalaVersion := "2.11.4"
)

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "memcache-client"
  )

libraryDependencies ++= Seq(
  "org.scalaj" %% "scalaj-http" % "1.1.4",
  "org.json4s" %% "json4s-native" % "3.2.10",
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)

/* assembly plugin */
mainClass in AssemblyKeys.assembly := Some("com.insnap.memcache.MemcacheTest")

assemblySettings

test in AssemblyKeys.assembly := {}
Currently I'm running a fairly medium-sized project with the Play Framework and sbt/Scala. However, sometimes it takes a long time to compile: something like 4 classes taking around 120 seconds, while most of the time the whole project compiles in just a few seconds!
It mostly seems to take longer for a handful of files when I refactor things inside my database access and service layers, e.g. when creating private/public functions of Slick code without explicit return types.
Is there anything I can do about this?
It just looks like sbt is hanging while doing:
[info] Loading project definition from /Users/schmitch/projects/envisia/envisia-erp-loki/project
[info] Set current project to envisia-erp-loki (in build file:/Users/schmitch/projects/envisia/envisia-erp-loki/)
[info] Compiling 4 Scala sources to /Users/schmitch/projects/envisia/envisia-erp-loki/modules/utils/target/scala-2.11/classes...
utils is an sbt subproject. Any ideas why it takes so long, and why it only happens sometimes?
Here is my sbt file:
name := "envisia-erp-loki"
version := "1.0.1-SNAPSHOT"
lazy val utils = project in file("modules/utils")
lazy val migrator = (project in file("modules/migrator"))
  .enablePlugins(PlayScala).disablePlugins(PlayLayoutPlugin)
  .dependsOn(utils).aggregate(utils)

lazy val codegen = (project in file("modules/codegen"))
  .dependsOn(utils).aggregate(utils)

lazy val auth = (project in file("modules/auth"))
  .enablePlugins(PlayScala).disablePlugins(PlayLayoutPlugin)
  .dependsOn(utils).aggregate(utils)

lazy val pdfgen = project in file("modules/pdfgen")

lazy val root = (project in file("."))
  .enablePlugins(PlayScala, DebianPlugin)
  .dependsOn(utils).aggregate(utils)
  .dependsOn(auth).aggregate(auth)
  .dependsOn(pdfgen).aggregate(pdfgen)
scalaVersion in ThisBuild := "2.11.6"
libraryDependencies ++= Seq(
  filters,
  cache,
  ws,
  specs2 % Test,
  "com.typesafe.play" %% "play-slick" % "1.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "1.0.0",
  "com.typesafe.play" %% "play-mailer" % "3.0.1",
  "org.elasticsearch" % "elasticsearch" % "1.5.2",
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
  "com.chuusai" %% "shapeless" % "2.2.0",
  "com.nulab-inc" %% "play2-oauth2-provider" % "0.15.0",
  "io.github.nremond" %% "pbkdf2-scala" % "0.4",
  "com.google.code.findbugs" % "jsr305" % "3.0.0"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers ++= Seq(
  Resolver.sonatypeRepo("releases"),
  Resolver.sonatypeRepo("snapshots")
)
// Faster compilations:
sources in(Compile, doc) := Seq.empty
publishArtifact in(Compile, packageDoc) := false
// "com.googlecode.java-diff-utils" % "diffutils" % "1.3.0"
//pipelineStages := Seq(rjs)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
JsEngineKeys.engineType := JsEngineKeys.EngineType.Node
// Do not add API Doc
sources in(Compile, doc) := Seq.empty
publishArtifact in(Compile, packageDoc) := false
// RpmPlugin
maintainer in Linux := "Christian Schmitt <c.schmitt#envisia.de>"
packageSummary in Linux := "Envisia ERP Server 3.0"
packageDescription := "This is the new Envisia ERP Server it will be as standalone as possible"
import com.typesafe.sbt.packager.archetypes.ServerLoader.Systemd
serverLoading in Debian := Systemd
scalacOptions in ThisBuild ++= Seq(
  "-target:jvm-1.8",
  "-encoding", "UTF-8",
  "-deprecation", // warning and location for usages of deprecated APIs
  "-feature", // warning and location for usages of features that should be imported explicitly
  "-unchecked", // additional warnings where generated code depends on assumptions
  "-Xlint", // recommended additional warnings
  "-Ywarn-adapted-args", // Warn if an argument list is modified to match the receiver
  "-Ywarn-value-discard", // Warn when non-Unit expression results are unused
  "-Ywarn-inaccessible",
  "-Ywarn-dead-code"
)
updateOptions in ThisBuild := updateOptions.value.withCachedResolution(true)