I am using sbt 1.8.0 for my Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project along with its child project modules. So far this is what I have in my build.sbt:
lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
    )
  )
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
Now I am able to import any class from the spark-streaming or spark-hive library dependencies in the parent module, but I am not able to import and use them in any of the child modules. I can only use them if I explicitly specify them as a library dependency in each child module.
I am looking for something similar to the dependencies tag in pom.xml with a Maven build.
Will it make a difference if I use a separate build.sbt for each of the child modules?
Also, if I do .aggregate(etl) in the parent config, it shows an error because etl is declared later. But if I define etl before parent, I am not able to do .dependsOn(parent) in the etl config.
Please help me with a solution to fix these.
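For reference, sbt has no direct analogue of inheriting a parent POM's dependency list, but the usual idiom is to factor the shared settings and dependencies into a plain Seq and add it to every module. A minimal sketch, assuming an illustrative sparkVersion:

val sparkVersion = "2.2.0" // illustrative; use your actual Spark version

// shared by every module; no dependsOn is needed just to share dependencies
lazy val commonSettings = Seq(
  version := "1.0",
  scalaVersion := "2.11.1",
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
  )
)

lazy val etl = (project in file("etl"))
  .settings(commonSettings, name := "spark-etl-etl_1.0")

Also note that all definitions in a single build.sbt are compiled together, so a lazy val such as etl can be referenced in aggregate(etl) before its definition as far as sbt itself is concerned.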
My multi-module project uses the parent project only for building everything and delegating run to the 'server' project:
lazy val petstoreRoot = project.in(file("."))
  .aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {},
    publishLocal := {},
    publishArtifact := false,
    isSnapshot := true,
    run := {
      (run in server in Compile).evaluated
    }
  )
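As a side note, in sbt 1.x the same delegation is usually written with the slash syntax; an equivalent sketch:

run := (server / Compile / run).evaluated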
I grouped the settings (e.g. dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
"org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
...
, "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds whatever is needed, e.g.:
lazy val server = (project in file("server"))
.settings(scalaJSProjects := Seq(client))
.settings(sharedSettings(Some("server"))) // shared dependencies used by all
.settings(serverSettings)
.settings(serverDependencies)
.settings(jvmSettings)
.enablePlugins(PlayScala, BuildInfoPlugin)
.dependsOn(sharedJvm)
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
Using provided->provided in the dependsOn helped me solve a similar problem.
So something like:
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent % "compile->compile;test->test;provided->provided")
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
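Each a->b pair maps a configuration of the depending project to a configuration of the dependency. A plain dependsOn(parent) only gives compile->compile, which is why the parent's provided dependencies stay invisible to the child; adding provided->provided exposes them on the child's provided classpath as well.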
I wrote a macro that works after compilation, but the problem is that IntelliJ IDEA doesn't see my generated code, and red lines appear. I found an explanation here saying that I need to write an IDEA plugin that will allow IDEA to recognize my generated code. The problem is that I cannot use SyntheticMembersInjector because of a missing dependency. Is it possible to write an IDEA plugin for my own Scala macros?
My plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.0.2")
addSbtPlugin("org.jetbrains" % "sbt-idea-plugin" % "3.8.4")
My build.sbt:
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport.dockerExposedPorts
import sbt.Keys.{scalacOptions, version}
lazy val coreProject = (project in file("."))
  .enablePlugins(JavaAppPackaging, DockerPlugin, AkkaGrpcPlugin)
  .settings(
    scalaVersion := "2.12.12",
    name := "CDMS",
    version := "0.1",
    libraryDependencies ++= BuildConfig.projectDependencies,
    dockerBaseImage := "adoptopenjdk/openjdk15:alpine",
    dockerExposedPorts += 9002
  )
  .dependsOn(validationProject)

lazy val validationProject = (project in file("validation"))
  .enablePlugins(SbtPlugin)
  .settings(
    scalaVersion := "2.12.12",
    sbtPlugin := true,
    libraryDependencies ++= BuildConfig.monocleDependencies
  )
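For what it's worth, SyntheticMembersInjector comes from the IntelliJ Scala plugin itself, so the missing dependency is the Scala plugin. With sbt-idea-plugin it is declared roughly like this (a sketch only, with an illustrative module name and IDEA build number; double-check the key names against your sbt-idea-plugin version):

lazy val ideaSupport = (project in file("idea-support"))
  .enablePlugins(SbtIdeaPlugin)
  .settings(
    ThisBuild / intellijBuild := "212.5712.43", // illustrative IDEA build number
    intellijPlugins += "org.intellij.scala".toPlugin // puts SyntheticMembersInjector on the classpath
  )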
I am currently developing a compiler plugin for a domain-specific language (DSL) that I wrote in Scala. As a result, I have one "examples" directory (an sbt subproject) in my repo that contains a sample program written in the DSL. I can compile this subproject fine, and it works as expected.
My plugin analyzes this sample program and produces an output file (generated by simply writing out to a file). I have created a new subproject for the same "examples" directory, with special command-line options that instruct the compiler to use the plugin. Everything runs fine, but despite numerous searches, I haven't been able to figure out where the output file is generated. I suspect that it is being discarded.
Below is my build.sbt:
lazy val commonSettings = Seq(
organization := "com.bitbucket.bitstream-dsl",
scalaVersion := "2.12.6"
)
lazy val root = (project in file("."))
.settings(
commonSettings,
version := "0.1.0-SNAPSHOT",
name := "example-project"
)
lazy val examples = (project in file("examples"))
.settings(
commonSettings,
name := "examples"
)
.dependsOn(root)
lazy val examplesPlugin = (project in file("examples"))
.settings(
commonSettings,
scalacOptions += "-Xplugin:plugin/target/scala-2.12/plugin_2.12-0.1-SNAPSHOT.jar",
name := "examples_plugin",
target := baseDirectory.value / "target-plugin",
publishArtifact in Compile := true
)
.dependsOn(root)
lazy val plugin = (project in file("plugin"))
.settings(
commonSettings,
scalacOptions += "-J-Xss256m",
name := "plugin",
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
)
.dependsOn(root)
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
Here is how I am writing out to a file:
import java.io.{BufferedWriter, File, FileWriter}

def printFile(funcBody: String): Unit = {
  // open a file to be written (the path is resolved against the current working directory)
  val file = new File("test.v")
  val bw = new BufferedWriter(new FileWriter(file))
  bw.write(funcBody + "\n")
  // close the output file
  bw.close()
}
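Note that new File("test.v") is resolved against the current working directory of the JVM that runs the compiler, which is why the output can be hard to locate. A minimal sketch that makes the location explicit (outDir is an illustrative parameter, defaulting to the working directory):

import java.io.{BufferedWriter, File, FileWriter}

def printFile(funcBody: String, outDir: String = sys.props("user.dir")): Unit = {
  val file = new File(outDir, "test.v")
  // log the absolute path so the generated file is easy to find
  println(s"writing DSL output to ${file.getAbsolutePath}")
  val bw = new BufferedWriter(new FileWriter(file))
  try bw.write(funcBody + "\n")
  finally bw.close()
}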
I want to use a library that I cloned from GitHub to my machine and then modified.
I would like to test my code.
What do I need to set in my build.sbt file
name := "Actoverse Demo"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.4.20"
)
lazy val root = project.in(file(".")).dependsOn(actoversePlugin)
lazy val actoversePlugin = RootProject(file("/Users/USERNAME/Desktop/Bo/Actoverse-Scala/src/main/scala/actoverse"))
so that it uses my local version of the library instead?
You can replace this:
RootProject(uri("https://github.com/45deg/Actoverse-Scala.git"))
with this:
RootProject(file("whateverPath"))
More info here
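Put together, a minimal sketch of the build.sbt using the local clone; note that RootProject should point at the root of the clone (the directory containing its own build.sbt), not at a source subdirectory:

name := "Actoverse Demo"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.4.20"

// the path is illustrative; point it at the root of the local clone
lazy val actoversePlugin = RootProject(file("/Users/USERNAME/Desktop/Bo/Actoverse-Scala"))

lazy val root = project.in(file(".")).dependsOn(actoversePlugin)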
I need to install some dependencies in my very simple Scala project (I am working through a tutorial), and one of them is from GitHub. My build.sbt looks like this:
import sbt._

lazy val root = Project("root", file("."))
  .dependsOn(smile)
  .settings(
    name := "Xyclade ML practical examples",
    version := "1.0",
    scalaVersion := "2.10.6",
    sbtVersion := "0.13.9",
    libraryDependencies += "org.scala-lang" % "scala-swing" % "2.10.2"
  )

lazy val smile = ProjectRef(uri("https://github.com/haifengl/smile.git#master"), "root")
Maybe I am missing some basic Scala/sbt knowledge (I am a complete noob), but:
1) import com.github.haifengl._ fails with: object github is not a member of package com
2) import smile._ leads to the error: not found: object smile
As far as I can tell, the library package should be called something like com.github.haifengl: https://github.com/haifengl/smile/search?utf8=%E2%9C%93&q=com.github.haifengl&type=Code
Are you sure the package com.github.haifengl is in the GitHub project you mentioned? Could it be in one of its dependencies?
You should not add a ProjectRef to the GitHub project; instead, add it as a library dependency:
"com.github.haifengl" % "smile-core" % "1.0.4"
Like the following:
import sbt._

lazy val root = Project("root", file("."))
  .settings(
    name := "Xyclade ML practical examples",
    version := "1.0",
    scalaVersion := "2.10.6",
    sbtVersion := "0.13.9",
    libraryDependencies ++= Seq(
      "org.scala-lang" % "scala-swing" % "2.10.2",
      "com.github.haifengl" % "smile-core" % "1.0.4"
    )
  )
I am new to Scala. I am using sbt-assembly to create a fat jar. My program reads input files, which I keep under the src/main/resources folder, but I am getting java.io.FileNotFoundException.
I don't know how to specify the path. I will be deploying the jar on the server.
Here is my sbt build file:
lazy val commonSettings = Seq(
organization := "com.insnapinc",
version := "0.1.0",
scalaVersion := "2.11.4"
)
lazy val root = (project in file(".")).
settings(commonSettings: _*).
settings(
name := "memcache-client"
)
libraryDependencies ++= Seq (
"org.scalaj" %% "scalaj-http" % "1.1.4"
,"org.json4s" %% "json4s-native" % "3.2.10"
,"org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)
/* assembly plugin */
mainClass in AssemblyKeys.assembly := Some("com.insnap.memcache.MemcacheTest")
assemblySettings
test in AssemblyKeys.assembly := {}
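For reference, files under src/main/resources are bundled inside the assembled jar, so after deployment they cannot be opened as java.io.File paths; they have to be read from the classpath instead. A minimal sketch, assuming an illustrative resource named input.txt:

import scala.io.Source

// the resource is resolved on the classpath, so this works both in
// development and from inside the fat jar
val stream = getClass.getResourceAsStream("/input.txt")
val contents = Source.fromInputStream(stream).mkString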