I need to install some dependencies in my simple Scala project (I am working through a tutorial), and one of them comes from GitHub. My build.sbt looks like this:
import sbt._

lazy val root = Project("root", file("."))
  .dependsOn(smile)
  .settings(
    name := "Xyclade ML practical examples",
    version := "1.0",
    scalaVersion := "2.10.6",
    sbtVersion := "0.13.9",
    libraryDependencies += "org.scala-lang" % "scala-swing" % "2.10.2"
  )

lazy val smile = ProjectRef(uri("https://github.com/haifengl/smile.git#master"), "root")
Maybe I am missing some basic Scala/sbt knowledge (I am a complete noob), but:
1) import com.github.haifengl._ fails with object github is not a member of package com
2) import smile._ leads to error not found: object smile
And as far as I can tell, the library's package should be called something like com.github.haifengl: https://github.com/haifengl/smile/search?utf8=%E2%9C%93&q=com.github.haifengl&type=Code
Are you sure the package com.github.haifengl is in the GitHub project you mentioned? Could it be in one of its dependencies?
You should not add a ProjectRef to the GitHub project; instead, add the published artifact to your dependencies:
"com.github.haifengl" % "smile-core" % "1.0.4"
Like the following:
import sbt._

lazy val root = Project("root", file("."))
  .settings(
    name := "Xyclade ML practical examples",
    version := "1.0",
    scalaVersion := "2.10.6",
    sbtVersion := "0.13.9",
    libraryDependencies ++= Seq(
      "org.scala-lang" % "scala-swing" % "2.10.2",
      "com.github.haifengl" % "smile-core" % "1.0.4"
    )
  )
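After reloading the build, the classes come in under the smile.* packages rather than com.github.haifengl (that string is the Maven groupId, not a package name). A minimal sketch, assuming smile 1.x's static KNN.learn(x, y, k) factory, with made-up toy data:

import smile.classification.KNN

object SmileSmokeTest extends App {
  // Toy data, purely illustrative
  val x = Array(Array(0.0, 0.0), Array(1.0, 1.0), Array(0.1, 0.2), Array(0.9, 0.8))
  val y = Array(0, 1, 0, 1)
  val knn = KNN.learn(x, y, 3)
  println(knn.predict(Array(0.2, 0.1))) // expect class 0
}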
Related
I am running Scala on a script file, and I installed ScalaTest via a local build.sbt:
name := """codewars"""
organization := "com.codewars"
version := "1.0"
// lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.13.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "5.0.0" % Test
But when I try to run the script via
scala ./mumbling.scala
I get this error:
./mumbling.scala:1: error: object scalatest is not a member of package org
import org.scalatest.FunSpec
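The likely cause is that the plain scala runner knows nothing about the dependencies sbt manages, so the ScalaTest jar is simply not on the classpath. A sketch of the same spec run through sbt instead, assuming it is moved to src/test/scala (note that in ScalaTest 3.1+, which scalatestplus-play 5.0.0 pulls in, FunSpec lives at org.scalatest.funspec.AnyFunSpec):

// src/test/scala/MumblingSpec.scala (hypothetical file name and assertion)
import org.scalatest.funspec.AnyFunSpec

class MumblingSpec extends AnyFunSpec {
  describe("mumbling") {
    it("is picked up by sbt test") {
      assert(1 + 1 == 2) // placeholder assertion; the real kata logic goes here
    }
  }
}

Then run it with sbt test rather than scala ./mumbling.scala.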
I wrote a macro that works after compilation, but the problem is that IntelliJ IDEA doesn't see my generated code, so red lines appear. I found an explanation here that I need to write an IDEA plugin that will let IDEA recognize my generated code. The problem is that I cannot use SyntheticMembersInjector because of a missing dependency. Is it possible to write an IDEA plugin for my own Scala macros?
My plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.0.2")
addSbtPlugin("org.jetbrains" % "sbt-idea-plugin" % "3.8.4")
My build.sbt:
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport.dockerExposedPorts
import sbt.Keys.{scalacOptions, version}

lazy val coreProject = (project in file("."))
  .enablePlugins(JavaAppPackaging, DockerPlugin, AkkaGrpcPlugin)
  .settings(
    scalaVersion := "2.12.12",
    name := "CDMS",
    version := "0.1",
    libraryDependencies ++= BuildConfig.projectDependencies,
    dockerBaseImage := "adoptopenjdk/openjdk15:alpine",
    dockerExposedPorts += 9002
  )
  .dependsOn(validationProject)

lazy val validationProject = (project in file("validation"))
  .enablePlugins(SbtPlugin)
  .settings(
    scalaVersion := "2.12.12",
    sbtPlugin := true,
    libraryDependencies ++= BuildConfig.monocleDependencies
  )
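For what it's worth, with sbt-idea-plugin the IntelliJ Scala plugin (which is where SyntheticMembersInjector lives) can be declared as a plugin dependency of the module. A rough sketch, assuming sbt-idea-plugin 3.x and its intellijPlugins/toPlugin API; the project name, plugin name, and build number here are illustrative:

lazy val ideaSupport = (project in file("idea-support"))
  .enablePlugins(SbtIdeaPlugin)
  .settings(
    scalaVersion := "2.12.12",
    ThisBuild / intellijPluginName := "cdms-macro-support", // hypothetical plugin name
    ThisBuild / intellijBuild := "203.5981.155",            // the IDEA build you target; illustrative
    intellijPlugins += "org.intellij.scala".toPlugin        // brings SyntheticMembersInjector onto the classpath
  )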
I am using sbt 1.8.0 for my Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project and also its child project modules. So far this is what I have in my build.sbt:
lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
    )
  )

lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
Now I am able to import any class from the spark-streaming or spark-hive library dependencies in the parent module, but not in any of the child modules. Only if I explicitly specify them as library dependencies in a child module am I able to use them there.
I am looking for something similar to the dependencies tag in pom.xml with a Maven build.
Will it make a difference if I use a separate build.sbt for each of the child modules?
Also, if I do .aggregate(etl) in the parent config, it shows an error because etl is declared later. But if I define etl before parent, I am not able to do .dependsOn(parent) in the etl config.
Please help me find a solution to fix these.
My multi-module project uses the parent project only for building everything and delegates run to the 'server' project:
lazy val petstoreRoot = project.in(file("."))
  .aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {}
    , publishLocal := {}
    , publishArtifact := false
    , isSnapshot := true
    , run := {
        (run in server in Compile).evaluated
      }
  )
I grouped the settings (e.g. dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  "org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
  ...
  , "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds whatever is needed, e.g.:
lazy val server = (project in file("server"))
  .settings(scalaJSProjects := Seq(client))
  .settings(sharedSettings(Some("server"))) // shared dependencies used by all
  .settings(serverSettings)
  .settings(serverDependencies)
  .settings(jvmSettings)
  .enablePlugins(PlayScala, BuildInfoPlugin)
  .dependsOn(sharedJvm)
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
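Applied to the Spark question above, the same pattern would put the shared, provided-scoped dependencies in one Seq and mix it into every module, which is the closest sbt analogue to Maven's parent-pom dependencies block. A sketch (sparkVersion is assumed to be defined as in the question):

lazy val commonSettings = Seq(
  scalaVersion := "2.11.1",
  version := "1.0",
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
  )
)

lazy val etl = (project in file("etl"))
  .dependsOn(parent)
  .settings(commonSettings, name := "spark-etl-etl_1.0")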
Using provided->provided in the dependsOn helped me solve a similar problem: a plain dependsOn only maps the compile configuration, so the parent's "provided"-scoped dependencies are not propagated to the child. So something like:
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent % "compile->compile;test->test;provided->provided")
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
I am trying to compile the following code with sbt as part of a subproject.
package bitstream.compiler
package eval

import scala.reflect.runtime.universe._
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

// Based on code from:
// https://gist.github.com/xuwei-k/9ba39fe22f120cb098f4
object Eval {
  def apply[A](tree: Tree): A = {
    val toolbox = currentMirror.mkToolBox()
    toolbox.eval(tree).asInstanceOf[A]
  }
}
Here is my build.sbt:
lazy val commonSettings = Seq(
  organization := "com.bitbucket.example-project",
  scalaVersion := "2.12.6"
)

lazy val root = (project in file("."))
  .settings(
    commonSettings,
    version := "0.1.0-SNAPSHOT",
    name := "example-project"
  )

lazy val plugin = (project in file("plugin"))
  .settings(
    commonSettings,
    scalacOptions += "-J-Xss256m",
    name := "plugin",
    libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
  )
  .dependsOn(root)

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
I try to compile the plugin subproject using plugin/package, and I get the error object tools is not a member of package scala. As far as I know, scala.tools should be provided by the scala-compiler dependency. Is there something I am missing?
scala.tools.reflect.ToolBox is in scala-compiler.jar. Try libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value. sbt does not assume that you will use classes from scala-compiler.jar directly; this is documented at https://www.scala-sbt.org/1.0/docs/Configuring-Scala.html
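Note that in the build above, the two bare libraryDependencies lines at the bottom of build.sbt apply to the root project, so if Eval lives there, root needs the scala-compiler dependency too. Once the jar is on the right project's classpath, a quick check of Eval could look like this (a sketch, using the quasiquote interpolator from scala-reflect):

import scala.reflect.runtime.universe._
import bitstream.compiler.eval.Eval

object EvalDemo extends App {
  // q"1 + 2" builds a Tree; Eval compiles and runs it via the toolbox
  val result = Eval[Int](q"1 + 2")
  println(result) // prints 3
}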
I would like to import Maven libraries, either with Maven's XML file or sbt's Scala file. I guess there are already similar questions out there, but I couldn't quite find any. Thank you!
You can treat remote Maven repositories normally, unless you also want to use your local .m2/repository. See below for an example Build.scala using both:
object myBuild extends Build {

  lazy val mainProject = Project(
    id = "root",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      name := "Root project",
      scalaVersion := "2.11.4",
      version := "0.1",
      resolvers ++= Seq(remoteMavenRepo, localMavenRepo),
      libraryDependencies ++= List(
        mavenLibrary1, mavenLibrary2
      )
    )
  )

  val remoteMavenRepo = "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
  val localMavenRepo = "Local Maven" at Path.userHome.asFile.toURI.toURL + ".m2/repository"

  // if a library follows the Scala version suffix convention, we use %%
  val mavenLibrary1 = "com.typesafe.slick" %% "slick" % "2.0.2"
  // if it's a Java library with no Scala version suffix, we use %
  val mavenLibrary2 = "joda-time" % "joda-time" % "2.4"
}
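For comparison, the Build.scala trait style above is the older sbt 0.13 idiom; in a plain build.sbt the same resolvers and dependencies look roughly like this:

resolvers ++= Seq(
  "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/",
  "Local Maven" at Path.userHome.asFile.toURI.toURL + ".m2/repository"
)

libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "2.0.2", // %% appends the Scala version suffix
  "joda-time" % "joda-time" % "2.4"          // plain % for Java libraries
)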