I want to validate a JSON file downloaded from a server at build time and fail the build if there are any errors.
Is it possible to parse/validate JSON in build.sbt?
Your build.sbt is Scala code, so it can do anything other Scala code can do.
You can add dependencies (e.g. a JSON parsing library) for your build.sbt code in project/build.sbt, since sbt is recursive.
Here is an example to supplement Jasper-M's answer.
For example, add lihaoyi's requests-scala HTTP client library and upickle JSON deserialisation library to project/build.sbt:
libraryDependencies ++= List(
  "com.lihaoyi" %% "requests" % "0.6.0",
  "com.lihaoyi" %% "upickle" % "1.1.0"
)
Then under project/Preconditions.scala add the following object, which will contain the assertions you want to check before running the build:
object Preconditions {
  import scala.util.Try
  import requests._
  import upickle.default._

  case class User(login: String, id: Int)
  implicit val userRW: ReadWriter[User] = macroRW

  def validateUserJson(): Unit = {
    val result = Try(read[User](get("https://api.github.com/users/lihaoyi").text)).isSuccess
    assert(result, "User JSON should be valid")
  }
}
Now these facilities will be available to build.sbt under the root project. Let's create a task in build.sbt to run the assertions:
lazy val checkPreconditions = taskKey[Unit]("Validate pre-conditions before building")

checkPreconditions := {
  Preconditions.validateUserJson()
  println("All preconditions passed!")
}
Finally, let's make the compile task depend on the checkPreconditions task using dependsOn, like so:
Compile / compile := (Compile / compile).dependsOn(checkPreconditions).value
Now executing sbt compile should check pre-conditions before proceeding with compilation.
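Since the question mentions a JSON file that has already been downloaded, the same check also works against a local copy. A minimal sketch (the helper name and path argument are mine; ujson ships with upickle, and this only checks well-formedness, not a schema):
import scala.util.Try

def validateLocalJson(path: String): Unit = {
  // Read the downloaded file and fail the task if it is not parseable JSON.
  val text = new String(java.nio.file.Files.readAllBytes(java.nio.file.Paths.get(path)), "UTF-8")
  assert(Try(ujson.read(text)).isSuccess, s"$path should contain valid JSON")
}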
My Scala project has a libraryDependency on slf4j because I use its API for logging. I also want to see the logging output while running from sbt or IntelliJ, both for the apps I start with runMain and the unit tests I run with testOnly from sbt. Therefore there is also a libraryDependency on logback-classic. However, I do not want that second dependency published, because of the convention stated below: when someone uses my published library, the transitive dependency should not be brought in automatically. How should that be done? I don't want to explain to users how to manually exclude the transitive dependency, because they might be using any number of different tools. The logback-classic should continue to be included in an assembled jar, however, if at all possible. It doesn't seem like exclude() is the answer.
"Embedded components such as libraries or frameworks should not declare a dependency on any SLF4J binding/provider [like logback-classic] but only depend on slf4j-api. When a library declares a transitive dependency on a specific binding, that binding is imposed on the end-user negating the purpose of SLF4J. Note that declaring a non-transitive dependency on a binding, for example for testing, does not affect the end-user."
Publish the jar with slf4j-api but use the sbt Test configuration for logback. Unit tests will then have a concrete implementation but it won't be packaged in your artifact.
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api" % "1.7.36",
  "ch.qos.logback" % "logback-classic" % "1.2.11" % Test
)
This would be a project with sub-projects. Your sample app uses a concrete implementation, but not the library. Anyone using the library would provide their own.
lazy val root = (project in file("."))
  .settings(
    publish / skip := true
  )
  .aggregate(sampleApp, theLibrary)

lazy val sampleApp = project
  .settings(
    publish / skip := true,
    libraryDependencies ++= Seq(
      "ch.qos.logback" % "logback-classic" % "1.2.11"
    )
  )
  .dependsOn(theLibrary % "test->test;compile->compile")

lazy val theLibrary = project
  .settings(
    libraryDependencies ++= Seq(
      "org.slf4j" % "slf4j-api" % "1.7.36",
      "ch.qos.logback" % "logback-classic" % "1.2.11" % Test
    )
  )
My tentative solution is to add this code to an sbt file
import scala.xml.Node
import scala.xml.transform.RuleTransformer
import org.clulab.sbt.DependencyFilter
import org.clulab.sbt.DependencyId

ThisBuild / pomPostProcess := {
  val logback = DependencyId("ch.qos.logback", "logback-classic")
  val rule = DependencyFilter { dependencyId =>
    dependencyId != logback
  }

  (node: Node) => new RuleTransformer(rule).transform(node).head
}
and back it up with this Scala code in the project directory
package org.clulab.sbt

import scala.xml.Node
import scala.xml.NodeSeq
import scala.xml.transform.RewriteRule

case class DependencyId(groupId: String, artifactId: String)

abstract class DependencyTransformer extends RewriteRule {
  override def transform(node: Node): NodeSeq = {
    val name = node.nameToString(new StringBuilder()).toString()

    name match {
      case "dependency" =>
        val groupId = (node \ "groupId").text.trim
        val artifactId = (node \ "artifactId").text.trim

        transform(node, DependencyId(groupId, artifactId))
      case _ => node
    }
  }

  def transform(node: Node, dependencyId: DependencyId): NodeSeq
}

class DependencyFilter(filter: DependencyId => Boolean) extends DependencyTransformer {
  def transform(node: Node, dependencyId: DependencyId): NodeSeq =
    if (filter(dependencyId)) node
    else Nil
}

object DependencyFilter {
  def apply(filter: DependencyId => Boolean): DependencyFilter = new DependencyFilter(filter)
}
I'm still hoping to find a similar solution for editing ivy.xml.
I'm trying to use PureConfig and ConfigFactory for my Spark application configuration.
Here is my code:
import com.typesafe.config.Config
import pureconfig.loadConfigOrThrow

object Source {
  def apply(keyName: String, configArguments: Config): Source = {
    keyName.toLowerCase match {
      case "mysql" =>
        val properties = loadConfigOrThrow[DBConnectionProperties](configArguments)
        new MysqlSource(None, properties)
      case "files" =>
        val properties = loadConfigOrThrow[FilesSourceProperties](configArguments)
        new Files(properties)
      case _ => throw new NoSuchElementException(s"Unknown Source ${keyName.toLowerCase}")
    }
  }
}
import Source
val config = ConfigFactory.parseString(result.mkString("\n"))
val source = Source("mysql", config.getConfig("source.mysql"))
When I run it from the IDE (IntelliJ) or directly from Java (i.e. java -jar ...) it works fine.
But when I run it with spark-submit, it fails with the following error:
Exception in thread "main" java.lang.NoSuchMethodError: shapeless.Witness$.mkWitness(Ljava/lang/Object;)Lshapeless/Witness;
From a quick search I found an issue similar to this question, which suggests the reason is that both Spark and PureConfig depend on shapeless, but with different versions.
I tried to shade it as suggested in the answer:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("shapeless.**" -> "shadeshapless.@1")
    .inLibrary("com.github.pureconfig" %% "pureconfig" % "0.7.0").inProject
)
but that didn't work either.
Could it be caused by something else? Any idea what might work?
Thanks
You also have to shade shapeless inside its own JAR, in addition to pureconfig:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("shapeless.**" -> "shadeshapless.@1")
    .inLibrary("com.chuusai" % "shapeless_2.11" % "2.3.2")
    .inLibrary("com.github.pureconfig" %% "pureconfig" % "0.7.0")
    .inProject
)
Make sure to add the correct shapeless version.
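If you are unsure which shapeless version actually wins on your classpath, sbt's built-in evicted command lists version conflicts. Alternatively, a quick runtime probe (my sketch; the object name is hypothetical, and getCodeSource can be null under some class loaders):
object ShapelessCheck extends App {
  // Print the JAR that shapeless.Witness was loaded from;
  // its file name includes the shapeless version.
  val source = Option(classOf[shapeless.Witness].getProtectionDomain.getCodeSource)
  println(s"shapeless loaded from: ${source.map(_.getLocation)}")
}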
What is the simplest way to generate Verilog code from existing Chisel code? Would I have to create my own build file?
For example, from a standalone Scala file (AND.scala) like the following one:
import Chisel._

class AND extends Module {
  val io = IO(new Bundle {
    val a = Bool(INPUT)
    val b = Bool(INPUT)
    val out = Bool(OUTPUT)
  })

  io.out := io.a & io.b
}
I have the complete Chisel3 toolchain installed under Ubuntu 16.04.
See answer here: Is there a simple example of how to generate verilog from Chisel3 module?
In short, create a build.sbt file at the root of your project with the following in it:
scalaVersion := "2.12.13"

resolvers ++= Seq(
  Resolver.sonatypeRepo("snapshots"),
  Resolver.sonatypeRepo("releases")
)
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.4.4"
Add this code to AND.scala
object ANDDriver extends App {
  (new chisel3.stage.ChiselStage).emitVerilog(new AND, args)
}
Type sbt run on the command line at the root of your project.
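If you want the generated Verilog somewhere other than the project root, the same driver can pass FIRRTL's --target-dir option to ChiselStage explicitly (a sketch; the object name and directory are arbitrary):
object ANDDriverToDir extends App {
  // Write AND.v (and related build artifacts) into ./generated instead of the project root.
  (new chisel3.stage.ChiselStage).emitVerilog(new AND, Array("--target-dir", "generated"))
}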
I'm trying to add gulp to my Play application. I have created a PlayRunHook object that should let me trigger the gulp command, but when I do sbt run I get an error saying it could not find the object. Here is my hook:
package hooks

object Gulp extends CommandHook {
  override def beforeStarted(): Unit = {
    exec("gulp")
  }
}
And then in build.sbt:
lazy val root = (project in file(".")).enablePlugins(PlayScala)

scalaVersion := "2.11.7"

scalacOptions ++= Seq("-deprecation")

libraryDependencies ++= Seq(
  "junit" % "junit" % "4.10" % "test",
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.14",
  "com.typesafe.play" %% "play" % "2.5.0",
  "com.typesafe.play" %% "play-netty-server" % "2.5.0"
)

PlayKeys.playRunHooks += hooks.Gulp()
But I get:
build.sbt:18: error: not found: value hooks
PlayKeys.playRunHooks += hooks.Gulp()
You have to create the Gulp object in the project/ directory, like below:
import play.sbt.PlayRunHook
import sbt._

object Gulp {
  def apply(base: File): PlayRunHook = {
    object GulpProcess extends PlayRunHook {
      override def beforeStarted(): Unit = {
        Process("gulp", base).run
      }
    }

    GulpProcess
  }
}
Then in your build.sbt:
PlayKeys.playRunHooks += Gulp(baseDirectory.value)
For more details check this guide.
Edit: I actually went with the accepted answer in the end, this alternative answer will trigger your gulp bundle command every time a request is made to your app. By using the PlayRunHook approach, you get different lifecycle hooks to hook into, and in the end I hooked into beforeStarted (as done in the accepted answer) to trigger gulp watch when starting the app.
Down the line I may alter this to do gulp watch in development and gulp build when deploying, though I am not at that stage yet.
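For reference, a rough sketch of that split (the task name and wiring are mine; it assumes the PlayRunHook from the accepted answer already handles gulp watch in development):
lazy val gulpBuild = taskKey[Unit]("Run `gulp build` for production assets")

gulpBuild := {
  // Run gulp from the project root and fail the task on a non-zero exit code.
  val exitCode = Process("gulp build", baseDirectory.value).!
  if (exitCode != 0) sys.error("gulp build failed")
}

// Bundle production assets before packaging a distribution
// (the Universal config comes from sbt-native-packager, which Play includes):
packageBin in Universal := (packageBin in Universal).dependsOn(gulpBuild).value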
The accepted answer explains why build.sbt cannot find the value hooks (the code needs to be in the project/ directory); however, to solve my issue of wanting to run gulp bundle during compile, I went with this solution:
import sbt._
// ...

lazy val gulp = taskKey[Unit]("Build frontend assets")

gulp := {
  println("gulp bundle".!!)
}

compile in Compile <<= (compile in Compile).dependsOn(gulp)
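Note that <<= was removed in sbt 1.x; on newer sbt versions the last line would instead use the same dependsOn pattern shown earlier in this document:
Compile / compile := (Compile / compile).dependsOn(gulp).value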
I want to create an sbt task which creates a database schema with Slick. For that, I have a task object like the following in my project:
object CreateSchema {
  val instance = Database.forConfig("localDb")

  def main(args: Array[String]): Unit = {
    val createFuture = instance.run(createActions)
    ...
    Await.ready(createFuture, Duration.Inf)
  }
}
and in my build.sbt I define a task:
lazy val createSchema = taskKey[Unit]("CREATE database schema")
fullRunTask(createSchema, Runtime, "sbt.CreateSchema")
which gets executed as expected when I run sbt createSchema from the command line.
However, the problem is that application.conf doesn't seem to get taken into account (I've also tried different scopes like Compile or Test). As a result, the task fails due to com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'localDb'.
How can I fix this so the configuration is available?
I found a lot of questions here that deal with using the application.conf inside the build.sbt itself, but that is not what I need.
I have set up a little demo using SBT 0.13.8 and Slick 3.0.0, which is working as expected (and even without modifying "-Dconfig.resource").
Files
./build.sbt
name := "SO_20150915"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "com.typesafe" % "config" % "1.3.0" withSources() withJavadoc(),
  "com.typesafe.slick" %% "slick" % "3.0.0",
  "org.slf4j" % "slf4j-nop" % "1.6.4",
  "com.h2database" % "h2" % "1.3.175"
)
lazy val createSchema = taskKey[Unit]("CREATE database schema")
fullRunTask(createSchema, Runtime, "somefun.CallMe")
./project/build.properties
sbt.version = 0.13.8
./src/main/resources/reference.conf
hello {
  world = "buuh."
}

h2mem1 = {
  url = "jdbc:h2:mem:test1"
  driver = org.h2.Driver
  connectionPool = disabled
  keepAliveConnection = true
}
./src/main/scala/somefun/CallMe.scala
package somefun

import com.typesafe.config.Config
import com.typesafe.config.ConfigFactory
import slick.driver.H2Driver.api._

/**
 * SO_20150915
 * Created by martin on 15.09.15.
 */
object CallMe {
  def main(args: Array[String]): Unit = {
    println("Hello")
    val settings = new Settings()
    println(s"Settings read from hello.world: ${settings.hw}")

    val db = Database.forConfig("h2mem1")
    try {
      // ...
      println("Do something with your database.")
    } finally db.close
  }
}

class Settings(val config: Config) {
  // This verifies that the Config is sane and has our
  // reference config. Importantly, we specify the "hello"
  // path so we only validate settings that belong to this
  // library. Otherwise, we might throw mistaken errors about
  // settings we know nothing about.
  config.checkValid(ConfigFactory.defaultReference(), "hello")

  // This uses the standard default Config, if none is provided,
  // which simplifies apps willing to use the defaults
  def this() {
    this(ConfigFactory.load())
  }

  val hw = config.getString("hello.world")
}
Result
Running sbt createSchema from the console, I obtain the output:
[info] Loading project definition from /home/.../SO_20150915/project
[info] Set current project to SO_20150915 (in build file:/home/.../SO_20150915/)
[info] Running somefun.CallMe
Hello
Settings read from hello.world: buuh.
Do something with your database.
[success] Total time: 1 s, completed 15.09.2015 10:42:20
Ideas
Please verify that this unmodified demo project also works for you.
Then try changing the SBT version in the demo project and see if that changes anything.
Alternatively, recheck your project setup and try to use a higher version of SBT.
Answer
So, even though your code resides in your src folder, it is called from within SBT. That means you are trying to load your application.conf from within the classpath context of SBT.
Slick uses Typesafe Config internally, so the approach described below under background information is not applicable: you cannot modify the Config loading mechanism itself.
Instead, try to set the path to your application.conf explicitly via config.resource; see the Typesafe Config documentation (search for config.resource).
Option 1
Either set config.resource (via -Dconfig.resource=...) before starting sbt
Option 2
Or from within build.sbt as Scala code
sys.props("config.resource") = "./src/main/resources/application.conf"
Option 3
Or create a Task in SBT via
lazy val configPath = TaskKey[Unit]("configPath", "Set path for application.conf")
and add
configPath := {
  sys.props("config.resource") = "./src/main/resources/application.conf"
}
to your sequence of settings.
Please let me know if that worked.
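A further variant, not from the options above: fork the JVM that runs the task and hand the setting to it via javaOptions, so the property reaches your program rather than the sbt process itself (a sketch; note that config.file takes a filesystem path, whereas config.resource expects a classpath resource name):
fork in run := true

// Pass the config location to the forked JVM.
javaOptions in run += "-Dconfig.file=./src/main/resources/application.conf"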
Background information
Recently, I was writing a custom plugin for SBT, where I also tried to access a reference.conf. Unfortunately, I was not able to access any .conf placed within the project subfolder using the default ClassLoader.
In the end I created a testenvironment.conf in the project folder and used the following code to load the (Typesafe) config:
def getConfig: Config = {
  val classLoader = new java.net.URLClassLoader(Array(new File("./project/").toURI.toURL))
  ConfigFactory.load(classLoader, "testenvironment")
}
or for loading a general application.conf from ./src/main/resources:
def getConfig: Config = {
  val classLoader = new java.net.URLClassLoader(Array(new File("./src/main/resources/").toURI.toURL))

  // No .conf basename given, so look for reference.conf and application.conf,
  // using the specific classLoader.
  ConfigFactory.load(classLoader)
}