Using jooq-sbt-plugin with ojdbc - scala

I'm using jOOQ. Solving an issue with the jooq-sbt-plugin config (here) resulted in a classpath issue which I think is separate from the original one. I managed to get the configuration to work, but getting it to play with Oracle drivers seems impossible.
The issue is that the plugin seems to run its own Java process, so the required classpath (with ojdbc14.jar on it) is never passed on. Is there any way to get the plugin to work? I couldn't figure out how to inject anything into the plugin's classpath.
The only workaround I can come up with is defining a task instead (described here: https://gist.github.com/chris-martin/5140754); a rough sketch of that approach follows.
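A rough sketch of that task-based approach under sbt 0.13 (hedged: the task key name, config-file path, and wiring below are illustrative, not the gist's exact code):
lazy val jooqCodegen = taskKey[Unit]("Runs jOOQ codegen with the project classpath")

jooqCodegen := {
  // Unlike the plugin's forked process, dependencyClasspath includes ojdbc14.jar
  val cp = Attributed.data((dependencyClasspath in Compile).value)
  val args = Seq(
    "-classpath", cp.mkString(java.io.File.pathSeparator),
    "org.jooq.util.GenerationTool", "src/main/resources/jooq-config.xml" // illustrative path
  )
  val exit = Fork.java(ForkOptions(), args)
  if (exit != 0) sys.error("jOOQ codegen failed with exit code " + exit)
}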
Any help is greatly appreciated. Thanks.
2013-09-10 Update
Here's the config:
import sbt._
import Keys._
import JOOQPlugin._

object SampleBuild extends Build {

  lazy val all = Project(id = "all", base = file("."), settings = defaultSettings) aggregate(
    one, two
  )

  lazy val one = Project(
    id = "one",
    base = file("one"),
    settings = defaultSettings ++ Seq(
      libraryDependencies ++= Dependencies.one
    )
  )

  lazy val two = Project(
    id = "two",
    base = file("two"),
    settings = defaultSettings ++ jooqSettings ++ customJooqSettings ++ Seq(
      libraryDependencies ++= Dependencies.two
    )
  ) dependsOn (one)

  override lazy val settings = super.settings ++ buildSettings

  lazy val buildSettings = Seq(
    organization := "org.sample",
    version := "0.1-SNAPSHOT",
    scalaVersion := "2.10.2"
  )

  lazy val defaultSettings = Defaults.defaultSettings ++ Seq(
    scalacOptions in Compile ++= scalacParams,
    externalResolvers in Compile := Resolvers.commonResolvers,
    shellPrompt := ShellPrompt.buildShellPrompt,
    resolvers ++= Resolvers.commonResolvers
  )

  lazy val customJooqSettings = Seq(
    jooqOptions := jooqBvpOptions,
    jooqOutputDirectory := new java.io.File("../src/appdb/src/main/java")
  )

  lazy val jooqBvpOptions = Seq(
    "jdbc.driver" -> "oracle.jdbc.OracleDriver",
    "jdbc.url" -> "jdbc:oracle:thin:@//<some server>",
    "jdbc.user" -> "<some user>",
    "jdbc.password" -> "<some pwd>",
    "generator.database.name" -> "org.jooq.util.oracle.OracleDatabase",
    "generator.database.inputSchema" -> "<some schema>",
    "generator.database.includes" -> "table1|table2|table3",
    "generator.target.packageName" -> "org.example.generated"
  )
}
object Resolvers { /* ... */ }
object Dependencies { /* ... */ }
object ShellPrompt { /* ... */ }
And here's the error:
[info] Initialising properties : /jooq-config2705409947508036761.xml
[error] Cannot read /jooq-config2705409947508036761.xml. Error : oracle.jdbc.OracleDriver
[error] java.lang.ClassNotFoundException: oracle.jdbc.OracleDriver
[error] at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
[error] at java.security.AccessController.doPrivileged(Native Method)
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
[error] at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
[error] at java.lang.Class.forName0(Native Method)
[error] at java.lang.Class.forName(Class.java:171)
[error] at org.jooq.util.GenerationTool.main(GenerationTool.java:269)
[error] at org.jooq.util.GenerationTool.main(GenerationTool.java:123)
[error] Usage : GenerationTool <configuration-file>
[trace] Stack trace suppressed: run last appdb-tool/jooq:codegen for the full output.
[error] (appdb-tool/jooq:codegen) Failed with return code: 255
[error] Total time: 1 s, completed Sep 10, 2013 1:45:24 PM

jooq-sbt-plugin's readme says:
Add your database driver to your list of libraryDependencies with "jooq" scope:
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.22" % "jooq"
You left out Dependencies.two from the excerpt above, so maybe that's what's missing.
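For Oracle, that would mean something like the following in Dependencies.two (hedged: the coordinates are illustrative, since Oracle's driver is not on Maven Central and usually has to be installed into a local or corporate repository first):
object Dependencies {
  val two = Seq(
    // ...the existing dependencies...
    "com.oracle" % "ojdbc14" % "10.2.0.4.0" % "jooq" // hypothetical coordinates
  )
}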

Related

sbt.librarymanagement.ResolveException: Error downloading in sbt project with sbt plugin module

I'm having the following problem: I have a multi-module project, described in the following way.
import Build._
import sbt.Keys.scalacOptions

lazy val moduleA =
  module(id = "module-a", "module-a")

lazy val moduleB =
  module(id = "module-b", "module-b")

lazy val root =
  module(id = "sample-project", directory = ".")
    .aggregate(moduleA, moduleB)

lazy val plugin = Project(id = "plugin", base = file("plugin"))
  .settings(
    sbtPlugin := true,
    name := "MyPlugin"
  )
  .dependsOn(moduleA)
where Build declares the following helpers:
object Build {
  val scala212 = "2.12.11"
  val scala213 = "2.13.3"

  val projectScalaVersion = scala213
  val supportedScalaVersions = List(scala213, scala212)

  val projectVersion = "0.3.5-SNAPSHOT"
  val projectOrganization = "com.example"

  val commonSettings = Seq(
    version := projectVersion,
    crossScalaVersions := supportedScalaVersions,
    organization := projectOrganization,
    scalaVersion := projectScalaVersion,
    scalacOptions += "-deprecation",
    scalafmtOnCompile := true
  )

  def module(id: String, directory: String): Project = {
    Project(id = id, base = file(directory))
      .settings(commonSettings: _*)
  }

  implicit class ProjectOps(project: Project) {
    def libraries(modules: ModuleID*): Project = {
      project.settings(libraryDependencies ++= modules)
    }

    def disablePublish: Project = {
      project.settings(publishLocal := {}, publishM2 := {}, publish := {})
    }
  }
}
I'm trying to add an sbt plugin that exposes some parts of the project via tasks. Unfortunately, when compiling the plugin, I get the following error:
sbt:sample-project> plugin/compile
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last plugin / update for the full output
[error] (plugin / update) sbt.librarymanagement.ResolveException: Error downloading com.example:module-a_2.12:0.3.5-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: https://repo1.maven.org/maven2/com/example/module-a_2.12/0.3.5-SNAPSHOT/module-a_2.12-0.3.5-SNAPSHOT.pom
[error] not found: /Users/ltrojanowski/.ivy2/local/com.example/module-a_2.12/0.3.5-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/com.example/module-a_2.12/0.3.5-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo.typesafe.com/typesafe/ivy-releases/com.example/module-a_2.12/0.3.5-SNAPSHOT/ivys/ivy.xml
[error] Total time: 3 s, completed Aug 26, 2020 1:35:20 PM
I don't know why this is happening. Any help or tips on how to fix this would be much appreciated.

sbt - object apache is not a member of package org

I want to deploy and submit a Spark program using sbt, but it's throwing an error.
Code:
package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}
build.sbt
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += Resolver.mavenLocal
Under the first/project directory, build.properties contains:
sbt.version=0.13.9
When I try to run sbt package, it throws the error given below.
[root@hadoop first]# sbt package
[info] Loading project definition from /home/training/workspace_spark/first/project
[info] Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
[info] Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("First Spark")
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error] val sc = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM
I have tried extending App too, but no change.
Please remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven Central, there is no need for the local resolver.
After that, you can try sbt clean package.
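With the resolver removed, the build.sbt is simply:
name := "First Spark"

version := "1.0"

organization := "in.goai"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"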

Adding SBT as a dependency in SBT file

I am writing a few sbt tasks in a Scala file. These sbt tasks will be imported into many other projects.
lazy val root = (project in file("."))
  .settings(
    inThisBuild(List(
      organization := "com.example",
      scalaVersion := "2.11.8",
      version := "1.0.0"
    )),
    name := "sbttasks",
    libraryDependencies ++= Seq(
      "org.scala-sbt" % "sbt" % "1.0.0" % "provided"
    )
  )
I get a compilation error
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] at scala.sys.package$.error(package.scala:27)
[error] at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
[error] at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
[error] at sbt.Classpaths$.$anonfun$ivyBaseSettings$64(Defaults.scala:1995)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:271)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] at sbt.Completion
I don't want to write the custom tasks in build.sbt itself (as the SBT documentation shows) because then I won't be able to import my custom tasks into other projects.
To write reusable tasks that you can "import" in different projects, you need to make an sbt plugin.
If you have a multi-project build and want to reuse your tasks in the subprojects, you can create a file project/MyPlugin.scala with
import sbt._
import sbt.Keys._

object MyPlugin extends AutoPlugin {
  override def trigger = noTrigger

  object autoImport {
    val fooTask = taskKey[Foo]("Foo description")
    val barTask = taskKey[Bar]("Bar description")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    fooTask := { ??? },
    barTask := { ??? }
  )
}
Then to enable this plugin (i.e. make those tasks available) in a subproject, you can write this in your build.sbt:
lazy val subproject = (project in file("subproject"))
  .enablePlugins(MyPlugin)
In contrast, if you want to reuse these tasks in other, unrelated projects, you need to make this plugin a separate project and publish it. It's a normal sbt project, but instead of an explicit sbt dependency, you write in its build.sbt:
sbtPlugin := true
And the code defining tasks goes to src/main/scala/ (like in a normal project).
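A minimal sketch of such a standalone plugin's build.sbt (the name, organization, and version are illustrative):
sbtPlugin := true

name := "my-sbt-tasks"

organization := "com.example"

version := "0.1.0"
Other builds can then pull it in via their project/plugins.sbt with addSbtPlugin("com.example" % "my-sbt-tasks" % "0.1.0").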
You can read in detail about writing plugins in the sbt documentation.
Change the version of "org.scala-sbt" to "1.0.0-M4":
lazy val root = (project in file("."))
  .settings(
    inThisBuild(List(
      organization := "com.example",
      scalaVersion := "2.11.8",
      version := "1.0.0",
      name := "sbttasks"
    )),
    libraryDependencies ++= Seq(
      "org.scala-sbt" % "sbt" % "1.0.0-M4" % "provided"
    )
  )
For the full compatibility matrix, see https://mvnrepository.com/artifact/org.scala-sbt/main

sbt cross project, shared dependencies for test example

I have a small project where I have the following problem: scalaTest needs to be added to all three dependent projects (client, server, shared); otherwise the scalatest library is not accessible from all of them.
In other words, if I write
val jvmDependencies = Def.setting(Seq(
  "org.scalaz" %% "scalaz-core" % "7.2.8"
) ++ scalaTest)
then things work fine.
But if I don't add ++ scalaTest to each of the three dependency sequences, then it fails like this:
> test
[info] Compiling 1 Scala source to /Users/joco/tmp3/server/target/scala-2.11/test-classes...
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:1: object specs2 is not a member of package org
[error] import org.specs2.mutable.Specification
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:3: not found: type Specification
[error] class Test extends Specification {
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:5: value should is not a member of String
[error] "Test" should {
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:6: value in is not a member of String
[error] "one is one" in {
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:8: value === is not a member of Int
[error] 1===one
[error] ^
[error] 5 errors found
[error] (server/test:compileIncremental) Compilation failed
[error] Total time: 4 s, completed Mar 18, 2017 1:56:54 PM
However, for production (not test) code everything works just fine: if I want to use a library in all three projects (autowire in this example), I don't have to add the same dependency three times. It is enough to add it to the shared project alone, and then I can use that library from all three projects.
For test code, however, as mentioned above, I currently have to add the same library dependency (scalaTest, below) to all three projects.
Question: Is there a way to avoid this?
Settings.scala:
import org.scalajs.sbtplugin.ScalaJSPlugin.autoImport._
import sbt.Keys._
import sbt._

object Settings {
  val scalacOptions = Seq(
    "-Xlint",
    "-unchecked",
    "-deprecation",
    "-feature",
    "-Yrangepos"
  )

  object versions {
    val scala = "2.11.8"
  }

  val scalaTest = Seq(
    "org.scalatest" %% "scalatest" % "3.0.1" % "test",
    "org.specs2" %% "specs2" % "3.7" % "test"
  )

  val sharedDependencies = Def.setting(Seq(
    "com.lihaoyi" %%% "autowire" % "0.2.6"
  ) ++ scalaTest)

  val jvmDependencies = Def.setting(Seq(
    "org.scalaz" %% "scalaz-core" % "7.2.8"
  ))

  /** Dependencies only used by the JS project (note the use of %%% instead of %%) */
  val scalajsDependencies = Def.setting(Seq(
    "org.scala-js" %%% "scalajs-dom" % "0.9.1"
  ) ++ scalaTest)
}
build.sbt:
import sbt.Keys._
import sbt.Project.projectToRef
import webscalajs.SourceMappings

lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .settings(
    scalaVersion := Settings.versions.scala,
    libraryDependencies ++= Settings.sharedDependencies.value,
    addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
  )
  .jsConfigure(_ enablePlugins ScalaJSWeb)

lazy val sharedJVM = shared.jvm.settings(name := "sharedJVM")
lazy val sharedJS = shared.js.settings(name := "sharedJS")

lazy val elideOptions = settingKey[Seq[String]]("Set limit for elidable functions")

lazy val client: Project = (project in file("client"))
  .settings(
    scalaVersion := Settings.versions.scala,
    scalacOptions ++= Settings.scalacOptions,
    libraryDependencies ++= Settings.scalajsDependencies.value,
    testFrameworks += new TestFramework("utest.runner.Framework")
  )
  .enablePlugins(ScalaJSPlugin)
  .disablePlugins(RevolverPlugin)
  .dependsOn(sharedJS)

lazy val clients = Seq(client)

lazy val server = (project in file("server"))
  .settings(
    scalaVersion := Settings.versions.scala,
    scalacOptions ++= Settings.scalacOptions,
    libraryDependencies ++= Settings.jvmDependencies.value
  )
  .enablePlugins(SbtLess, SbtWeb)
  .aggregate(clients.map(projectToRef): _*)
  .dependsOn(sharedJVM)

onLoad in Global := (Command.process("project server", _: State)) compose (onLoad in Global).value

fork in run := true

cancelable in Global := true
For test code, however, as I mentioned above, currently I have to add the same library dependency (scalaTest - below) to all three projects.
That is expected: test dependencies are not inherited along dependency chains. That makes sense, because you don't want to depend on JUnit just because you depend on a library that happens to be tested using JUnit.
Although yes, that calls for a bit of duplication when you have several projects in the same build, all using the same testing framework. This is why we often find some commonSettings that are added to all projects of an sbt build. This is also where we typically put things like organization, scalaVersion, and many other settings that usually apply to all projects inside one build.
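For example, with the Settings.scala from the question, a commonSettings along these lines (a sketch, not the only way to slice it) removes the duplication:
lazy val commonSettings = Seq(
  scalaVersion := Settings.versions.scala,
  scalacOptions ++= Settings.scalacOptions,
  // the shared test frameworks, declared once:
  libraryDependencies ++= Settings.scalaTest
)

lazy val server = (project in file("server"))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= Settings.jvmDependencies.value)
  .dependsOn(sharedJVM)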

Is it possible to reject publish if SNAPSHOT dependencies are used in SBT?

I keep accidentally publishing my internal project while it still references internal SNAPSHOTs. It would be very helpful if there were an sbt plugin that refused to publish when you rely on any SNAPSHOT dependencies. Is anyone aware of such a plugin or feature in sbt?
Here's how you can write such a plugin.
output
> publish
[info] :: delivering :: com.example#b_2.10;0.1.0 :: 0.1.0 :: release :: Fri Jan 13 15:50:53 EST 2017
[info] delivering ivy file to /xxx/b/target/scala-2.10/ivy-0.1.0.xml
[info] Wrote /xxx/b/target/scala-2.10/b_2.10-0.1.0.pom
[info] Wrote /xxx/a/target/scala-2.10/a_2.10-0.1.0.pom
[info] :: delivering :: com.example#a_2.10;0.1.0 :: 0.1.0 :: release :: Fri Jan 13 15:50:53 EST 2017
[info] delivering ivy file to /xxx/a/target/scala-2.10/ivy-0.1.0.xml
[trace] Stack trace suppressed: run last b/*:publishConfiguration for the full output.
[trace] Stack trace suppressed: run last a/*:publishConfiguration for the full output.
[error] (b/*:publishConfiguration) SNAPSHOT found in classpath:
[error] com.eed3si9n:treehugger_2.10:0.2.4-SNAPSHOT:compile->default;compile->compile;compile->runtime;compile->default(compile);compile->master
[error] (a/*:publishConfiguration) SNAPSHOT found in classpath:
[error] com.eed3si9n:treehugger_2.10:0.2.4-SNAPSHOT:compile->default;compile->compile;compile->runtime;compile->default(compile);compile->master
[error] com.example:c_2.10:0.1.0-SNAPSHOT:compile->compile;compile->default(compile)
[error] io.netty:netty-all:4.1.8.Final-SNAPSHOT:compile->default;compile->compile;compile->runtime;compile->default(compile);compile->master
[error] Total time: 0 s, completed Jan 13, 2017 3:50:53 PM
project/build.properties
sbt.version = 0.13.13
project/DepsVerifyPlugin.scala
import sbt._
import Keys._

object DepsVerifyPlugin extends sbt.AutoPlugin {
  override def requires = plugins.JvmPlugin
  override def trigger = allRequirements

  override def projectSettings = Seq(
    publishConfiguration := {
      val old = publishConfiguration.value
      val ur = update.value
      ur.configuration("compile") foreach { compileReport =>
        val allModules = compileReport.allModules
        val snapshotDeps = allModules filter { _.revision contains "SNAPSHOT" }
        if (snapshotDeps.nonEmpty) {
          sys.error(
            "SNAPSHOT found in classpath:\n" +
              snapshotDeps.mkString("\n")
          )
        }
      }
      old
    }
  )
}
build.sbt
val commonSettings: Seq[Setting[_]] = Seq(
  organization in ThisBuild := "com.example",
  scalaVersion in ThisBuild := "2.10.6",
  version in ThisBuild := "0.1.0",
  resolvers += Resolver.sonatypeRepo("public"),
  publishTo := Some(Resolver.file("file", new File(Path.userHome.absolutePath + "/test-repo")))
)

val netty = "io.netty" % "netty-all" % "4.1.8.Final-SNAPSHOT"
val treehugger = "com.eed3si9n" %% "treehugger" % "0.2.4-SNAPSHOT"

lazy val root = (project in file("."))
  .aggregate(a, b, c)
  .settings(
    commonSettings,
    name := "Hello",
    publish := ()
  )

lazy val a = (project in file("a"))
  .dependsOn(b, c)
  .settings(
    commonSettings,
    libraryDependencies += netty
  )

lazy val b = (project in file("b"))
  .settings(
    commonSettings,
    libraryDependencies += treehugger
  )

lazy val c = (project in file("c"))
  .settings(
    commonSettings,
    version := "0.1.0-SNAPSHOT",
    publish := ()
  )
You could consider adopting sbt-release.
This is a higher-level 'workflow' plugin: publish is used as one of the steps in a release (after a 'check that there are no SNAPSHOT dependencies' step).
It will not prevent you from running sbt publish, but if you make a habit of using sbt release instead of sbt publish, it accomplishes what you're looking for.
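For reference, a sketch of how sbt-release's step list is typically wired up (the step names come from the plugin's ReleaseTransformations; see its README for the full default process):
import ReleaseTransformations._

releaseProcess := Seq[ReleaseStep](
  checkSnapshotDependencies, // aborts the release if any SNAPSHOT is on the classpath
  inquireVersions,
  runClean,
  runTest,
  setReleaseVersion,
  publishArtifacts,
  setNextVersion
)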