I am using the Play Java Starter Example 2.5.x from the Play Framework page. This is my plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.0")
this is my build.sbt
name := """play-java"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava,PlayEbean)
scalaVersion := "2.11.11"
libraryDependencies += filters
libraryDependencies ++= Seq(
javaJdbc,
cache,
javaWs,
evolutions
)
In application.conf:
ebean.default = ["models.*"]
When trying to run the application I always get:
error: not found: value PlayEbean
lazy val root = (project in file(".")).enablePlugins(PlayJava, PlayEbean)
                                                               ^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
Help is very much appreciated.
After researching I solved it: the problem was that I did not put
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.0")
in the correct file; it belongs in project/plugins.sbt, not in build.sbt.
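For anyone hitting the same thing, a minimal sketch of the layout that works: the addSbtPlugin line lives in project/plugins.sbt, while build.sbt only enables the plugin.
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.0")
// build.sbt
lazy val root = (project in file(".")).enablePlugins(PlayJava, PlayEbean)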
Related
There is no caliban-federation for Scala 3 yet.
My question is: what is the correct way to use it along with Scala 3 libraries?
For now I have the following dependencies in my build.sbt:
lazy val `bookings` =
  project
    .in(file("."))
    .settings(
      scalaVersion := "3.0.1",
      name := "bookings"
    )
    .settings(commonSettings)
    .settings(dependencies)

lazy val dependencies = Seq(
  libraryDependencies ++= Seq(
    "com.github.ghostdogpr" %% "caliban-zio-http" % "1.1.0"
  ),
  libraryDependencies ++= Seq(
    org.scalatest.scalatest,
    org.scalatestplus.`scalacheck-1-15`,
  ).map(_ % Test),
  libraryDependencies +=
    ("com.github.ghostdogpr" %% "caliban-federation" % "1.1.0")
      .cross(CrossVersion.for3Use2_13)
)
But when I try to build it, it fails with:
[error] (update) Conflicting cross-version suffixes in:
dev.zio:zio-query,
org.scala-lang.modules:scala-collection-compat,
dev.zio:zio-stacktracer,
dev.zio:izumi-reflect,
com.github.ghostdogpr:caliban-macros,
dev.zio:izumi-reflect-thirdparty-boopickle-shaded,
dev.zio:zio,
com.github.ghostdogpr:caliban,
dev.zio:zio-streams
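A workaround that is sometimes suggested for this (a sketch only; I am assuming the _2.13 classes are binary-close enough to link against their _3 counterparts, which you would have to verify at runtime) is to exclude the Scala 2.13 transitive dependencies from the federation module, so each library arrives with a single cross-version suffix:
libraryDependencies +=
  ("com.github.ghostdogpr" %% "caliban-federation" % "1.1.0")
    .cross(CrossVersion.for3Use2_13)
    .excludeAll(
      // these also arrive as _3 via caliban-zio-http, so drop the _2.13 copies
      ExclusionRule("dev.zio"),
      ExclusionRule("org.scala-lang.modules"),
      ExclusionRule("com.github.ghostdogpr", "caliban_2.13"),
      ExclusionRule("com.github.ghostdogpr", "caliban-macros_2.13")
    )
Alternatively, pulling every caliban module with CrossVersion.for3Use2_13 keeps the suffixes consistent without exclusions.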
I wrote a macro that works after compilation, but the problem is that IntelliJ IDEA doesn't see my generated code, so red underlines appear. I found an explanation here that I need to write an IDEA plugin so that IDEA can recognize my generated code. The problem is that I cannot use SyntheticMembersInjector because of a missing dependency. Is it possible to write an IDEA plugin for my own Scala macros?
My plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.0.2")
addSbtPlugin("org.jetbrains" % "sbt-idea-plugin" % "3.8.4")
My build.sbt:
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport.dockerExposedPorts
import sbt.Keys.{scalacOptions, version}
lazy val coreProject = (project in file("."))
.enablePlugins(JavaAppPackaging, DockerPlugin, AkkaGrpcPlugin)
.settings(
scalaVersion := "2.12.12",
name := "CDMS",
version := "0.1",
libraryDependencies ++= BuildConfig.projectDependencies,
dockerBaseImage := "adoptopenjdk/openjdk15:alpine",
dockerExposedPorts += 9002
)
.dependsOn(validationProject)
lazy val validationProject = (project in file("validation"))
.enablePlugins(SbtPlugin)
.settings(
scalaVersion := "2.12.12",
sbtPlugin := true,
libraryDependencies ++= BuildConfig.monocleDependencies
)
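If the missing dependency is the IntelliJ Scala plugin itself (SyntheticMembersInjector is defined there, not in any Maven artifact), sbt-idea-plugin can fetch it for you. A sketch only, assuming the validation module is the one meant to become the IDEA plugin; note also that enablePlugins(SbtPlugin) declares an sbt plugin, which is a different thing:
lazy val validationProject = (project in file("validation"))
  .enablePlugins(SbtIdeaPlugin)
  .settings(
    scalaVersion := "2.12.12",
    // pulls the IntelliJ Scala plugin (home of SyntheticMembersInjector)
    // onto the compile classpath
    intellijPlugins += "org.intellij.scala".toPlugin
  )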
I'm trying to build a Docker image using sbt-native-packager with the following build.sbt (trying to publish the image to a local registry):
val sparkVersion = "2.4.5"
scalaVersion in ThisBuild := "2.12.0"
val sparkLibs = Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion
)
// JAR build settings
lazy val commonSettings = Seq(
organization := "dzlab",
version := "0.1",
scalaSource in Compile := baseDirectory.value / "src",
scalaSource in Test := baseDirectory.value / "test",
resourceDirectory in Test := baseDirectory.value / "test" / "resources",
javacOptions ++= Seq(),
scalacOptions ++= Seq(
"-deprecation",
"-feature",
"-language:implicitConversions",
"-language:postfixOps"
),
libraryDependencies ++= sparkLibs
)
// Docker Image build settings
dockerBaseImage := "gcr.io/spark-operator/spark:v" + sparkVersion
lazy val root = (project in file("."))
.enablePlugins(
DockerPlugin,
JavaAppPackaging
)
.settings(
name := "spark-k8s",
commonSettings,
dockerAliases ++= Seq(
dockerAlias.value.withRegistryHost(Some("localhost:5000"))
),
mainClass in (Compile, run) := Some("dzlab.SparkJob")
)
sbt and the packager versions:
$ cat project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
$ cat project/build.properties
sbt.version=0.13.18
When I try to run the packager:
$ sbt docker:publish
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
/Users/dzlab/Projects/spark-k8s/build.sbt:39: error: not found: value dockerAliases
dockerAliases ++= Seq(
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
It does not recognize dockerAliases, and I am not sure why, as it is part of the publishing settings.
What is the proper way to set the Docker registry?
Your sbt-native-packager version is hopelessly outdated, as is your sbt version. That SettingKey doesn't exist in that version.
Compare: sbt-native-packager 1.0 vs. sbt-native-packager 1.7.4
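A sketch of the upgrade, using versions that were current around sbt-native-packager 1.7.x (check for newer ones):
// project/build.properties
sbt.version=1.3.13
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.4")
With those in place dockerAliases resolves, and sbt docker:publish pushes to whatever registry host the alias names (localhost:5000 in your build.sbt).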
I have this build.sbt file:
import sbt.Keys.libraryDependencies
lazy val scalatestVersion = "3.0.4"
lazy val scalaMockTestSupportVersion = "3.6.0"
lazy val typeSafeConfVersion = "1.3.2"
lazy val scalaLoggingVersion = "3.7.2"
lazy val logbackClassicVersion = "1.2.3"
lazy val commonSettings = Seq(
organization := "com.stulsoft",
version := "0.0.1",
scalaVersion := "2.12.4",
scalacOptions ++= Seq(
"-feature",
"-language:implicitConversions",
"-language:postfixOps"),
libraryDependencies ++= Seq(
"com.typesafe.scala-logging" %% "scala-logging" % scalaLoggingVersion,
"ch.qos.logback" % "logback-classic" % logbackClassicVersion,
"com.typesafe" % "config" % typeSafeConfVersion,
"org.scalatest" %% "scalatest" % scalatestVersion % "test",
"org.scalamock" %% "scalamock-scalatest-support" % scalaMockTestSupportVersion % "test"
)
)
unmanagedJars in Compile += file("lib/opencv-331.jar")
lazy val pimage = project.in(file("."))
.settings(commonSettings)
.settings(
name := "pimage"
)
parallelExecution in Test := true
It works fine if I use sbt run, but I cannot run it from IntelliJ.
I receive the error:
java.lang.UnsatisfiedLinkError: no opencv_java331 in java.library.path
I can add the library manually (File -> Project Structure -> Libraries -> + the necessary dir).
My question is: is it possible to set up build.sbt so that it automatically creates the IntelliJ project with the specified library?
I would say try to drag and drop the dependency into /lib, which should be in the root directory of your project; if it's not there, create it.
Run commands:
sbt reload
sbt update
Lastly, you could try something like:
File -> Project Structure -> Modules -> mark all the modules (usually 1 to 3) and delete them (don't worry, this won't delete your files) -> hit the green plus sign and select Import Module -> select the root directory of your project, and it should then refresh it.
If none of these help, I'm out of ideas.
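One more thing worth checking, since UnsatisfiedLinkError is about java.library.path rather than the classpath: the JVM also has to find the native opencv_java331 library, not just the JAR. A sketch for build.sbt, assuming the native library sits next to the JAR in lib/:
// fork the run so the JVM option takes effect
fork in run := true
javaOptions in run += s"-Djava.library.path=${baseDirectory.value / "lib"}"
In IntelliJ you can mirror this by adding -Djava.library.path=lib to the run configuration's VM options.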
I am new to Scala. I am using sbt-assembly to create a fat JAR. My program reads input files, which I keep under the src/main/resources folder, but I am getting java.io.FileNotFoundException.
I don't know how to specify the path. I will be deploying the JAR on the server.
Here is my sbt build file:
lazy val commonSettings = Seq(
organization := "com.insnapinc",
version := "0.1.0",
scalaVersion := "2.11.4"
)
lazy val root = (project in file(".")).
settings(commonSettings: _*).
settings(
name := "memcache-client"
)
libraryDependencies ++= Seq (
"org.scalaj" %% "scalaj-http" % "1.1.4"
,"org.json4s" %% "json4s-native" % "3.2.10"
,"org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)
/* assembly plugin */
mainClass in AssemblyKeys.assembly := Some("com.insnap.memcache.MemcacheTest")
assemblySettings
test in AssemblyKeys.assembly := {}
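As for the FileNotFoundException: once the files are assembled into the fat JAR they are no longer files on disk, so opening them with new File("src/main/resources/...") cannot work on the server. Read them as classpath resources instead; a minimal sketch (input.txt is a placeholder for your actual file name):
import scala.io.Source

// A file under src/main/resources lands at the root of the classpath,
// both under `sbt run` and inside the assembled JAR.
// getResourceAsStream returns null if the resource is missing.
val stream = getClass.getResourceAsStream("/input.txt")
val lines = Source.fromInputStream(stream).getLines().toList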