ProvisionException: Unable to provision - scala

I got this error while trying to integrate PostgreSQL into my Play app:
ProvisionException: Unable to provision, see the following errors:
1) No implementation for play.api.db.slick.DatabaseConfigProvider was bound.
while locating play.api.db.slick.DatabaseConfigProvider
for parameter 0 at models.EntryRepo.<init>(EntryRepo.scala:10)
while locating models.EntryRepo
for parameter 0 at controllers.Entries.<init>(Entries.scala:17)
while locating controllers.Entries
for parameter 4 at router.Routes.<init>(Routes.scala:39)
while locating router.Routes
while locating play.api.inject.RoutesProvider
while locating play.api.routing.Router
My SBT file:
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
cache,
ws,
specs2 % Test
)
libraryDependencies ++= Seq(
"com.typesafe.play" %% "play-slick" % "1.1.1",
"com.typesafe.play" %% "play-slick-evolutions" % "1.1.1",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc4",
"com.typesafe.slick" %% "slick" % "3.1.1",
"com.typesafe.slick" %% "slick-hikaricp" % "3.1.1",
"org.slf4j" % "slf4j-nop" % "1.6.4"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
routesGenerator := InjectedRoutesGenerator
My application.conf:
slicks.dbs.default.driver="slick.driver.PostgresDriver$"
slicks.dbs.default.dataSourceClass="slick.jdbc.DatabaseUrlDataSource"
slicks.dbs.default.db.default.driver = org.postgresql.Driver
slicks.dbs.default.db.default.url = "jdbc:postgresql://localhost:5000/aleksander"
slicks.dbs.default.db.default.user = postgres
slicks.dbs.default.db.default.password = "lelelel"
The parts with the injection:
class EntryRepo @Inject()(protected val dbConfigProvider: DatabaseConfigProvider) {...}
class Entries @Inject()(entryRepo: EntryRepo) extends Controller {...}
I've been following the play-slick3 template from Activator. I tried to follow the template as closely as possible, but the error still persists.

I had a similar issue while using Play 2.5.x.
In the new application.conf, the convention has changed from flat key-value pairs to a more JSON-like nested format (the configuration syntax is called HOCON):
db {
foo
}
And you may be tricked into thinking that you should put
slicks.dbs.default.driver="slick.driver.PostgresDriver$"
slicks.dbs.default.dataSourceClass="slick.jdbc.DatabaseUrlDataSource"
slicks.dbs.default.db.default.driver = org.postgresql.Driver
slicks.dbs.default.db.default.url ="jdbc:postgresql://localhost:5000/aleksander"
slicks.dbs.default.db.default.user = postgres
slicks.dbs.default.db.default.password = "lelelel"
inside those db {} braces. However, after pulling my hair out for over six hours, I found that putting the db configuration OUTSIDE those braces solved my problem. You may have fallen into a similar trap.
Of course, you can still use the nested style, like this:
slicks.dbs.default.db { /* your slick configuration */}
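The mechanics behind the trap: HOCON prefixes every key inside a block with the block's path, so the same line means different things inside and outside db {}. A minimal sketch (resolved paths shown in the comments):

# at the top level this resolves to the path slicks.dbs.default.db.user
slicks.dbs.default.db.user = postgres

db {
  # inside the block the same line resolves to db.slicks.dbs.default.db.user,
  # a path that Play never looks up
  slicks.dbs.default.db.user = postgres
}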

Related

Spark - Error “A master URL must be set in your configuration” using IntelliJ IDEA

I get this error when trying to run a Spark Streaming application from IntelliJ IDEA.
Env:
Spark core version 2.2.0
IntelliJ IDEA 2017.3.5
Additional info: Spark is running in YARN mode.
The error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" java.lang.ExceptionInInitializerError
at kafka_stream.kafka_stream.main(kafka_stream.scala)
Caused by: org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
at kafka_stream.InitSpark$class.$init$(InitSpark.scala:15)
at kafka_stream.kafka_stream$.<init>(kafka_stream.scala:6)
at kafka_stream.kafka_stream$.<clinit>(kafka_stream.scala)
... 1 more
Process finished with exit code 1
I tried this:
val spark: SparkSession = SparkSession.builder()
.appName("SparkStructStream")
.master("spark://127.0.0.1:7077")
//.master("local[*]")
.getOrCreate()
I still get the same master URL error.
Content of my build.sbt file:
name := "KafkaSpark"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.2.0",
"org.apache.spark" % "spark-sql_2.11" % "2.2.0",
"org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
"org.apache.spark" % "spark-streaming-kafka_2.11" % "1.6.3"
)
// https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.11
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-streams
libraryDependencies += "org.apache.kafka" % "kafka-streams" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/connect-api
libraryDependencies += "org.apache.kafka" % "connect-api" % "0.11.0.0"
libraryDependencies += "com.databricks" %% "spark-avro" % "4.0.0"
resolvers += Resolver.mavenLocal
resolvers += "central maven" at "https://repo1.maven.org/maven2/"
Any help would be much appreciated.
It looks like the master parameter is not being passed somehow, e.g. Spark is initialized somewhere earlier in your code. Nevertheless, you can try the VM option -Dspark.master=local[*], which passes the parameter to every place where it is not already defined, so it should solve your problem. In IntelliJ it is in the run configuration list -> Edit Configurations... -> VM options.
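Alternatively, the same idea can live in code. A minimal sketch (the object name is hypothetical; only spark-sql is assumed on the classpath) that reads the spark.master system property and falls back to local[*], so the same build runs both inside IntelliJ and under spark-submit:

import org.apache.spark.sql.SparkSession

object KafkaStreamApp {
  def main(args: Array[String]): Unit = {
    // Honour -Dspark.master=... when supplied; otherwise run locally on all cores.
    val master = sys.props.getOrElse("spark.master", "local[*]")

    val spark = SparkSession.builder()
      .appName("SparkStructStream")
      .master(master)
      .getOrCreate()

    // ... streaming logic ...

    spark.stop()
  }
}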
Download winutils.exe and place it at C:\hadoop\bin\winutils.exe. Then include the line below at the start of your main method:
System.setProperty("hadoop.home.dir", "C:\\hadoop")
With that in place, it works well.
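Order matters here: the property must be set before the first Spark or Hadoop class is touched. A short sketch of the intended placement (same hypothetical main as above):

def main(args: Array[String]): Unit = {
  // must run before anything initializes Spark/Hadoop
  System.setProperty("hadoop.home.dir", "C:\\hadoop")
  val spark = SparkSession.builder()
    .appName("SparkStructStream")
    .master("local[*]")
    .getOrCreate()
  // ...
}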

Build sbt for spark with janusgraph and gremlin scala

I was trying to set up an IntelliJ build for Spark with JanusGraph using Gremlin-Scala, but I am running into errors.
My build.sbt file is:
version := "1.0"
scalaVersion := "2.11.11"
libraryDependencies += "com.michaelpollmeier" % "gremlin-scala" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-mllib
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.2.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.1"
// https://mvnrepository.com/artifact/org.janusgraph/janusgraph-core
libraryDependencies += "org.janusgraph" % "janusgraph-core" % "0.2.0"
libraryDependencies ++= Seq(
"ch.qos.logback" % "logback-classic" % "1.2.3" % Test,
"org.scalatest" %% "scalatest" % "3.0.3" % Test
)
resolvers ++= Seq(
Resolver.mavenLocal,
"Sonatype OSS" at "https://oss.sonatype.org/content/repositories/public"
)
But I am getting errors when I try to compile code that uses the Gremlin-Scala libraries or io.Source. Can someone share their build file or tell me what I should modify to fix it?
Thanks in advance.
So, I was trying to compile this code:
import gremlin.scala._
import org.apache.commons.configuration.BaseConfiguration
import org.janusgraph.core.JanusGraphFactory
class Test1() {
val conf = new BaseConfiguration()
conf.setProperty("storage.backend", "inmemory")
val gr = JanusGraphFactory.open(conf)
val graph = gr.asScala()
graph.close
}
object Test{
def main(args: Array[String]) {
val t = new Test1()
println("in Main")
}
}
The errors I get are:
Error:(1, 8) not found: object gremlin
import gremlin.scala._
Error:(10, 18) value asScala is not a member of org.janusgraph.core.JanusGraph
val graph = gr.asScala()
If you go to the Gremlin-Scala GitHub page, you'll see that the current version is "3.3.1.1" and that:
Typically you just need to add a dependency on "com.michaelpollmeier" %% "gremlin-scala" % "SOME_VERSION" and one for the graph db of your choice to your build.sbt (this readme assumes tinkergraph). The latest version is displayed at the top of this readme in the maven badge.
It is no surprise that the API has changed between major versions of the library. If I change your first dependency like this:
//libraryDependencies += "com.michaelpollmeier" % "gremlin-scala" % "2.3.0" //old!
libraryDependencies += "com.michaelpollmeier" %% "gremlin-scala" % "3.3.1.1"
then your example code compiles for me.
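For reference, a minimal sketch of the relevant dependency lines after that change (versions taken from the discussion above; that janusgraph-core 0.2.0 and gremlin-scala 3.3.1.1 agree on the underlying TinkerPop release is an assumption worth verifying):

libraryDependencies ++= Seq(
  "com.michaelpollmeier" %% "gremlin-scala" % "3.3.1.1",
  "org.janusgraph" % "janusgraph-core" % "0.2.0"
)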

Play2.5 scala slick configuration for mariadb

I'm trying to set up a Play 2.5 app with Slick, but I'm getting an error on app startup. Any clue how to fix it?
I created the project from the Play Scala Seed Activator template and am trying to add Slick to it, so far unsuccessfully.
The error:
! #72ef93323 - Internal server error, for (GET) [/] ->
play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[AbstractMethodError: play.core.server.netty.NettyModelConversion$$anon$1.copy$default$11()Lscala/Option;]]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:269)
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:195)
at play.core.server.Server$class.logExceptionAndGetResult$1(Server.scala:46)
at play.core.server.Server$class.getHandlerFor(Server.scala:66)
at play.core.server.NettyServer.getHandlerFor(NettyServer.scala:46)
at play.core.server.netty.PlayRequestHandler.handle(PlayRequestHandler.scala:81)
at play.core.server.netty.PlayRequestHandler.channelRead(PlayRequestHandler.scala:162)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
at com.typesafe.netty.http.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:131)
Caused by: java.lang.AbstractMethodError: play.core.server.netty.NettyModelConversion$$anon$1.copy$default$11()Lscala/Option;
at play.core.routing.HandlerInvokerFactory$.taggedRequest(HandlerInvoker.scala:86)
at play.core.routing.TaggingInvoker$$anon$2.tagRequest(HandlerInvoker.scala:45)
at play.api.http.DefaultHttpRequestHandler$$anonfun$4.apply(HttpRequestHandler.scala:119)
at play.api.http.DefaultHttpRequestHandler$$anonfun$4.apply(HttpRequestHandler.scala:118)
at scala.Option.map(Option.scala:146)
at play.api.http.DefaultHttpRequestHandler.handlerForRequest(HttpRequestHandler.scala:118)
at play.core.server.Server$class.getHandlerFor(Server.scala:56)
at play.core.server.NettyServer.getHandlerFor(NettyServer.scala:46)
at play.core.server.netty.PlayRequestHandler.handle(PlayRequestHandler.scala:81)
at play.core.server.netty.PlayRequestHandler.channelRead(PlayRequestHandler.scala:162)
The build.sbt:
name := "play-scala"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"org.scalatestplus.play" %% "scalatestplus-play" % "1.5.0-RC1" % Test,
"com.typesafe.play" %% "play-slick" % "2.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0",
"org.mariadb.jdbc" % "mariadb-java-client" % "1.5.5",
cache,
ws
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
The application.conf:
slick.dbs.default.driver="slick.driver.MySQLDriver$"
slick.dbs.default.db.driver="org.mariadb.jdbc.Driver"
slick.dbs.default.db.url="jdbc:mariadb://localhost:3306/test"
slick.dbs.default.db.user=******
slick.dbs.default.db.password="*******"

Play Framework 2.4 WithApplication call not found in play.api.test

I'm trying to do a simple test of a route in Play Framework 2.4, following the guide here: https://www.playframework.com/documentation/2.4.x/ScalaFunctionalTestingWithSpecs2 (testing the router).
Here is the code:
package routesAndController
import org.specs2.mutable._
import org.specs2.runner._
import org.junit.runner._
import play.api.test._
import play.api.test.Helpers._
/**
* Created by root on 3/11/16.
*/
@RunWith(classOf[JUnitRunner])
class AnalysisEntryPointTest extends Specification {
"the AnalysisEntryPoint" should {
"where the route must be /DomoticRoomServer/Analysis with 200" in new WithApplication {
val result = route(FakeRequest(GET, "/domoticRoom/analysis")).get
status(result) must equalTo(OK)
contentType(result) must beSome.which(_ == "text/html")
}
}
}
All pretty straightforward. The problem is that the class WithApplication is not found in the play.api.test package, only in play.test.
Using the play.test one instead, specs2 gives me this error:
[error] /home/benkio/projects/DomoticRoom/Server/test/routesAndController/AnalysisEntryPointTest.scala:19: could not find implicit value for evidence parameter of type org.specs2.execute.AsResult[play.test.WithApplication{val result: scala.concurrent.Future[play.api.mvc.Result]}]
[error] "where the route must be /DomoticRoomServer/Analysis with 200" in new WithApplication() {
[error] ^
[error] one error found
[error] (test:compileIncremental) Compilation failed
Any suggestions?
Here is the build.sbt:
import play.routes.compiler.InjectedRoutesGenerator
import play.sbt.PlayScala
name := """Domotic Room Server"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
resolvers ++= Seq(
"scalaz-bintray" at "http://dl.bintray.com/scalaz/releases",
"Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"Millhouse Bintray" at "http://dl.bintray.com/themillhousegroup/maven"
)
libraryDependencies ++= Seq(
"com.typesafe.play" %% "play-cache" % "2.4.6",
"org.specs2" %% "specs2-core" % "3.6" % "test",
"org.specs2" %% "specs2-junit" % "3.6" % "test",
"org.specs2" %% "specs2-scalacheck" % "3.6" % "test",
"org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
"com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26"
)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
scalacOptions in Test ++= Seq("-Yrangepos")
fork in run := true
And here is my project/plugins.sbt:
// The Typesafe repository
resolvers += "Typesafe repository" at "https://repo.typesafe.com/typesafe/releases/"
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.6")
Play has a shortcut to declare the test dependencies, including its own test packages. The correct way to add specs2 and the Play test classes is:
libraryDependencies ++= Seq(
"com.typesafe.play" %% "play-cache" % "2.4.6",
"org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
"com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26",
specs2 % Test
)
This is documented here. There is also a shortcut to use cache, as also documented here. So your dependencies should be declared like this:
libraryDependencies ++= Seq(
"org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
"com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26",
cache,
specs2 % Test
)
The advantage here is that you don't need to track which dependency versions are compatible with Play. Also, you don't need to repeat the Play version all over your dependencies, just in the project/plugins.sbt file.
Of course, you can still override and add any other dependencies as you like. You were adding scalacheck, for instance:
libraryDependencies ++= Seq(
"org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
"com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26",
cache,
specs2 % Test,
"org.specs2" %% "specs2-scalacheck" % "3.6" % Test
)
Edit after discussion:
Welcome to dependency hell. It looks like play2-reactivemongo and play2-reactivemongo-mocks pull in a very old specs2 dependency. You can see this by using the sbt-dependency-graph plugin and running sbt dependencyTree. Here is the relevant section of the output:
[info] +-com.themillhousegroup:play2-reactivemongo-mocks_2.11:0.11.9_0.4.27 [S]
[info] | +-org.reactivemongo:play2-reactivemongo_2.11:0.11.10 [S]
[info] | +-org.specs2:specs2_2.11:2.3.13 [S]
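If you don't have that plugin yet, it is a one-liner in project/plugins.sbt; a sketch, with the version an assumption to check against the plugin's README:

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")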
You can also see this by looking at the code of play2-reactivemongo-mocks, play2-reactivemongo, and Play Framework 2.4.6. These are incompatible versions of specs2, and sbt is not able to evict the old ones because the projects all pull in different specs2 artifacts (note how Play adds specific specs2 modules, in contrast with play2-reactivemongo-mocks).
In other words, it looks like the test support offered by play2-reactivemongo-mocks is not compatible with the test support offered by Play. You can open an issue or submit a pull request, but a new version of play2-reactivemongo-mocks would be necessary to properly solve this.
A possible solution
Exclude specs2 from the play2-reactivemongo dependencies:
libraryDependencies ++= Seq(
"org.reactivemongo" %% "play2-reactivemongo" % "0.11.10" exclude("org.specs2", "*"),
"com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.27" exclude("org.specs2", "*"),
cache,
specs2 % Test,
"org.specs2" %% "specs2-scalacheck" % "3.6" % Test
)

SBT cannot append Seq[Object] to Seq[ModuleID]

SBT keeps failing with improper append errors, even though I'm using the exact format of build files I have seen numerous times.
build.sbt:
lazy val backend = (project in file("backend")).settings(
name := "backend",
libraryDependencies ++= (Dependencies.backend)
).dependsOn(api).aggregate(api)
dependencies.scala:
import sbt._
object Dependencies {
lazy val backend = common ++ metrics
val common = Seq(
"com.typesafe.akka" %% "akka-actor" % Version.akka,
"com.typesafe.akka" %% "akka-cluster" % Version.akka,
"org.scalanlp.breeze" %% "breeze" % Version.breeze,
"com.typesafe.akka" %% "akka-contrib" % Version.akka,
"org.scalanlp.breeze-natives" % Version.breeze,
"com.google.guava" % "guava" % "17.0"
)
val metrics = Seq("org.fusesource" % "sigar" % "1.6.4")
}
I'm not quite sure why SBT is complaining:
error: No implicit for Append.Values[Seq[sbt.ModuleID], Seq[Object]] found,
so Seq[Object] cannot be appended to Seq[sbt.ModuleID]
libraryDependencies ++= (Dependencies.backend)
^
Short Version (TL;DR)
There's an error in common: you want to replace this line
"org.scalanlp.breeze-natives" % Version.breeze,
with this line
"org.scalanlp" %% "breeze-natives" % Version.beeze,
Long Version
"org.scalanlp.breeze-natives" % Version.breeze is a GroupArtifactID not a ModuleID.
This causes common to become a Seq[Object] instead of a Seq[ModuleID].
And therefore also Dependencies.backend to be a Seq[Object]
Which ultimately can't be appended (via ++=) to libraryDependencies (defined as a SettingKey[Seq[ModuleID]]) because there is no available Append.Values[Seq[sbt.ModuleID], Seq[Object]].
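You can reproduce the widening in isolation; a quick sketch (any file with sbt._ in scope, e.g. project/Dependencies.scala):

import sbt._

object WideningDemo {
  val ok: ModuleID = "com.google.guava" % "guava" % "17.0" // three % segments: a ModuleID
  val oops = "org.scalanlp" % "breeze-natives"             // two segments: a GroupArtifactID
  val widened = Seq(ok, oops)                              // inferred as Seq[Object]
}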
One of common or metrics is not a Seq[sbt.ModuleID]. You could find out which with a type ascription:
val common: Seq[sbt.ModuleID] = ...
val metrics: Seq[sbt.ModuleID] = ...
My money is on common; this line doesn't have enough %s in it:
"org.scalanlp.breeze-natives" % Version.breeze