How to fix "No implementation for play.api.db.DBApi was bound" with Play 2.8.13 - PostgreSQL

We are upgrading the tech stack versions of my project because of the Log4j vulnerability:
SBT 1.6.2,
Scala 2.13.6,
Play (Scala) 2.8.13
I used the dependencies below to connect to the database, and the DB configuration is in the application.conf file. I am also using a CustomApplicationLoader.scala file:
"com.typesafe.play" %% "play-slick" % "5.0.0",
"org.postgresql" % "postgresql" % "9.4-1206-jdbc42",
"com.typesafe.play" %% "play-slick-evolutions" % "5.0.0" % Test
import play.api.ApplicationLoader
import play.api.inject.guice.{GuiceApplicationBuilder, GuiceApplicationLoader}

class CustomApplicationLoader extends GuiceApplicationLoader {
  override protected def builder(context: ApplicationLoader.Context): GuiceApplicationBuilder = {
    super.builder(context).disableCircularProxies(true)
  }
}
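For a custom loader like this to be picked up, application.conf also has to reference it; the entry below assumes the class sits in the default package, as in the snippet above:

# assumes CustomApplicationLoader has no package declaration
play.application.loader = "CustomApplicationLoader"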
But I'm facing the issue below.
How can I resolve this? Please suggest a solution. Thank you

Thank you all. After adding the libraryDependencies += "com.typesafe.play" %% "play-guice" % "2.8.15" dependency and adding the configuration below to application.conf, my issue was resolved:
play.evolutions.enabled = false
play.evolutions.db.default.autoApply = true
play.modules.disabled += "play.api.db.evolutions.EvolutionsModule"
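For reference, with play-slick 5.x the PostgreSQL connection itself usually lives under slick.dbs.default in application.conf. A minimal sketch (the URL, database name and credentials here are placeholders, not values from the question):

# placeholder values - adjust url/user/password to your environment
slick.dbs.default.profile = "slick.jdbc.PostgresProfile$"
slick.dbs.default.db.driver = "org.postgresql.Driver"
slick.dbs.default.db.url = "jdbc:postgresql://localhost:5432/mydb"
slick.dbs.default.db.user = "postgres"
slick.dbs.default.db.password = "postgres"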

Related

No implementation Slick DatabaseConfigProvider was bound

I have just created a dummy project and tried to integrate Play with Slick. I followed the official tutorial but unfortunately did not manage to run it properly.
Every time I try to run the app I get the following error:
play.api.UnexpectedException: Unexpected exception[ProvisionException: Unable to provision, see the following errors:
1) No implementation for play.api.db.slick.DatabaseConfigProvider was bound.
while locating play.api.db.slick.DatabaseConfigProvider
for the 1st parameter of com.reciper.repository.UserRepository.<init>(UserRepository.scala:13)
Here are my configs:
build.sbt
scalaVersion := "2.12.2"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % Test
libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.3"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.3"
libraryDependencies += "org.postgresql" % "postgresql" % "42.2.4"
application.conf
play.evolutions {
  autoApply = true
}
#Slick for Play
slick.profile = "slick.jdbc.PostgresProfile$"
slick.db.driver = "org.postgresql.Driver"
slick.db.url = "jdbc:postgresql://localhost:5432/reciper"
slick.db.user = "postgres"
slick.db.password = "postgres"
UserRepository.scala
@Singleton
class UserRepository @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
                              (implicit executionContext: ExecutionContext) extends HasDatabaseConfigProvider[PostgresProfile] { ..codehere.. }
HomeController.scala
@Singleton
class HomeController @Inject()(repo: UserRepository) {...}
plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.6.13")
I have been struggling with this for more than 3 days now and have lost hope that it will actually work. I've tried many options; none worked.
Do you know what is missing or wrong? Let me know if you need any other file.
Thanks!
The following configuration in application.conf works:
build.sbt
"com.typesafe.play" %% "play-slick" % "3.0.3"
application.conf
slick.dbs.default.driver="slick.driver.PostgresDriver$"
slick.dbs.default.db.driver="org.postgresql.Driver"
slick.dbs.default.db.url="jdbc:postgresql://ec2-54-217-243-228.eu-west-1.compute.amazonaws.com:5432/d344onl0761ji5"
slick.dbs.default.db.user=user
slick.dbs.default.db.password="pass"

SBT Verify Error caused by multiple protobuf 2/3 dependencies in spite of shading

I am struggling with VerifyErrors in the sample project below, which uses Scio/Bigtable/HBase. The dependency tree requires protobuf versions 2.5, 2.6.1, 3.0, and 3.1, and seems to default to 3.2. I used the shading support of sbt-assembly, but I'm not sure I'm using it correctly.
My build.sbt:
name := "scalalab"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"com.spotify" %% "scio-core" % "0.3.2",
"com.google.cloud.bigtable" % "bigtable-hbase-1.2" % "0.9.2",
"org.apache.hbase" % "hbase-client" % "1.2.5",
"org.apache.hbase" % "hbase-common" % "1.2.5",
"org.apache.hadoop" % "hadoop-common" % "2.7.3"
)
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.google.**" -> "shadeio.#1").inAll
)
assemblyMergeStrategy in assembly := {
case x => MergeStrategy.first
}
My Main.scala:
import com.google.cloud.bigtable.hbase.adapters.Adapters
import com.google.cloud.bigtable.hbase.adapters.read.DefaultReadHooks
import org.apache.hadoop.hbase.client.Scan

object Main {
  def main(args: Array[String]): Unit = {
    val scan = new Scan
    scan.setMaxVersions()
    scan.addFamily("family".getBytes)
    scan.setRowPrefixFilter("prefix".getBytes)
    val builder = Adapters.SCAN_ADAPTER.adapt(scan, new DefaultReadHooks)
    System.out.println(builder)
  }
}
outputs to:
scalalab git:(master) java -cp .:target/scala-2.11/scalalab-assembly-1.0.jar Main
Exception in thread "main" java.lang.VerifyError: Bad return type
Exception Details:
Location:
shadeio/cloud/bigtable/hbase/adapters/AppendAdapter.adapt(Lorg/apache/hadoop/hbase/client/Operation;)Lshadeio/bigtable/repackaged/com/google/protobuf/GeneratedMessageV3$Builder; #8: areturn
Reason:
Type 'shadeio/bigtable/v2/ReadModifyWriteRowRequest$Builder' (current frame, stack[0]) is not assignable to 'shadeio/bigtable/repackaged/com/google/protobuf/GeneratedMessageV3$Builder' (from method signature)
Current Frame:
bci: #8
flags: { }
locals: { 'shadeio/cloud/bigtable/hbase/adapters/AppendAdapter', 'org/apache/hadoop/hbase/client/Operation' }
stack: { 'shadeio/bigtable/v2/ReadModifyWriteRowRequest$Builder' }
Bytecode:
0x0000000: 2a2b c000 28b6 00a0 b0
at shadeio.cloud.bigtable.hbase.adapters.Adapters.<clinit>(Adapters.java:40)
at Main$.main(Main.scala:12)
at Main.main(Main.scala)
What am I doing wrong?
Thanks for your help.
The dependency issue is quite a pain for the Cloud Bigtable HBase client. We'll be fixing it in the next release. To fix your current problem, import "bigtable-hbase" instead of "bigtable-hbase-1.2".
Also, I would suggest using the latest version of the client if possible, which is 0.9.7.1.
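In build.sbt that swap would look roughly like this (artifact name and version per the suggestion above; double-check the exact coordinates on Maven Central):

// assumed coordinates for the non-1.2 artifact; verify before use
libraryDependencies += "com.google.cloud.bigtable" % "bigtable-hbase" % "0.9.7.1"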
Scio has a bigtable artifact, which you can include via:
libraryDependencies ++= Seq(
  "com.spotify" %% "scio-core" % "0.3.2",
  "com.spotify" %% "scio-bigtable" % "0.3.2"
)
Usually there is no need for the other dependencies when using Bigtable; in fact, they might cause issues.
Make sure to then import the Bigtable package via:
import com.spotify.scio.bigtable._
Check this BigtableExample.
Side note: you might want to give sbt-pack a try instead of sbt-assembly to fully leverage artifact caching. For an example sbt setup, check the template here.

Symbol 'type <none>.scalacheck.Shrink' is missing from the classpath

I have the following ScalaCheck unit test using Mockito:
import org.scalatest.mockito.MockitoSugar
import org.mockito.Mockito.when
import org.mockito.Mockito.verify
import org.scalatest.prop.PropertyChecks

class PlayerTest extends org.scalatest.FunSuite with MockitoSugar with PropertyChecks {
  test("doesn't accept anything but M") {
    val mockIOHandler = mock[IOHandler]
    val name = "me"
    val player = new Player(name)

    when(mockIOHandler.nextLine).thenReturn("m")

    val apiUser = new Player("player1")
    apiUser.chooseHand(mockIOHandler)

    verify(mockIOHandler).write("some value")
  }
}
In my build.sbt I have the following dependencies:
scalaVersion := "2.12.2"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
// https://mvnrepository.com/artifact/org.mockito/mockito-core
libraryDependencies += "org.mockito" % "mockito-core" % "1.8.5"
When compiling it, I get this error:
Error:(12, 41) Symbol 'type <none>.scalacheck.Shrink' is missing from the classpath.
This symbol is required by 'value org.scalatest.prop.GeneratorDrivenPropertyChecks.shrA'.
Make sure that type Shrink is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'GeneratorDrivenPropertyChecks.class' was compiled against an incompatible version of <none>.scalacheck.
test("doesn't accept anything but M") {
Any idea what could be wrong?
Adding ScalaCheck did the trick for me:
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.+"
lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.+"
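Declaring those vals alone does not change the build; they still need to be added to libraryDependencies, for example like this (the Test scoping is my assumption about the usual setup):

// Test scoping assumed; adjust if the property checks run elsewhere
libraryDependencies ++= Seq(scalaTest % Test, scalaCheck % Test)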

Compilation errors with spark cassandra connector and SBT

I'm trying to get the DataStax Spark Cassandra connector working. I've created a new SBT project in IntelliJ and added a single class. The class and my sbt file are given below. Creating Spark contexts seems to work; however, the moment I uncomment the line where I try to create a cassandraTable, I get the following compilation error:
Error:scalac: bad symbolic reference. A signature in CassandraRow.class refers to term catalyst
in package org.apache.spark.sql which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling CassandraRow.class.
SBT is kind of new to me, and I would appreciate any help in understanding what this error means (and of course, how to resolve it).
name := "cassySpark1"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.1.0-alpha2" withSources() withJavadoc()
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
And my class:
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

object HelloWorld {
  def main(args: Array[String]): Unit = {
    System.setProperty("spark.cassandra.query.retry.count", "1")

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "cassandra-hostname")
      .set("spark.cassandra.username", "cassandra")
      .set("spark.cassandra.password", "cassandra")

    val sc = new SparkContext("local", "testingCassy", conf)

    //val foo = sc.cassandraTable("keyspace name", "table name")
    val rdd = sc.parallelize(1 to 100)
    val sum = rdd.reduce(_ + _)
    println(sum)
  }
}
You need to add spark-sql to the dependencies list:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.1.0"
Add the library dependency in your project's pom.xml file (for Maven builds). It seems they have changed the location of the Vector.class dependency in the recent refactoring.

Error in hello world spray app with scala 2.11

I'm trying to get a simple "hello world" server running using spray with scala 2.11:
import spray.routing.SimpleRoutingApp
import akka.actor.ActorSystem

object SprayTest extends App with SimpleRoutingApp {
  implicit val system = ActorSystem("my-system")

  startServer(interface = "localhost", port = 8080) {
    path("hello") {
      get {
        complete {
          <h1>Say hello to spray</h1>
        }
      }
    }
  }
}
However, I receive the following compile errors:
Multiple markers at this line
- not found: value port
- bad symbolic reference to spray.can encountered in class file 'SimpleRoutingApp.class'. Cannot
access term can in package spray. The current classpath may be missing a definition for spray.can, or
SimpleRoutingApp.class may have been compiled against a version that's incompatible with the one
found on the current classpath.
- not found: value interface
Does anyone know what might be the issue? BTW, I'm very new to spray and actors, so I lack a lot of intuition for how spray and actors work (that's why I'm doing this simple tutorial).
Finally found the answer myself. I needed to add the spray-can dependency to my pom file. Leaving this question and answer in case anyone else runs into the same problem.
SBT example:
scalaVersion := "2.10.4"
val akkaVersion = "2.3.6"
val sprayVersion = "1.3.2"
resolvers ++= Seq(
"Spray Repository" at "http://repo.spray.io/"
)
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"io.spray" %% "spray-can" % sprayVersion,
"io.spray" %% "spray-routing" % sprayVersion
)