Slick 3.2.x on an unsupported database

Let me revisit an already-asked question, as the answer there is not clear to a newbie.
I'm trying to get started with Play, Slick 3.2.3, and an unsupported database (RDB, to be precise). I started from the play-scala-isolated-slick-example from the Play site. The RDB database is not supported by Slick, so I tried to use the generic JDBC profile (one-size-fits-all, I thought):
package test.mydb.slick

import javax.inject.{Inject, Singleton}
import slick.driver.JdbcProfile
import slick.jdbc.JdbcBackend.Database
import test.mydb.{MyTblDAO, Tbl} // case class defined there
import scala.concurrent.{ExecutionContext, Future}
import scala.language.implicitConversions
import scala.reflect.ClassTag

@Singleton
class SlickMyTblDAO @Inject()(db: Database)(implicit ec: ExecutionContext)
  extends MyTblDAO with test.mydb.slick.Tables {

  // override val profile: JdbcProfile = _root_.slick.jdbc.JdbcProfile
  override val profile: JdbcProfile = slick.driver.JdbcProfile
  import profile.api._

  def lookup(id: String): Future[Option[MyTbl]] = { ... // and so on
This code does not compile, failing with:
type mismatch;
[error] found : slick.driver.JdbcProfile.type
[error] required: slick.driver.JdbcProfile
[error] (which expands to) slick.jdbc.JdbcProfile
[error] override val profile: JdbcProfile = slick.driver.JdbcProfile
[error] ^
I'm not sure I fully understand the root of the problem, but I guess one can't use the JDBC profile directly. The answer says that "other databases can be supported with a custom implementation of the trait slick.jdbc.JdbcProfile". Does that mean I need to implement a profile myself? Is that achievable for a beginner? I only need simple DML to start with; no DDL, no joins.

The error message is telling you that profile needs to extend the trait JdbcProfile, but you're passing it the companion object JdbcProfile, which does not extend the trait of the same name.
To answer your other question: yes, I'm afraid you would have to implement JdbcProfile yourself, and I believe that could be quite an undertaking for a newbie, because Slick's API is quite advanced.
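For reference, the supported profiles (e.g. slick.jdbc.H2Profile) are themselves traits extending JdbcProfile, each paired with a singleton object, so a custom profile follows the same shape. Below is a minimal sketch under that assumption; RDBProfile is a hypothetical name, and real support typically also needs overrides for capabilities, type mappings, and statement building wherever your database deviates from standard SQL:
import slick.jdbc.JdbcProfile

// Hypothetical skeleton of a custom profile. JdbcProfile provides
// defaults targeting standard SQL; override whatever RDB does differently.
trait RDBProfile extends JdbcProfile {
  // e.g. capabilities, columnTypes, statement builders ...
}

object RDBProfile extends RDBProfile
With such an object in place, override val profile: JdbcProfile = RDBProfile type-checks, because the object extends the trait.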

Related

Scala object load implicits

When defining a class that needs my Akka implicits from Main, I do this:
class Example()(implicit system: ActorSystem, materializer: ActorMaterializer)
However, I want to do exactly the same thing with an object. Is there a way to do this?
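For what it's worth, a minimal sketch of the usual workaround: an object has no constructor parameter list, so each method declares its own implicit parameters instead (Example and run are illustrative names):
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer

object Example {
  // Objects cannot take constructor parameters, so the implicits
  // are requested at each method that needs them:
  def run()(implicit system: ActorSystem, materializer: ActorMaterializer): Unit = {
    // system and materializer are in scope here
  }
}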

Scala: multiple imports don't compile

I'm new to Scala. I created a package object in my code:
package mypackage

import scala.language.implicitConversions
import org.apache.spark.SparkContext
import mypackage.spark.SparkContextFunctions

// The file declares package mypackage, so this package object
// defines the members of mypackage.spark.
package object spark {
  implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions =
    new SparkContextFunctions(sc)
}
I expect that when I use import mypackage.spark._, I will be able to use the methods from the SparkContextFunctions class. This approach works for me when I use only one imported package object, but it breaks when I add an additional import. For example:
import mypackage.spark._
import com.datastax.spark.connector._
com.datastax.spark.connector._ does the same thing for the org.apache.spark.SparkContext class. My code stops compiling, with an error that the method I use is not a member of SparkContext. When I change the order of the imports, the compiler starts seeing the methods from mypackage.spark._ and stops seeing the ones from com.datastax.spark.connector._.
Am I missing something? Or does Scala not support this?
Thanks.
If you need to use two classes named SparkContext at the same time, you can alias them:
import my.package.name.{SparkContext => MySparkContext}
import some.other.package.name.{SparkContext => OtherSparkContext}
Classes from the same package can be aliased in the same import:
import my.package.name.{SparkContext => MySparkContext, SomethingElse => MySomethingElse}
You may want to choose better names than MyXXX and OtherXXX.
The imports may conflict in two ways: either both use the name toSparkContextFunctions for the implicit conversion, or both provide extension methods with the same name (even with different signatures).
If neither is the case, there should be no problem. If one is, rename your methods, since you can't change the ones in com.datastax.spark.connector.
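If you only need one of the two conversions at a given call site, you can also hide the clashing name at the import site. A sketch, assuming the connector's conversion is likewise named toSparkContextFunctions:
// Hide the conflicting name from one package, import everything else:
import com.datastax.spark.connector.{toSparkContextFunctions => _, _}
import mypackage.spark._ // this toSparkContextFunctions now applies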

Testing Actors in Akka

When I run the basic example for testing actors:
class MySpec(_system: ActorSystem) extends TestKit(_system) with ImplicitSender
with WordSpec with MustMatchers with BeforeAndAfterAll {
I get this error:
class WordSpec needs to be a trait to be mixed in
What am I doing wrong?
In ScalaTest 2.0 there are both a class and a trait for WordSpec: the class is named WordSpec and the trait is WordSpecLike. So just use WordSpecLike instead of WordSpec:
class MySpec(_system: ActorSystem) extends TestKit(_system) with ImplicitSender
with WordSpecLike with MustMatchers with BeforeAndAfterAll {
In addition to what 1esha proposed, there's one more solution in the Akka documentation:
If for some reason it is a problem to inherit from TestKit due to it being a concrete class instead of a trait, there’s TestKitBase:
import akka.actor.ActorSystem
import akka.testkit.TestKitBase

class MyTest extends TestKitBase {
  implicit lazy val system = ActorSystem()

  // put your test code here ...

  shutdown(system)
}
The implicit lazy val system must be declared exactly like that (you can of course pass arguments to the actor system factory as needed) because trait TestKitBase needs the system during its construction.
As of ScalaTest 2.0, the Specs you mix in are now classes, not traits, which means you can't use them with Akka's TestKit: both are classes, and you can only extend one of them.
Alternatively, switch to ScalaTest 1.9.1. It's still supported and is the last version released before the change that broke things for Akka users. In 1.9.1, the specs are still traits.

Scala: import a trait in a worksheet

Everything is in the title: how do I import a trait into a worksheet in order to test it? I have a function in that trait that I want to test.
Just import it like you always would:
import com.mytrait.MyTrait
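Then you can exercise the trait's function by instantiating it inline in the worksheet. A sketch, assuming MyTrait has no abstract members (myFunction is a hypothetical name):
import com.mytrait.MyTrait

// Anonymous instance; add overrides in the braces if MyTrait
// declares abstract members.
val tester = new MyTrait {}
tester.myFunction()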

Alias library implicits in a package object

Say I have:
import org.scalatest.ShouldMatchers._
This brings a few implicit conversions into scope.
How can I alias them in a package object so that I can bring the implicits into scope with:
import code.ThePackageObject._
Apparently the ShouldMatchers object extends the ShouldMatchers trait (where the implicits are actually defined). This is a common idiom that allows you to simply mix the trait in where you need it. So you can simply mix ShouldMatchers (the trait) into your package object:
package object ThePackageObject extends ShouldMatchers
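After that, the import from the question brings the matcher implicits into scope. A quick sketch of usage, assuming the should syntax of that ScalaTest era:
import code.ThePackageObject._

// The ShouldMatchers implicits are now available:
"abc".length should be (3)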