Scala Slick - bi-directional mapping operator cannot be found - postgresql

Using Slick version 3.1.1 (http://slick.lightbend.com/download/), I want to create a projection that maps to my case class. However, the problem I keep getting is that the operator for bi-directional mapping cannot be found. Here is a quick look at the dependencies defined in the .sbt file:
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "3.1.1",
  "org.slf4j" % "slf4j-nop" % "1.6.4"
)
The case class and the Table[T] subclass I've defined are as follows:
import slick.driver.PostgresDriver._
import slick.lifted._

case class Log(created_at: String, file: String, line: String, systemName: String, systemVersion: String, networkStatus: Boolean, message: String)

class LogTable(tag: Tag) extends Table[Log](tag, "log") {
  def created_at = column[String]("created_at")
  def file = column[String]("file")
  def line = column[String]("line")
  def systemName = column[String]("systemName")
  def systemVersion = column[String]("systemVersion")
  def networkStatus = column[Boolean]("networkStatus")
  def message = column[String]("message")

  def * = (created_at, file, line, systemName, systemVersion, networkStatus, message) <> (Log.tupled, Log.unapply)
}
Does anyone have an idea as to why I am not able to use the <> operator?
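For what it's worth, the likely cause: in Slick 3.x the <> extension method comes from implicits defined in the profile's api object, so importing the driver's members directly does not bring it into scope. A minimal sketch of the usual fix, assuming Slick 3.1.x:
// Import the lifted embedding through the profile's api object;
// this also provides Table, Tag and column, so slick.lifted._ becomes unnecessary.
import slick.driver.PostgresDriver.api._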

Related

Compare only string part inside case class

Is it possible to compare only part of a string in a property of a case class, using the Scala specs2 library? Is there a way to write a matcher like this?
CaseClass(property = Prop("very long string")) must beEqualTo(CaseClass(property = Prop("%long%")))
Try case class matchers, like so:
foo must matchA[Foo].property(_ must =~("long"))
Here is a working example:
import scala.language.experimental.macros
import org.specs2._
import org.specs2.matcher.MatcherMacros

class QuickStartSpec extends Specification with MatcherMacros { def is = s2"""
  The 'Case class matchers' should
    match on a part of a string $e1
  """

  def e1 = {
    case class Foo(property: String)
    val foo = Foo(property = "very long string")
    foo must matchA[Foo].property(_ must =~("long"))
  }
}
with the following dependencies:
libraryDependencies ++= Seq(
  "org.specs2" %% "specs2-core" % "4.6.0" % "test",
  "org.specs2" %% "specs2-matcher-extra" % "4.6.0" % "test"
)
With standard matchers:
myCaseInst.property.myString must be matching(".*long.*")
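For completeness, here is the standard-matcher route in a runnable spec, mirroring the Foo example above (a sketch using only specs2-core):
// Regex match on the extracted field; no MatcherMacros or matcher-extra module needed
import org.specs2.mutable.Specification

class StringPartSpec extends Specification {
  "property" should {
    "contain 'long'" in {
      case class Foo(property: String)
      val foo = Foo(property = "very long string")
      foo.property must be matching(".*long.*")
    }
  }
}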

ambiguous implicit values: match expected type cats.derived.MkShow[A]: show cats:kittens

I'm trying to create a Show instance for my custom Config class.
The build.sbt file is:
name := "circe-demo"
version := "0.1"
scalaVersion := "2.11.12"
resolvers += Resolver.bintrayRepo("ovotech", "maven")
libraryDependencies += "io.circe" %% "circe-core" % "0.11.0"
libraryDependencies += "io.circe" %% "circe-parser" % "0.11.0"
libraryDependencies += "io.circe" %% "circe-generic" % "0.11.0"
libraryDependencies += "org.typelevel" %% "kittens" % "1.2.0"
libraryDependencies ++= Seq(
  "is.cir" %% "ciris-cats",
  "is.cir" %% "ciris-cats-effect",
  "is.cir" %% "ciris-core",
  "is.cir" %% "ciris-enumeratum",
  "is.cir" %% "ciris-refined"
).map(_ % "0.12.1")
The complete code is:
import enumeratum.{Enum, EnumEntry}

sealed abstract class AppEnvironment extends EnumEntry

object AppEnvironment extends Enum[AppEnvironment] {
  case object Local extends AppEnvironment
  case object Testing extends AppEnvironment
  case object Production extends AppEnvironment

  override val values: Vector[AppEnvironment] =
    findValues.toVector
}
import java.net.InetAddress
import scala.concurrent.duration.Duration
final case class ApiConfig(host: InetAddress, port: Int, apiKey: String, timeout: Duration)
import java.net.InetAddress
import cats.Show
import cats.derived.semi
import ciris.config.loader.AppEnvironment.{Local, Production, Testing}
import enumeratum.EnumEntry
import eu.timepit.refined.auto._
import eu.timepit.refined.types.string.NonEmptyString
import scala.concurrent.duration._

final case class Config(appName: NonEmptyString, environment: AppEnvironment, api: ApiConfig)

object Config {
  implicit val showConfig: Show[Config] = {
    implicit val showDuration: Show[Duration] =
      Show.fromToString

    implicit val showInetAddress: Show[InetAddress] =
      Show.fromToString

    implicit def showEnumEntry[E <: EnumEntry]: Show[E] =
      Show.show(_.entryName)

    // Show.show[Config](x => s"api = ${x.api} appName = ${x.appName} environment ${x.environment}")
    semi.show
  }
}
semi.show in the above code throws the exception below:
[error] /Users/rajkumar.natarajan/Documents/Coding/kafka_demo/circe-demo/src/main/scala/ciris/config/loader/Config.scala:32:5: ambiguous implicit values:
[error] both value emptyProductDerivedShow in trait MkShowDerivation of type => cats.derived.MkShow[shapeless.HNil]
[error] and method emptyCoproductDerivedShow in trait MkShowDerivation of type => cats.derived.MkShow[shapeless.CNil]
[error] match expected type cats.derived.MkShow[A]
[error] show
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error]
I'm new to functional programming with cats. How can I resolve this exception?
Unfortunately, error reporting when such complicated implicits and macros are involved is far from perfect. The message you see actually means that some implicits required by the real generator (MkShow.genericDerivedShowProduct in this case) have not been found, and the search fell back to some more basic instances where there is an ambiguity. The missing pieces are mostly very basic, such as an implicit Show[Int] or Show[String].

The simplest way to get them all is to import cats.implicits._, but that also brings in catsStdShowForDuration, which is a Show[Duration]. Since its implementation is really the same as your custom one, it is easier to remove your custom one. One more thing that is missing is Show[NonEmptyString], and it is easy to create one:
implicit def showNonEmptyString: Show[NonEmptyString] = Show.show(nes => nes)
To sum up, when I define your showConfig as
implicit val showConfig: Show[Config] = {
  import cats.implicits._

  // is already defined in cats.implicits._
  // implicit val showDuration: Show[Duration] = Show.fromToString

  implicit val showInetAddress: Show[InetAddress] = Show.fromToString
  implicit def showEnumEntry[E <: EnumEntry]: Show[E] = Show.show(_.entryName)
  implicit def showNonEmptyString: Show[NonEmptyString] = Show.show(nes => nes)

  // Show.show[Config](x => s"api = ${x.api} appName = ${x.appName} environment ${x.environment}")
  semi.show
}
it compiles for me.
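For reference, a quick way to exercise the derived instance (the sample values here are hypothetical):
// Hypothetical sample values, only to exercise the derived Show[Config]
import cats.Show
import eu.timepit.refined.auto._
import scala.concurrent.duration._

val config = Config(
  appName = "circe-demo", // refined's auto import lifts the non-empty literal to NonEmptyString
  environment = AppEnvironment.Local,
  api = ApiConfig(java.net.InetAddress.getLoopbackAddress, 8080, "secret", 5.seconds)
)
println(Show[Config].show(config))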
P.S. Is there any good reason why you put your AppEnvironment under the ciris.* package? Generally, putting your custom code into the packages of a third-party library is an easy way to mess things up.

Scala Slick compose column type mappers

I'm using slick-pg, which adds support (via implicits) for the List and DateTime types in Slick.
Unfortunately I cannot use List[DateTime]: Slick does not understand the composition of those types, although I've checked that both work correctly on their own (for example, List[Int] and DateTime).
Is there a way to easily compose those two implicits?
1. Try to add:
implicit def dateTimeList =
  MappedColumnType.base[List[DateTime], List[Timestamp]](
    _.map(dt => new Timestamp(dt.getMillis)),
    _.map(ts => new DateTime(ts.getTime))
  )
Just in case, the whole code that compiles:
import java.sql.Timestamp
import org.joda.time.DateTime
import slick.jdbc.PostgresProfile.api._
import slick.lifted.ProvenShape
import slick.basic.Capability
import slick.jdbc.JdbcCapabilities
import com.github.tototoshi.slick.PostgresJodaSupport._
import com.github.tminglei.slickpg._

object App {
  trait MyPostgresProfile extends ExPostgresProfile
    with PgArraySupport
    with PgDate2Support
    with PgRangeSupport
    with PgHStoreSupport
    // with PgPlayJsonSupport
    with PgSearchSupport
    // with PgPostGISSupport
    with PgNetSupport
    with PgLTreeSupport {

    def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"

    // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
    override protected def computeCapabilities: Set[Capability] =
      super.computeCapabilities + /*JdbcProfile.capabilities.insertOrUpdate*/ JdbcCapabilities.insertOrUpdate

    override val api = MyAPI

    object MyAPI extends API with ArrayImplicits
      with DateTimeImplicits
      // with JsonImplicits
      with NetImplicits
      with LTreeImplicits
      with RangeImplicits
      with HStoreImplicits
      with SearchImplicits
      with SearchAssistants {

      implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
      // implicit val playJsonArrayTypeMapper =
      //   new AdvancedArrayJdbcType[JsValue](pgjson,
      //     (s) => utils.SimpleArrayUtils.fromString[JsValue](Json.parse(_))(s).orNull,
      //     (v) => utils.SimpleArrayUtils.mkString[JsValue](_.toString())(v)
      //   ).to(_.toList)
    }
  }

  object MyPostgresProfile extends MyPostgresProfile

  import MyPostgresProfile.api._

  // This can be used instead of slick-joda-mapper library
  // implicit def dateTime =
  //   MappedColumnType.base[DateTime, Timestamp](
  //     dt => new Timestamp(dt.getMillis),
  //     ts => new DateTime(ts.getTime)
  //   )

  implicit def dateTimeList =
    MappedColumnType.base[List[DateTime], List[Timestamp]](
      _.map(dt => new Timestamp(dt.getMillis)),
      _.map(ts => new DateTime(ts.getTime))
    )

  case class Record(id: Int, name: String, friends: List[Int], registered: DateTime, visits: List[DateTime])

  class RecordTable(tag: Tag) extends Table[Record](tag, Some("public"), "records") {
    def id: Rep[Int] = column[Int]("id", O.PrimaryKey, O.AutoInc)
    def name: Rep[String] = column[String]("name")
    def friends: Rep[List[Int]] = column[List[Int]]("friends")
    def registered: Rep[DateTime] = column[DateTime]("registered")
    def visits: Rep[List[DateTime]] = column[List[DateTime]]("visits")

    def * : ProvenShape[Record] = (id, name, friends, registered, visits) <> (Record.tupled, Record.unapply)
  }

  val records: TableQuery[RecordTable] = TableQuery[RecordTable]
}
build.sbt
name := "slickdemo"
version := "0.1"
scalaVersion := "2.12.3"
libraryDependencies += "com.typesafe.slick" %% "slick" % "3.2.1"
libraryDependencies += "org.slf4j" % "slf4j-nop" % "1.7.25"
libraryDependencies += "com.typesafe.slick" %% "slick-hikaricp" % "3.2.1"
libraryDependencies += "org.postgresql" % "postgresql" % "42.1.4"
libraryDependencies += "com.github.tminglei" %% "slick-pg" % "0.15.3"
libraryDependencies += "joda-time" % "joda-time" % "2.9.9"
libraryDependencies += "org.joda" % "joda-convert" % "1.9.2"
libraryDependencies += "com.github.tototoshi" % "slick-joda-mapper_2.12" % "2.3.0"
Based on a related answer and the slick-pg documentation.
2. Alternatively, you can add:
implicit val dateTimeArrayTypeMapper =
  new AdvancedArrayJdbcType[DateTime]("timestamp",
    (s) => utils.SimpleArrayUtils.fromString[DateTime](DateTime.parse)(s).orNull,
    (v) => utils.SimpleArrayUtils.mkString[DateTime](_.toString)(v)
  ).to(_.toList)
after strListTypeMapper and playJsonArrayTypeMapper.
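With either mapper in scope, a List[DateTime] column behaves like any other mapped column. A small illustrative sketch against the records table defined above (the actions are hypothetical, not from the original answer):
// Illustrative DBIO actions exercising the List[DateTime] mapping
val now = DateTime.now()
val insertVisits = records += Record(0, "alice", List(1, 2), now, List(now, now.minusDays(1)))
val selectVisits = records.filter(_.name === "alice").map(_.visits).result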

Circe Encoders and Decoders with Http4s

I am trying to use http4s, circe and http4s-circe.
Below I am trying to use the auto derivation feature of circe.
import org.http4s.client.blaze.SimpleHttp1Client
import org.http4s.Status.ResponseClass.Successful
import io.circe.syntax._
import org.http4s._
import org.http4s.headers._
import org.http4s.circe._
import scalaz.concurrent.Task
import io.circe._

final case class Login(username: String, password: String)
final case class Token(token: String)

object JsonHelpers {
  import io.circe.generic.auto._

  implicit val loginEntityEncoder: EntityEncoder[Login] = jsonEncoderOf[Login]
  implicit val loginEntityDecoder: EntityDecoder[Login] = jsonOf[Login]
  implicit val tokenEntityEncoder: EntityEncoder[Token] = jsonEncoderOf[Token]
  implicit val tokenEntityDecoder: EntityDecoder[Token] = jsonOf[Token]
}

object Http4sTest2 extends App {
  import JsonHelpers._

  val url = "http://"
  val uri = Uri.fromString(url).valueOr(throw _)
  val list = List[Header](`Content-Type`(MediaType.`application/json`), `Accept`(MediaType.`application/json`))

  val request = Request(uri = uri, method = Method.POST)
    .withBody(Login("foo", "bar").asJson)
    .map{ r => r.replaceAllHeaders(list: _*) }.run

  val client = SimpleHttp1Client()
  val result = client.fetch[Option[Token]](request) {
    case Successful(response) => response.as[Token].map(Some(_))
    case _ => Task(Option.empty[Token])
  }.run

  println(result)
}
I get multiple instances of these two compiler errors:
Error:scalac: missing or invalid dependency detected while loading class file 'GenericInstances.class'.
Could not access type Secondary in object io.circe.Encoder,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'GenericInstances.class' was compiled against an incompatible version of io.circe.Encoder.
Error:(25, 74) could not find implicit value for parameter encoder: io.circe.Encoder[Login]
implicit val loginEntityEncoder : EntityEncoder[Login] = jsonEncoderOf[Login]
I was able to solve this. I had searched Google for the sbt circe dependency and copy-pasted the first result, which was circe 0.1; that is why things were not working for me.
I changed my dependencies to:
libraryDependencies ++= Seq(
  "org.http4s" %% "http4s-core" % http4sVersion,
  "org.http4s" %% "http4s-dsl" % http4sVersion,
  "org.http4s" %% "http4s-blaze-client" % http4sVersion,
  "org.http4s" %% "http4s-circe" % http4sVersion,
  "io.circe" %% "circe-core" % "0.7.0",
  "io.circe" %% "circe-generic" % "0.7.0"
)
and now automatic derivation works fine and I am able to compile the code below:
import org.http4s.client.blaze.SimpleHttp1Client
import org.http4s._
import org.http4s.headers._
import org.http4s.circe._
import scalaz.concurrent.Task
import io.circe.syntax._
import io.circe.generic.auto._
import org.http4s.Status.ResponseClass.Successful

case class Login(username: String, password: String)
case class Token(token: String)

object JsonHelpers {
  implicit val loginEntityEncoder: EntityEncoder[Login] = jsonEncoderOf[Login]
  implicit val loginEntityDecoder: EntityDecoder[Login] = jsonOf[Login]
  implicit val tokenEntityEncoder: EntityEncoder[Token] = jsonEncoderOf[Token]
  implicit val tokenEntityDecoder: EntityDecoder[Token] = jsonOf[Token]
}

object Http4sTest2 extends App {
  import JsonHelpers._

  val url = "http://"
  val uri = Uri.fromString(url).valueOr(throw _)
  val list = List[Header](`Content-Type`(MediaType.`application/json`), `Accept`(MediaType.`application/json`))

  val request = Request(uri = uri, method = Method.POST)
    .withBody(Login("foo", "bar").asJson)
    .map{ r => r.replaceAllHeaders(list: _*) }.run

  val client = SimpleHttp1Client()
  val result = client.fetch[Option[Token]](request) {
    case Successful(response) => response.as[Token].map(Some(_))
    case _ => Task(Option.empty[Token])
  }.run

  println(result)
}
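As a quick sanity check that the derivation works independently of http4s, the Login case class round-trips through circe with only circe-core and circe-generic on the classpath; a minimal sketch:
// Round-trip Login through circe to confirm the derived codecs
import io.circe.Decoder
import io.circe.generic.auto._
import io.circe.syntax._

val json = Login("foo", "bar").asJson      // encoder derived by circe-generic
println(json.noSpaces)                     // {"username":"foo","password":"bar"}
val back = Decoder[Login].decodeJson(json) // Right(Login(foo,bar))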

Misunderstanding error about ReactiveMongo

I defined the following class, which I want to model:
case class Record(
  recordKey: String,
  channels: Map[String, Channel]
)

object Record {
  implicit val RecordFormat = Json.format[Record]
}
Now, I want to get one object of this type from ReactiveMongo like this (in another class):
import com.typesafe.config.ConfigFactory

import scala.concurrent.duration._
import scala.concurrent.{Future, Await}
import scala.concurrent.ExecutionContext.Implicits.global

import reactivemongo.api._
import reactivemongo.api.collections.bson.BSONCollection
import reactivemongo.bson.BSONDocument

object Test {
  val collection = connect()
  val timeout = 10.seconds

  def connect(): BSONCollection = {
    val config = ConfigFactory.load()
    val driver = new MongoDriver
    val connection = driver.connection(List(config.getString("mongodb.uri")))
    val db = connection("/toto")
    db.collection("foo")
  }

  def findRecord(recordKey: String): Record = {
    return Test.collection
      .find(BSONDocument("recordKey" -> recordKey))
      .one[Record]
  }
}
But this code doesn't compile:
could not find implicit value for parameter reader: reactivemongo.bson.BSONDocumentReader[Record]
Could someone explain how to fix this issue?
I also tried this:
def findRecord(recordKey: String): Record = {
  val futureRecord: Future[Option[Record]] =
    Test.collection
      .find(BSONDocument("recordKey" -> recordKey))
      .one[Record]
  return Await.result(futureRecord, 10.seconds).getOrElse(null)
}
I have also added my build.sbt:
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.10" % "1.5.2",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.2",
  "org.slf4j" % "slf4j-api" % "1.7.13",
  "org.slf4j" % "slf4j-simple" % "1.7.13",
  "com.amazonaws" % "aws-java-sdk" % "1.10.12",
  "com.typesafe.play" % "play-json_2.10" % "2.4.6",
  "com.typesafe" % "config" % "1.3.0",
  "org.scalaj" %% "scalaj-http" % "2.2.1",
  "com.typesafe.akka" % "akka-actor_2.10" % "2.3.14",
  "org.reactivemongo" %% "reactivemongo" % "0.11.9",
  "com.github.nscala-time" %% "nscala-time" % "2.6.0"
)
Note that it is not a Play app.
You need to define a BSONDocumentReader for your case class Record (see the ReactiveMongo documentation). Much like Play JSON's Readers and Writers, ReactiveMongo needs to know how to convert back and forth between your domain object and a BSONDocument. As with Play JSON, you can write these instances out in a manual style (a BSONDocumentReader instance and a BSONDocumentWriter instance), customizing every detail and applying transformations. Similar in style to the Play JSON format you used above, ReactiveMongo also provides helpful macros to generate these instances for you.
For your Record class you would need to add an implicit val like this to your object:
import reactivemongo.bson._

implicit val recordHandler: BSONHandler[BSONDocument, Record] = Macros.handler[Record]
/* Or only one of these, if you're only ever writing or reading this data:
implicit val recordReader: BSONDocumentReader[Record] = Macros.reader[Record]
implicit val recordWriter: BSONDocumentWriter[Record] = Macros.writer[Record]
*/
I would say start with the macros and see if they meet your needs. If you need more control over the processing/transformation, you can define your own BSONDocumentReader and BSONDocumentWriter instances.
Updated Record class:
import play.api.libs.json._
import reactivemongo.bson._

case class Channel(label: String, amplitude: Double, position: Option[String])

object Channel {
  implicit val ChannelFormat = Json.format[Channel]
  implicit val channelHandler: BSONHandler[BSONDocument, Channel] = Macros.handler[Channel]
}

object RecordType extends Enumeration {
  type RecordType = Value
  val T1 = Value

  implicit val enumFormat = new Format[RecordType] {
    def reads(json: JsValue) = JsSuccess(RecordType.withName(json.as[String]))
    def writes(enum: RecordType) = JsString(enum.toString)
  }

  implicit object RecordTypeReader extends BSONDocumentReader[RecordType] {
    def read(doc: BSONDocument): RecordType =
      RecordType.withName(doc.getAs[String]("recordType").get)
  }

  implicit object RecordTypeWriter extends BSONDocumentWriter[RecordType] {
    def write(recordType: RecordType): BSONDocument = BSONDocument(
      "recordType" -> BSONString(recordType.toString)
    )
  }
}

case class Record(recordKey: String, recordType: RecordType.Value, channels: Map[String, Channel])

object Record {
  implicit val RecordFormat = Json.format[Record]
  implicit val recordHandler: BSONHandler[BSONDocument, Record] = Macros.handler[Record]
}
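With the macro-generated handler in implicit scope, the reader required by .one[Record] resolves. A minimal sketch of the blocking lookup from the question (assuming the Test object above; returning Option[Record] avoids the getOrElse(null)):
// Sketch only: assumes Test.collection and the implicit handler in object Record above
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import reactivemongo.bson.BSONDocument

def findRecord(recordKey: String): Option[Record] = {
  val futureRecord: Future[Option[Record]] =
    Test.collection
      .find(BSONDocument("recordKey" -> recordKey))
      .one[Record]
  Await.result(futureRecord, 10.seconds)
}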