Misunderstanding error about ReactiveMongo - mongodb

I defined the following class, which I want to model:
case class Record(
  recordKey: String,
  channels: Map[String, Channel]
)

object Record {
  implicit val RecordFormat = Json.format[Record]
}
Now, I want to fetch one object of this type from ReactiveMongo like this (in another class):
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global

import com.typesafe.config.ConfigFactory
import reactivemongo.api._
import reactivemongo.api.collections.bson.BSONCollection
import reactivemongo.bson.BSONDocument

object Test {

  val collection = connect()
  val timeout = 10.seconds

  def connect(): BSONCollection = {
    val config = ConfigFactory.load()
    val driver = new MongoDriver
    val connection = driver.connection(List(config.getString("mongodb.uri")))
    val db = connection("/toto")
    db.collection("foo")
  }

  def findRecord(recordKey: String): Future[Option[Record]] = {
    Test.collection
      .find(BSONDocument("recordKey" -> recordKey))
      .one[Record]
  }
}
But this code doesn't compile:
could not find implicit value for parameter reader: reactivemongo.bson.BSONDocumentReader[Record]
Could someone explain how to fix this issue?
I also tried this:
def findRecord(recordKey: String): Record = {
  val futureRecord: Future[Option[Record]] =
    Test.collection
      .find(BSONDocument("recordKey" -> recordKey))
      .one[Record]
  Await.result(futureRecord, 10.seconds).getOrElse(null)
}
I have also included my build.sbt:
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.10" % "1.5.2",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.2",
  "org.slf4j" % "slf4j-api" % "1.7.13",
  "org.slf4j" % "slf4j-simple" % "1.7.13",
  "com.amazonaws" % "aws-java-sdk" % "1.10.12",
  "com.typesafe.play" % "play-json_2.10" % "2.4.6",
  "com.typesafe" % "config" % "1.3.0",
  "org.scalaj" %% "scalaj-http" % "2.2.1",
  "com.typesafe.akka" % "akka-actor_2.10" % "2.3.14",
  "org.reactivemongo" %% "reactivemongo" % "0.11.9",
  "com.github.nscala-time" %% "nscala-time" % "2.6.0"
)
Note that it is not a Play app.

You need to define a BSONDocumentReader for your case class Record. Here is a link to the documentation. Much like Play JSON Reads and Writes, ReactiveMongo needs to know how to convert back and forth between your domain object and BSONDocument. As with Play JSON, you can write these instances out manually (a BSONDocumentReader instance and a BSONDocumentWriter instance), customizing every detail and applying transformations. And, similar in style to the Play JSON format you used above, ReactiveMongo provides macros that generate these instances for you.
For your Record class you would need to add an implicit val like this to your object:
import reactivemongo.bson._

implicit val recordHandler: BSONHandler[BSONDocument, Record] = Macros.handler[Record]

/* Or only one of these, if you're only ever writing or reading this data:
implicit val recordReader: BSONDocumentReader[Record] = Macros.reader[Record]
implicit val recordWriter: BSONDocumentWriter[Record] = Macros.writer[Record]
*/
I would say start with the macros and see whether they meet your needs. If you need more control over the processing/transformation, you can define your own BSONDocumentReader and BSONDocumentWriter instances, along the lines of the sketch below.
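For illustration, a hand-written reader/writer pair for the Channel case class used further down in this answer might look like this (my own sketch; it is not needed if the macros work for you):

import reactivemongo.bson._

implicit object ChannelReader extends BSONDocumentReader[Channel] {
  def read(doc: BSONDocument): Channel = Channel(
    label = doc.getAs[String]("label").get,
    amplitude = doc.getAs[Double]("amplitude").get,
    position = doc.getAs[String]("position")
  )
}

implicit object ChannelWriter extends BSONDocumentWriter[Channel] {
  def write(c: Channel): BSONDocument = BSONDocument(
    "label" -> c.label,
    "amplitude" -> c.amplitude,
    "position" -> c.position // a None value is simply omitted from the document
  )
}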
Updated Record class:
import play.api.libs.json._
import reactivemongo.bson._

case class Channel(label: String, amplitude: Double, position: Option[String])

object Channel {
  implicit val ChannelFormat = Json.format[Channel]
  implicit val channelHandler: BSONHandler[BSONDocument, Channel] = Macros.handler[Channel]
}

object RecordType extends Enumeration {
  type RecordType = Value
  val T1 = Value

  implicit val enumFormat = new Format[RecordType] {
    def reads(json: JsValue) = JsSuccess(RecordType.withName(json.as[String]))
    def writes(enum: RecordType) = JsString(enum.toString)
  }

  implicit object RecordTypeReader extends BSONDocumentReader[RecordType] {
    def read(doc: BSONDocument): RecordType =
      RecordType.withName(doc.getAs[String]("recordType").get)
  }

  implicit object RecordTypeWriter extends BSONDocumentWriter[RecordType] {
    def write(recordType: RecordType): BSONDocument = BSONDocument(
      "recordType" -> BSONString(recordType.toString)
    )
  }
}

case class Record(recordKey: String, recordType: RecordType.Value, channels: Map[String, Channel])

object Record {
  implicit val RecordFormat = Json.format[Record]
  implicit val recordHandler: BSONHandler[BSONDocument, Record] = Macros.handler[Record]
}
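With recordHandler in scope, the lookup from the question should compile. A minimal blocking sketch, assuming the Test.collection defined in the question:

import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import reactivemongo.bson.BSONDocument

def findRecord(recordKey: String): Option[Record] = {
  val futureRecord: Future[Option[Record]] =
    Test.collection
      .find(BSONDocument("recordKey" -> recordKey))
      .one[Record]

  // Blocking only for demonstration; prefer mapping over the Future in real code.
  Await.result(futureRecord, 10.seconds)
}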

Related

Mapping a string to case classes using Scala play

Using the Scala Play JSON library, I'm attempting to parse the string:
var str = "{\"payload\": \"[{\\\"test\\\":\\\"123\\\",\\\"tester\\\":\\\"456\\\"}," +
"{\\\"test1\\\":\\\"1234\\\",\\\"tester2\\\":\\\"4567\\\"}]\"}";
into a list of Payload classes using the code below:
import play.api.libs.json._

object TestParse extends App {

  case class Payload(test: String, tester: String)

  object Payload {
    implicit val jsonFormat: Format[Payload] = Json.format[Payload]
  }

  var str = "{\"payload\": \"[{\\\"test\\\":\\\"123\\\",\\\"tester\\\":\\\"456\\\"}," +
    "{\\\"test1\\\":\\\"1234\\\",\\\"tester2\\\":\\\"4567\\\"}]\"}"

  println((Json.parse(str) \ "payload").as[List[Payload]])
}
build.sbt:
name := "akka-streams"

version := "0.1"

scalaVersion := "2.12.8"

lazy val akkaVersion = "2.5.19"
lazy val scalaTestVersion = "3.0.5"

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-stream" % akkaVersion,
  "com.typesafe.akka" %% "akka-stream-testkit" % akkaVersion,
  "com.typesafe.akka" %% "akka-testkit" % akkaVersion,
  "org.scalatest" %% "scalatest" % scalaTestVersion
)
// https://mvnrepository.com/artifact/com.typesafe.play/play-json
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.10.0-RC6"
It fails with the exception:
Exception in thread "main" play.api.libs.json.JsResultException: JsResultException(errors:List((,List(JsonValidationError(List("" is not an object),WrappedArray())))))
Is the case class structure incorrect?
I've updated the code to:
import play.api.libs.json._

object TestParse extends App {

  import TestParse.Payload.jsonFormat

  object Payload {
    implicit val jsonFormat: Format[RootInterface] = Json.format[RootInterface]
  }

  case class Payload(
    test: Option[String],
    tester: Option[String]
  )

  case class RootInterface(
    payload: List[Payload]
  )

  val str = """{"payload": [{"test":"123","tester":"456"},{"test1":"1234","tester2":"4567"}]}"""

  println(Json.parse(str).as[RootInterface])
}
which returns the error:
No instance of play.api.libs.json.Format is available for scala.collection.immutable.List[TestParse.Payload] in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
implicit val jsonFormat: Format[RootInterface] = Json.format[RootInterface]
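The hint in that error is about declaration order: a Format[Payload] has to be defined, and be visible, before Format[RootInterface] is derived, because the latter needs an implicit Format for List[Payload]. A minimal sketch that compiles against the simplified (un-escaped) input from the update above, assuming Play JSON only:

import play.api.libs.json._

object TestParseNested extends App {

  case class Payload(test: Option[String], tester: Option[String])
  case class RootInterface(payload: List[Payload])

  // Payload's Format must be declared before RootInterface's is derived,
  // since Json.format[RootInterface] looks up an implicit Format for List[Payload].
  implicit val payloadFormat: Format[Payload] = Json.format[Payload]
  implicit val rootFormat: Format[RootInterface] = Json.format[RootInterface]

  val str = """{"payload": [{"test":"123","tester":"456"},{"test1":"1234","tester2":"4567"}]}"""

  // Prints RootInterface(List(Payload(Some(123),Some(456)), Payload(None,None)))
  println(Json.parse(str).as[RootInterface])
}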
The following performs the task, but there are cleaner solutions:
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Flow, Sink, Source}
import org.scalatest.Assertions._
import spray.json.{JsObject, JsonParser}

import scala.concurrent.Await
import scala.concurrent.duration.DurationInt

object TestStream extends App {

  implicit val actorSystem = ActorSystem()
  // Needed in Akka 2.5.x to materialize and run the stream.
  implicit val materializer = ActorMaterializer()

  val mapperFlow = Flow[JsObject].map(x => {
    x.fields.get("payload").get.toString()
      .replace("{", "")
      .replace("}", "")
      .replace("[", "")
      .replace("]", "")
      .replace("\"", "")
      .replace("\\", "")
      .split(":").map(m => m.split(","))
      .toList
      .flatten
      .grouped(4)
      .map(m => Test(m(1), m(3).toDouble))
      .toList
  })

  val str = """{"payload": [{"test":"123","tester":"456"},{"test":"1234","tester":"4567"}]}"""

  case class Test(test: String, tester: Double)

  val graph = Source.repeat(JsonParser(str).asJsObject())
    .take(3)
    .via(mapperFlow)
    .mapConcat(identity)
    .runWith(Sink.seq)

  val result = Await.result(graph, 3.seconds)
  println(result)

  assert(result.length == 6)
  assert(result(0).test == "123")
  assert(result(0).tester == 456)
  assert(result(1).test == "1234")
  assert(result(1).tester == 4567)
  assert(result(2).test == "123")
  assert(result(2).tester == 456)
  assert(result(3).test == "1234")
  assert(result(3).tester == 4567)
}
Alternative, idiomatic Scala answers are welcome.
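One cleaner option, sketched here on the assumption that the original input really is double-encoded (the payload field holds a JSON string that itself contains JSON): extract payload as a String, then parse that string a second time with plain Play JSON.

import play.api.libs.json._

object TestParseTwice extends App {

  case class Payload(test: Option[String], tester: Option[String])
  implicit val payloadFormat: Format[Payload] = Json.format[Payload]

  val str = "{\"payload\": \"[{\\\"test\\\":\\\"123\\\",\\\"tester\\\":\\\"456\\\"}," +
    "{\\\"test1\\\":\\\"1234\\\",\\\"tester2\\\":\\\"4567\\\"}]\"}"

  // First parse: the outer object, whose payload value is a JsString.
  val payloadString = (Json.parse(str) \ "payload").as[String]

  // Second parse: the JSON carried inside that string.
  val payloads = Json.parse(payloadString).as[List[Payload]]

  // Prints List(Payload(Some(123),Some(456)), Payload(None,None)); the second element
  // uses unknown keys (test1/tester2), so both Option fields come out as None.
  println(payloads)
}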

Scala Slick compose column type mappers

I'm using slick-pg, which adds support (via implicits) for the List and DateTime types in Slick.
Unfortunately I cannot use List[DateTime]: Slick does not understand the composition of those types, although I've checked that both work correctly on their own (for example, List[Int] and DateTime).
Is there a way to easily compose those two implicits?
1. Try to add
implicit def dateTimeList =
  MappedColumnType.base[List[DateTime], List[Timestamp]](
    _.map(dt => new Timestamp(dt.getMillis)),
    _.map(ts => new DateTime(ts.getTime))
  )
Just in case, the whole code that compiles:
import java.sql.Timestamp

import org.joda.time.DateTime
import slick.jdbc.PostgresProfile.api._
import slick.lifted.ProvenShape
import slick.basic.Capability
import slick.jdbc.JdbcCapabilities
import com.github.tototoshi.slick.PostgresJodaSupport._
import com.github.tminglei.slickpg._

object App {

  trait MyPostgresProfile extends ExPostgresProfile
    with PgArraySupport
    with PgDate2Support
    with PgRangeSupport
    with PgHStoreSupport
    // with PgPlayJsonSupport
    with PgSearchSupport
    // with PgPostGISSupport
    with PgNetSupport
    with PgLTreeSupport {

    def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"

    // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
    override protected def computeCapabilities: Set[Capability] =
      super.computeCapabilities + /*JdbcProfile.capabilities.insertOrUpdate*/ JdbcCapabilities.insertOrUpdate

    override val api = MyAPI

    object MyAPI extends API with ArrayImplicits
      with DateTimeImplicits
      // with JsonImplicits
      with NetImplicits
      with LTreeImplicits
      with RangeImplicits
      with HStoreImplicits
      with SearchImplicits
      with SearchAssistants {

      implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
      // implicit val playJsonArrayTypeMapper =
      //   new AdvancedArrayJdbcType[JsValue](pgjson,
      //     (s) => utils.SimpleArrayUtils.fromString[JsValue](Json.parse(_))(s).orNull,
      //     (v) => utils.SimpleArrayUtils.mkString[JsValue](_.toString())(v)
      //   ).to(_.toList)
    }
  }

  object MyPostgresProfile extends MyPostgresProfile

  import MyPostgresProfile.api._

  // This can be used instead of the slick-joda-mapper library
  // implicit def dateTime =
  //   MappedColumnType.base[DateTime, Timestamp](
  //     dt => new Timestamp(dt.getMillis),
  //     ts => new DateTime(ts.getTime)
  //   )

  implicit def dateTimeList =
    MappedColumnType.base[List[DateTime], List[Timestamp]](
      _.map(dt => new Timestamp(dt.getMillis)),
      _.map(ts => new DateTime(ts.getTime))
    )

  case class Record(id: Int, name: String, friends: List[Int], registered: DateTime, visits: List[DateTime])

  class RecordTable(tag: Tag) extends Table[Record](tag, Some("public"), "records") {
    def id: Rep[Int] = column[Int]("id", O.PrimaryKey, O.AutoInc)
    def name: Rep[String] = column[String]("name")
    def friends: Rep[List[Int]] = column[List[Int]]("friends")
    def registered: Rep[DateTime] = column[DateTime]("registered")
    def visits: Rep[List[DateTime]] = column[List[DateTime]]("visits")

    def * : ProvenShape[Record] = (id, name, friends, registered, visits) <> (Record.tupled, Record.unapply)
  }

  val records: TableQuery[RecordTable] = TableQuery[RecordTable]
}
build.sbt
name := "slickdemo"
version := "0.1"
scalaVersion := "2.12.3"
libraryDependencies += "com.typesafe.slick" %% "slick" % "3.2.1"
libraryDependencies += "org.slf4j" % "slf4j-nop" % "1.7.25"
libraryDependencies += "com.typesafe.slick" %% "slick-hikaricp" % "3.2.1"
libraryDependencies += "org.postgresql" % "postgresql" % "42.1.4"
libraryDependencies += "com.github.tminglei" %% "slick-pg" % "0.15.3"
libraryDependencies += "joda-time" % "joda-time" % "2.9.9"
libraryDependencies += "org.joda" % "joda-convert" % "1.9.2"
libraryDependencies += "com.github.tototoshi" % "slick-joda-mapper_2.12" % "2.3.0"
Based on this answer and the slick-pg documentation.
2. Alternatively you can add
implicit val dateTimeArrayTypeMapper =
  new AdvancedArrayJdbcType[DateTime]("timestamp",
    (s) => utils.SimpleArrayUtils.fromString[DateTime](DateTime.parse)(s).orNull,
    (v) => utils.SimpleArrayUtils.mkString[DateTime](_.toString)(v)
  ).to(_.toList)
after strListTypeMapper and playJsonArrayTypeMapper.
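For completeness, a minimal usage sketch of the mapped column (my own addition, assuming the records table and the dateTimeList mapper from option 1 above, a configured Database instance named db, and MyPostgresProfile.api._ in scope):

import scala.concurrent.ExecutionContext.Implicits.global

val now = org.joda.time.DateTime.now()

// Insert a row whose visits column is a List[DateTime], then read it back.
val insertAndRead = for {
  _      <- records += Record(0, "alice", List(2, 3), now, List(now.minusDays(1), now))
  visits <- records.filter(_.name === "alice").map(_.visits).result.headOption
} yield visits

// db.run(insertAndRead) yields a Future[Option[List[DateTime]]].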

Circe Encoders and Decoders with Http4s

I am trying to use http4s, circe and http4s-circe.
Below I am trying to use the auto derivation feature of circe.
import org.http4s.client.blaze.SimpleHttp1Client
import org.http4s.Status.ResponseClass.Successful
import io.circe.syntax._
import org.http4s._
import org.http4s.headers._
import org.http4s.circe._
import scalaz.concurrent.Task
import io.circe._

final case class Login(username: String, password: String)
final case class Token(token: String)

object JsonHelpers {
  import io.circe.generic.auto._
  implicit val loginEntityEncoder: EntityEncoder[Login] = jsonEncoderOf[Login]
  implicit val loginEntityDecoder: EntityDecoder[Login] = jsonOf[Login]
  implicit val tokenEntityEncoder: EntityEncoder[Token] = jsonEncoderOf[Token]
  implicit val tokenEntityDecoder: EntityDecoder[Token] = jsonOf[Token]
}

object Http4sTest2 extends App {
  import JsonHelpers._

  val url = "http://"
  val uri = Uri.fromString(url).valueOr(throw _)
  val list = List[Header](`Content-Type`(MediaType.`application/json`), `Accept`(MediaType.`application/json`))

  val request = Request(uri = uri, method = Method.POST)
    .withBody(Login("foo", "bar").asJson)
    .map { r => r.replaceAllHeaders(list: _*) }
    .run

  val client = SimpleHttp1Client()

  val result = client.fetch[Option[Token]](request) {
    case Successful(response) => response.as[Token].map(Some(_))
    case _ => Task(Option.empty[Token])
  }.run

  println(result)
}
I get multiple instances of these two compiler errors:
Error:scalac: missing or invalid dependency detected while loading class file 'GenericInstances.class'.
Could not access type Secondary in object io.circe.Encoder,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'GenericInstances.class' was compiled against an incompatible version of io.circe.Encoder.
Error:(25, 74) could not find implicit value for parameter encoder: io.circe.Encoder[Login]
implicit val loginEntityEncoder : EntityEncoder[Login] = jsonEncoderOf[Login]
I was able to solve this. I had searched Google for the sbt circe dependency and copy-pasted the first result, which was circe 0.1, and that version mismatch is why things were not working for me.
I changed my dependencies to:
libraryDependencies ++= Seq(
  "org.http4s" %% "http4s-core" % http4sVersion,
  "org.http4s" %% "http4s-dsl" % http4sVersion,
  "org.http4s" %% "http4s-blaze-client" % http4sVersion,
  "org.http4s" %% "http4s-circe" % http4sVersion,
  "io.circe" %% "circe-core" % "0.7.0",
  "io.circe" %% "circe-generic" % "0.7.0"
)
and now automatic derivation works fine and I am able to compile the code below
import org.http4s.client.blaze.SimpleHttp1Client
import org.http4s._
import org.http4s.headers._
import org.http4s.circe._
import scalaz.concurrent.Task
import io.circe.syntax._
import io.circe.generic.auto._
import org.http4s.Status.ResponseClass.Successful

case class Login(username: String, password: String)
case class Token(token: String)

object JsonHelpers {
  implicit val loginEntityEncoder: EntityEncoder[Login] = jsonEncoderOf[Login]
  implicit val loginEntityDecoder: EntityDecoder[Login] = jsonOf[Login]
  implicit val tokenEntityEncoder: EntityEncoder[Token] = jsonEncoderOf[Token]
  implicit val tokenEntityDecoder: EntityDecoder[Token] = jsonOf[Token]
}

object Http4sTest2 extends App {
  import JsonHelpers._

  val url = "http://"
  val uri = Uri.fromString(url).valueOr(throw _)
  val list = List[Header](`Content-Type`(MediaType.`application/json`), `Accept`(MediaType.`application/json`))

  val request = Request(uri = uri, method = Method.POST)
    .withBody(Login("foo", "bar").asJson)
    .map { r => r.replaceAllHeaders(list: _*) }
    .run

  val client = SimpleHttp1Client()

  val result = client.fetch[Option[Token]](request) {
    case Successful(response) => response.as[Token].map(Some(_))
    case _ => Task(Option.empty[Token])
  }.run

  println(result)
}
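As a quick sanity check of the auto-derived codecs on their own (a sketch using only circe 0.7, independent of http4s):

import io.circe.generic.auto._
import io.circe.syntax._

object CodecCheck extends App {
  case class Login(username: String, password: String)

  // Round-trip through the derived Encoder and Decoder.
  val json = Login("foo", "bar").asJson
  println(json.noSpaces)  // {"username":"foo","password":"bar"}
  println(json.as[Login]) // Right(Login(foo,bar))
}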

Expect a specific instance for mock in Scalamock

Why can I not tell a mock that it should expect an instance of a class without explicitly giving the type? Here is what I mean:
val myClass = new MyClass(...)
val traitMock = mock[MyTrait]
(traitMock.mymethod _).expects(myClass).returning(12.3)
does not work, while
val myClass: MyClass = new MyClass(...)
val traitMock = mock[MyTrait]
(traitMock.mymethod _).expects(myClass).returning(12.3)
does work. How come the type cannot be inferred?
The testing part of my build.sbt is:
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.0.0" % "test"
    exclude("org.scala-lang", "scala-reflect")
    exclude("org.scala-lang.modules", "scala-xml")
)

libraryDependencies += "org.scalamock" %% "scalamock-scalatest-support" % "3.3.0" % "test"
Since I was asked for MyClass (it is SpacePoint here):
trait SpacePoint {
  val location: SpaceLocation
}

val sp = new SpacePoint {
  override val location: SpaceLocation = new SpaceLocation(DenseVector(1.0, 1.0))
}
So it actually works for me. Let me mention that the type inference in the code:
val myClass = new MyClass(...)
has nothing to do with ScalaMock; it is guaranteed by Scala itself. Below is a working sample with library versions and the sources of the classes.
Testing libraries:
"org.scalatest" %% "scalatest" % "2.2.4" % "test",
"org.scalamock" %% "scalamock-scalatest-support" % "3.2.2" % "test"
Source code of classes:
class MyClass(val something: String)

trait MyTrait {
  def mymethod(smth: MyClass): Double
}
Source code of test:
import org.scalamock.scalatest.MockFactory
import org.scalatest.{Matchers, WordSpec}

class ScalamockTest extends WordSpec with MockFactory with Matchers {

  "ScalaMock" should {
    "infers type" in {
      val myClass = new MyClass("Hello")
      val traitMock = mock[MyTrait]
      (traitMock.mymethod _).expects(myClass).returning(12.3)
      traitMock.mymethod(myClass) shouldBe 12.3
    }
  }
}
Hope it helps. I will be happy to update the answer once you provide more details.
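One additional observation on the SpacePoint snippet from the question: without an annotation, val sp = new SpacePoint { ... } is typed as the anonymous subclass rather than plain SpacePoint, which fits the asker's finding that adding the explicit type makes the expectation compile. Ascribing the trait type pins the declared type, for example (a sketch, assuming the SpacePoint and SpaceLocation definitions from the question):

// Declared type is SpacePoint, not the anonymous subclass.
val sp: SpacePoint = new SpacePoint {
  override val location: SpaceLocation = new SpaceLocation(DenseVector(1.0, 1.0))
}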

Scala play withSession deprecated

I'm migrating from Slick 2.1 to 3.0. As you know, the function withSession has been deprecated.
How can I change the code below?
def insert(vote: Vote) = DB.withSession { implicit session =>
  insertWithSession(vote)
}

def insertWithSession(vote: Vote)(implicit s: Session) = {
  Votes.insert(vote)
}
I get a compile error on Votes.insert, and the error is:
could not find implicit value for parameter s: slick.driver.PostgresDriver.api.Session
Lastly, is there any document other than the official link that would help me migrate? I need more details.
Assuming that you are using play-slick for Slick integration with Play:
You can look at https://www.playframework.com/documentation/2.5.x/PlaySlick for more details.
Add the Slick and JDBC dependencies in build.sbt:
libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-slick" % "2.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "2.0.0",
  "org.postgresql" % "postgresql" % "9.4-1206-jdbc4"
)
Add the Postgres config to your application.conf:
slick.dbs.default.driver="slick.driver.PostgresDriver$"
slick.dbs.default.db.driver="org.postgresql.Driver"
slick.dbs.default.db.url="jdbc:postgresql://localhost/yourdb?user=postgres&password=postgres"
Now define your models like the following:
package yourproject.models

import javax.inject.Inject

import play.api.db.slick.DatabaseConfigProvider
import slick.driver.JdbcProfile

import scala.concurrent.Future

case class Vote(id: Option[Int], subject: String, number: Int)

class VoteRepo @Inject()()(protected val dbConfigProvider: DatabaseConfigProvider) {

  val dbConfig = dbConfigProvider.get[JdbcProfile]
  val db = dbConfig.db

  import dbConfig.driver.api._

  class VoteTable(tag: Tag) extends Table[Vote](tag, "votes") {
    def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
    def subject = column[String]("subject")
    def number = column[Int]("number")

    def * = (id.?, subject, number) <> (Vote.tupled, Vote.unapply)
  }

  val Votes = TableQuery[VoteTable]

  // Insert a vote and return the generated id.
  def insert(vote: Vote): Future[Int] =
    db.run(Votes returning Votes.map(_.id) += vote)
}
Now your controller will look something like this:
import javax.inject.Inject

import yourproject.models.{Vote, VoteRepo}
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.api.mvc.{Action, Controller}

class Application @Inject()(voteRepo: VoteRepo) extends Controller {

  def createProject(subject: String, number: Int) = Action.async { implicit rs =>
    voteRepo.insert(Vote(None, subject, number))
      .map(id => Ok(s"project $id created"))
  }
}
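And to answer the original question more directly: in Slick 3 the session-based API is replaced by composing DBIO actions and running them with db.run, which returns a Future. A minimal sketch of the migrated insert for plain Slick 3.0 without Play, assuming a Votes TableQuery like the one above and a hypothetical config key "mydb":

import slick.driver.PostgresDriver.api._

import scala.concurrent.Future

// Old (Slick 2.1): def insert(vote: Vote) = DB.withSession { implicit session => Votes.insert(vote) }
// New (Slick 3.0): build a DBIO action and let the database run it asynchronously.
val db = Database.forConfig("mydb")

def insert(vote: Vote): Future[Int] =
  db.run(Votes += vote)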