BSONObjectIDFormat in trait BSONFormats is deprecated - mongodb

I am using ReactiveMongo version 0.11.11 and I want to implement a method in my DAO which counts all documents by _id.
Here is my DAO:
import com.google.inject.Inject
import models.auth.{Team, Player}
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.api.libs.json._
import play.modules.reactivemongo.ReactiveMongoApi
import play.modules.reactivemongo.json._
import reactivemongo.bson._
import reactivemongo.play.json.collection.JSONCollection
import scala.concurrent.Future
trait TeamDao {
  def find(_id: BSONObjectID): Future[Option[Team]]
  def find(name: String): Future[Option[Team]]
  def save(team: Team): Future[Team]
  def link(player: Player, team: Team): Future[Team]
  def update(team: Team): Future[Team]
  def count(team: Option[Team] = None): Future[Int]
  def count(_id: BSONObjectID): Future[Int]
  def countAllPlayersWithTeam(team: Team): Future[Int]
}
class MongoTeamDao @Inject()(reactiveMongoApi: ReactiveMongoApi) extends TeamDao {
  val players = reactiveMongoApi.db.collection[JSONCollection]("players")
  val teams = reactiveMongoApi.db.collection[JSONCollection]("teams")

  def find(_id: BSONObjectID): Future[Option[Team]] = teams.find(BSONDocument("_id" -> _id)).one[Team]

  def find(name: String): Future[Option[Team]] = teams.find(Json.obj("name" -> name)).one[Team]

  def save(team: Team): Future[Team] = teams.insert(team).map(_ => team)

  def link(player: Player, team: Team) = for {
    _ <- players.update(Json.obj("_id" -> player.id), Json.obj("$push" -> BSONDocument("teams" -> team._id)))
    team <- find(team._id.get)
  } yield team.get

  def update(team: Team) = for {
    _ <- teams.update(BSONDocument("_id" -> team._id), BSONDocument("$set" -> BSONDocument("name" -> team.name)))
    team <- find(team._id.get)
  } yield team.get

  def count(team: Option[Team] = None): Future[Int] = {
    val tmpTeam: Team = team.getOrElse {
      return teams.count()
    }
    teams.count(Some(Json.obj("name" -> tmpTeam.name)))
  }

  def count(_id: BSONObjectID): Future[Int] = {
    teams.count(Some(Json.obj("_id" -> _id)))
  }

  def countAllPlayersWithTeam(team: Team): Future[Int] = {
    players.count(Some(Json.obj("teams" -> team._id)))
  }
}
The problem is that I get the following error:
value BSONObjectIDFormat in trait BSONFormats is deprecated: Use [[reactivemongo.play.json.BSONFormats.BSONObjectIDFormat]]
[error] teams.count(Some(Json.obj("_id" -> _id)))
I tried to replace the count method with:
def count(_id: BSONObjectID): Future[Int] = {
  teams.count(Some(BSONDocument("_id" -> _id)))
}
But then I get the following compile error:
[error] found : reactivemongo.bson.BSONDocument
[error] required: MongoTeamDao.this.teams.pack.Document
[error] (which expands to) play.api.libs.json.JsObject
[error] Error occurred in an application involving default arguments.
[error] teams.count(Some(BSONDocument("_id" -> _id)))

You are mixing JSONCollection and BSON values.
It's recommended that you either use JSON serialization with a JSONCollection, or the default BSON serialization with a BSONCollection.
The deprecation message is a warning telling you to use the separate JSON library instead of the types formerly included in the Play plugin.
A BSONCollection can be resolved from the Play plugin as follows.
reactiveMongoApi.database.map(_.collection[BSONCollection]("players"))
The functions MongoConnection.(db|apply) and ReactiveMongoApi.db are deprecated; the equivalent .database must be used instead (it returns Future[DefaultDB] rather than DefaultDB).
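
For example, the failing count overload could be rewritten against such a BSONCollection. This is a minimal, untested sketch; it assumes the DAO keeps its reactiveMongoApi reference and has an implicit ExecutionContext in scope:

import reactivemongo.api.collections.bson.BSONCollection
import reactivemongo.bson.{BSONDocument, BSONObjectID}
import scala.concurrent.Future

// Sketch only: resolve the collection through the non-deprecated .database,
// then query it with a plain BSONDocument selector.
def count(_id: BSONObjectID): Future[Int] = for {
  db <- reactiveMongoApi.database
  coll = db.collection[BSONCollection]("teams")
  n <- coll.count(Some(BSONDocument("_id" -> _id)))
} yield n

Alternatively, keep the JSONCollection everywhere and build the selector with Json.obj plus the converters from the separate reactivemongo.play.json package; just don't mix the two serialization packs in one call.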

Related

How to properly use IO and OptionT in service layer in for-comprehension?

I have a simple repository interface with CRUD operations (it is probably a bad idea to pass an implicit session as a parameter in a generic trait):
trait Repository[Entity, PK] {
  def find(pk: PK)(implicit session: DBSession): OptionT[IO, Entity]
  def insert(e: Entity)(implicit session: DBSession): IO[Entity]
  def update(e: Entity)(implicit session: DBSession): IO[Entity]
  def delete(pk: PK)(implicit session: DBSession): IO[Int]
  def findAll()(implicit session: DBSession): IO[List[Entity]]
}
And I want to use it like this:
for {
  _ <- repository.insert(???)
  _ <- repository.delete(???)
  v <- repository.find(???).value
  _ <- someFunctionReliesOnReturnedValue(v)
} yield (???)
Also, I want to stop execution if v is None and roll back the transaction if there is any error (I use scalikejdbc). So, as I see it, I have to do it in my service layer like this (and wrap it into Try or something similar to catch business exceptions):
def logic(???) = {
  DB localTx { implicit session =>
    (for {
      _ <- repository.insert(???)
      _ <- repository.delete(???)
      v <- repository.find(???).value
      _ <- someFunctionReliesOnReturnedValue(v)
    } yield (???)).unsafeRunSync() // to roll back the transaction if there is any error
  }
}
The problem is here: someFunctionReliesOnReturnedValue(v). It can be an arbitrary function that accepts an Entity, not an Option[Entity]. How can I convert the result of OptionT[IO, Entity] to IO[Entity] while preserving the Option semantics?
Is this the correct approach, or have I made a mistake somewhere?
import java.nio.file.{Files, Paths}
import cats.data.OptionT
import cats.effect.IO
import scalikejdbc._
import scala.util.Try

case class Entity(id: Long, value: String)

object Entity extends SQLSyntaxSupport[Entity] {
  override def tableName: String = "entity"
  override def columnNames: Seq[String] = Seq("id", "value")

  def apply(g: SyntaxProvider[Entity])(rs: WrappedResultSet): Entity = apply(g.resultName)(rs)
  def apply(r: ResultName[Entity])(rs: WrappedResultSet): Entity =
    Entity(rs.long(r.id), rs.string(r.value))
}

trait Repository[Entity, PK] {
  def find(pk: PK)(implicit session: DBSession): OptionT[IO, Entity]
  def insert(e: Entity)(implicit session: DBSession): IO[Entity]
}

class EntityRepository extends Repository[Entity, Long] {
  private val alias = Entity.syntax("entity")

  override def find(pk: Long)(implicit session: DBSession): OptionT[IO, Entity] = OptionT {
    IO {
      withSQL {
        select(alias.resultAll).from(Entity as alias).where.eq(Entity.column.id, pk)
      }.map(Entity(alias.resultName)(_)).single().apply()
    }
  }

  override def insert(e: Entity)(implicit session: DBSession): IO[Entity] = IO {
    withSQL {
      insertInto(Entity).namedValues(
        Entity.column.id -> e.id,
        Entity.column.value -> e.value
      )
    }.update().apply()
    e
  }
}

object EntityRepository {
  def apply(): EntityRepository = new EntityRepository()
}

object Util {
  def createFile(value: String): IO[Unit] = IO(Files.createDirectory(Paths.get("path", value)))
}

class Service {
  val repository = EntityRepository()

  def logic(): Either[Throwable, Unit] = Try {
    DB localTx { implicit session =>
      val result: IO[Unit] = for {
        _ <- repository.insert(Entity(1, "1"))
        _ <- repository.insert(Entity(2, "2"))
        e <- repository.find(3)
        _ <- Util.createFile(e.value) // error
        // after this step there may be more steps (another insert or find)
      } yield ()
      result.unsafeRunSync()
    }
  }.toEither
}

object Test extends App {
  ConnectionPool.singleton("jdbc:postgresql://localhost:5433/postgres", "postgres", "")

  val service = new Service()
  service.logic()
}
Table:
create table entity (id numeric(38), value varchar(255));
And I get a compile error:
Error:(69, 13) type mismatch; found : cats.effect.IO[Unit]
required: cats.data.OptionT[cats.effect.IO,?]
_ <- Util.createFile(e.value)
In general, you should convert all of your different results to your "most general" type that has a monad. In this case, that means you should use OptionT[IO, A] throughout your for-comprehension by converting all of those IO[Entity] to OptionT[IO, Entity] with OptionT.liftF:
for {
  _ <- OptionT.liftF(repository.insert(???))
  _ <- OptionT.liftF(repository.delete(???))
  v <- repository.find(???)
  _ <- OptionT.liftF(someFunctionReliesOnReturnedValue(v)) // lift this one as well if it returns IO
} yield (???)
If you had an Option[A] you could use OptionT.fromOption[IO]. The issues come from trying to mix monads within the same for-comprehension.
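For instance, given a hypothetical maybeEntity: Option[Entity] already in hand:

// maybeEntity is an Option[Entity] obtained elsewhere (hypothetical).
val lifted: OptionT[IO, Entity] = OptionT.fromOption[IO](maybeEntity)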
This will already stop execution if any of these result in a None. As for rolling back the transaction, that depends on how your DB interaction library works, but if it handles exceptions by rolling back, then yes, unsafeRunSync will work. If you also want it to roll back by throwing an exception when the result is None, you could do something like:
val result: OptionT[IO, ...] = ...
result.value.unsafeRunSync().getOrElse(throw new FooException(...))
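
Applied to the Service example above, that might look like the following sketch (NoSuchElementException is just a stand-in for whatever business exception your Try is meant to catch):

val result: OptionT[IO, Unit] = for {
  _ <- OptionT.liftF(repository.insert(Entity(1, "1")))
  _ <- OptionT.liftF(repository.insert(Entity(2, "2")))
  e <- repository.find(3)                       // short-circuits to None if nothing is found
  _ <- OptionT.liftF(Util.createFile(e.value))  // IO[Unit] lifted into OptionT
} yield ()

// Throwing here makes DB.localTx roll back and the surrounding Try capture the failure.
result.value.unsafeRunSync().getOrElse(throw new NoSuchElementException("entity 3 not found"))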

Avro4s, how to serialise a map with custom key type?

I am using Avro4s. It's easy to serialise a
Map[String, T]
but I have a situation like
sealed trait Base
case object First extends Base
case object Second extends Base
and I need to serialise something like
Map[Base, T]
Any advice on the best way to achieve this? Thanks.
The thing is that according to the Avro spec
Map keys are assumed to be strings.
So the only type supported by Avro is Map[String,T]. It means that you need to write some custom code that will map your Map[Base, T] onto Map[String,T] and back. Something like this will probably work for you:
import scala.collection.breakOut
import scala.collection.immutable.Map
import scala.collection.JavaConverters._
import com.sksamuel.avro4s._
import org.apache.avro.Schema
import org.apache.avro.Schema.Field
object BaseMapAvroHelpers {

  private val nameMap: Map[Base, String] = Map(First -> "first", Second -> "second")
  private val revNameMap: Map[String, Base] = nameMap.toList.map(kv => (kv._2, kv._1)).toMap

  implicit def toSchema[T: SchemaFor]: ToSchema[Map[Base, T]] = new ToSchema[Map[Base, T]] {
    override val schema: Schema = Schema.createMap(implicitly[SchemaFor[T]].apply())
  }

  implicit def toValue[T: SchemaFor : ToValue]: ToValue[Map[Base, T]] = new ToValue[Map[Base, T]] {
    override def apply(value: Map[Base, T]): java.util.Map[String, T] =
      value.map(kv => (nameMap(kv._1), kv._2)).asJava
  }

  implicit def fromValue[T: SchemaFor : FromValue]: FromValue[Map[Base, T]] = new FromValue[Map[Base, T]] {
    override def apply(value: Any, field: Field): Map[Base, T] = {
      val fromValueS = implicitly[FromValue[String]]
      val fromValueT = implicitly[FromValue[T]]
      value.asInstanceOf[java.util.Map[Any, Any]].asScala.map(kv => (revNameMap(fromValueS(kv._1)), fromValueT(kv._2)))(breakOut)
    }
  }
}
Usage example:
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import com.sksamuel.avro4s._

case class Wrapper[T](value: T)

def test(): Unit = {
  import BaseMapAvroHelpers._

  val map: Map[Base, String] = Map(First -> "abc", Second -> "xyz")
  val wrapper = Wrapper(map)

  val schema = AvroSchema[Wrapper[Map[Base, String]]]
  println(s"Schema: $schema")

  val bufOut = new ByteArrayOutputStream()
  val out = AvroJsonOutputStream[Wrapper[Map[Base, String]]](bufOut)
  out.write(wrapper)
  out.flush()

  println(s"Avro Out: ${bufOut.size}")
  println(bufOut.toString("UTF-8"))

  val in = AvroJsonInputStream[Wrapper[Map[Base, String]]](new ByteArrayInputStream(bufOut.toByteArray))
  val read = in.singleEntity
  println(s"read: $read")
}
and the output is something like:
Schema: {"type":"record","name":"Wrapper","namespace":"so","fields":[{"name":"value","type":{"type":"map","values":"string"}}]}
Avro Out: 40
{"value":{"first":"abc","second":"xyz"}}
read: Success(Wrapper(Map(First -> abc, Second -> xyz)))
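
If hand-maintaining nameMap feels brittle, a hypothetical variant (not part of the original answer) is to derive the keys inside BaseMapAvroHelpers from the case objects themselves, assuming their toString names are acceptable as Avro map keys:

// Illustrative only: build the lookup tables from the objects' own names.
private val nameMap: Map[Base, String] =
  List(First, Second).map(b => b -> b.toString.toLowerCase).toMap
private val revNameMap: Map[String, Base] = nameMap.map(_.swap)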

Scala, Sangria and Scalatra

We have a Scala application using Scalatra (http://scalatra.org/) as our web framework. I'm wondering if there are any good (or just any) resources out there on how to implement a GraphQL endpoint using Sangria (http://sangria-graphql.org/) and Scalatra?
I'm new to Scala and would appreciate any help to get started on this.
There aren't any that I know of, but since Scalatra uses json4s you would use Sangria's json4s marshaller.
Otherwise, to make Sangria itself clearer, here's a Scala worksheet with a very simplistic example based on Play + Sangria; in your case you would just swap the JSON library.
The db and the HTTP server are mocked (perhaps you use Slick?), but it's a simple matter of swapping in the real function definitions.
import sangria.ast.Document
import sangria.execution.{ErrorWithResolver, Executor, QueryAnalysisError}
import sangria.macros.derive.{ObjectTypeDescription, ObjectTypeName, deriveObjectType}
import sangria.parser.{QueryParser, SyntaxError}
import sangria.renderer.SchemaRenderer
import sangria.schema.{Argument, Field, IntType, ListType, ObjectType, OptionInputType, Schema, fields}
import scala.concurrent.Await
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future
import scala.util.{Failure, Success}
// replace with another json lib
// eg https://github.com/sangria-graphql/sangria-json4s-jackson
import play.api.libs.json._
import sangria.marshalling.playJson._
case class User(name: String, age: Int, phone: Option[String])

class FakeDb {
  class UsersTable {
    def getUsers(limit: Int): List[User] = {
      // this would come from the db
      List(
        User("john smith", 23, None),
        User("Anne Schwazenbach", 45, Some("2134556"))
      )
    }
  }

  val usersRepo = new UsersTable
}

object MySchema {

  val limitArg: Argument[Int] = Argument("first", OptionInputType(IntType),
    description = s"Returns the first n elements from the list.",
    defaultValue = 10)

  implicit val UsersType: ObjectType[FakeDb, User] = {
    deriveObjectType[FakeDb, User](
      ObjectTypeName("Users"),
      ObjectTypeDescription("Users in the system")
    )
  }

  private val Query: ObjectType[FakeDb, Unit] = ObjectType[FakeDb, Unit](
    "Query", fields[FakeDb, Unit](
      Field("users", ListType(UsersType),
        arguments = limitArg :: Nil,
        resolve = c => c.ctx.usersRepo.getUsers(c.arg(limitArg))
      )
    ))

  val theSchema: Schema[FakeDb, Unit] = Schema(Query)
}
object HttpServer {

  def get(): String = {
    // Http GET
    SchemaRenderer.renderSchema(MySchema.theSchema)
  }

  def post(query: String): Future[JsValue] = {
    // Http POST
    val variables = None
    val operation = None

    QueryParser.parse(query) match {
      case Success(q) => executeQuery(q, variables, operation)
      case Failure(error: SyntaxError) => Future.successful(Json.obj("error" -> error.getMessage))
      case Failure(error: Throwable) => Future.successful(Json.obj("error" -> error.getMessage))
    }
  }

  private def executeQuery(queryAst: Document, vars: Option[JsValue], operation: Option[String]): Future[JsValue] = {
    val schema: Schema[FakeDb, Unit] = MySchema.theSchema
    Executor.execute[FakeDb, Unit, JsValue](schema, queryAst, new FakeDb,
      operationName = operation,
      variables = vars.getOrElse(Json.obj()))
      .map((d: JsValue) => d)
      .recover {
        case error: QueryAnalysisError ⇒ Json.obj("error" -> error.getMessage)
        case error: ErrorWithResolver ⇒ Json.obj("error" -> error.getMessage)
      }
  }
}
HttpServer.get()
val myquery = """
{
users {
name
}
}
"""
val res: JsValue = Await.result(HttpServer.post(myquery), 10.seconds)
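
To expose this through Scalatra, the wiring might look roughly like the sketch below. This is untested and makes several assumptions: it uses the json4s support that ships with Scalatra, it assumes HttpServer.post has been reworked to return a json4s JValue via the sangria-json4s-jackson marshaller linked above, and the servlet and route names are made up for illustration.

import org.scalatra._
import org.scalatra.json.JacksonJsonSupport
import org.json4s._
import scala.concurrent.ExecutionContext

// Hypothetical servlet: GraphQLServlet and the /graphql route are illustrative names.
class GraphQLServlet extends ScalatraServlet with JacksonJsonSupport with FutureSupport {
  protected implicit lazy val jsonFormats: Formats = DefaultFormats
  protected implicit def executor: ExecutionContext = ExecutionContext.global

  before() {
    contentType = formats("json")
  }

  // Render the schema, e.g. for inspection during development.
  get("/schema") {
    HttpServer.get()
  }

  // Expects a JSON body like {"query": "{ users { name } }"}.
  post("/graphql") {
    val query = (parsedBody \ "query").extract[String]
    // Assumes HttpServer.post now returns Future[JValue] (json4s) rather than Play's JsValue.
    new AsyncResult { val is = HttpServer.post(query) }
  }
}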

Scala macros for nested case classes to Map and other way around

I want to convert any case class to a Map[String,Any] for example:
case class Person(name:String, address:Address)
case class Address(street:String, zip:Int)
val p = Person("Tom", Address("Jefferson st", 10000))
val mp = p.asMap
//Map("name" -> "Tom", "address" -> Map("street" -> "Jefferson st", "zip" -> 10000))
val p1 = mp.asCC[Person]
assert(p1 === p)
Possible duplicates:
Here is a question with a reflection-based answer.
Here is a question about converting a case class to a map (without nesting).
I also found how to do it for a case class without any nested case classes inside it; here is the code from there:
package macros

import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context

trait Mappable[T] {
  def toMap(t: T): Map[String, Any]
  def fromMap(map: Map[String, Any]): T
}

object Mappable {
  implicit def materializeMappable[T]: Mappable[T] = macro materializeMappableImpl[T]

  def materializeMappableImpl[T: c.WeakTypeTag](c: Context): c.Expr[Mappable[T]] = {
    import c.universe._
    val tpe = weakTypeOf[T]
    val companion = tpe.typeSymbol.companion

    val fields = tpe.decls.collectFirst {
      case m: MethodSymbol if m.isPrimaryConstructor => m
    }.get.paramLists.head

    val (toMapParams, fromMapParams) = fields.map { field =>
      val name = field.asTerm.name
      val key = name.decodedName.toString
      val returnType = tpe.decl(name).typeSignature

      (q"$key -> t.$name", q"map($key).asInstanceOf[$returnType]")
    }.unzip

    c.Expr[Mappable[T]] { q"""
      new Mappable[$tpe] {
        def toMap(t: $tpe): Map[String, Any] = Map(..$toMapParams)
        def fromMap(map: Map[String, Any]): $tpe = $companion(..$fromMapParams)
      }
    """ }
  }
}
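
For reference, for a flat case class such as Address the macro above effectively materializes an instance like this (a hand-written sketch of the expansion, not part of the original code):

// Roughly what materializeMappableImpl expands to for Address (illustrative only).
implicit val addressMappable: Mappable[Address] = new Mappable[Address] {
  def toMap(t: Address): Map[String, Any] =
    Map("street" -> t.street, "zip" -> t.zip)
  def fromMap(map: Map[String, Any]): Address =
    Address(map("street").asInstanceOf[String], map("zip").asInstanceOf[Int])
}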
It is also worth mentioning that the Play JSON library and the ReactiveMongo BSON library do this kind of thing, but those projects are too big to easily see how they implement it.

Trying to stream tweets with twitter4j 3.0.3

I am trying to stream tweets with twitter4j 3.0.3 in Scala, but I get the error below.
Here is my code:
import twitter4j._
import ch.qos.logback.core.status.StatusListener
import twitter4j.conf.ConfigurationBuilder
import ch.qos.logback.core.status

object stream {
  def main(args: Array[String]) {
    val cb: ConfigurationBuilder = new ConfigurationBuilder
    cb.setDebugEnabled(true)
      .setOAuthConsumerKey("1")
      .setOAuthConsumerSecret("1")
      .setOAuthAccessToken("1")
      .setOAuthAccessTokenSecret("1")

    def simpleStatusListener: StatusListener = new StatusListener() {
      def addStatusEvent(status: Status) { println(x = status.getText) }
      def onStatus(status: Status) { println(x = status.getText) }
      def onDeletionNotice(statusDeletionNotice: StatusDeletionNotice) {}
      def onTrackLimitationNotice(numberOfLimitedStatuses: Int) {}
      def onException(ex: Exception) { ex.printStackTrace }
      def onScrubGeo(arg0: Long, arg1: Long) {}
      def onStallWarning(warning: StallWarning) {}
    }

    val twitterStream: TwitterStream = new TwitterStreamFactory(cb.build).getInstance()
    twitterStream.addListener(simpleStatusListener)
    twitterStream.sample()
  }
}
and the error:
overloaded method value addListener with alternatives:
(twitter4j.RawStreamListener)Unit
(twitter4j.SiteStreamsListener)Unit
(twitter4j.StatusListener)Unit
(twitter4j.UserStreamListener)Unit
cannot be applied to (ch.qos.logback.core.status.StatusListener)
twitterStream.addListener(simpleStatusListener)
^
You're importing the wrong StatusListener interface.
Instead of
import ch.qos.logback.core.status.StatusListener
You need
import twitter4j.StatusListener
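
With that import in place the anonymous listener satisfies twitter4j's interface. A minimal sketch of the corrected listener (the extra addStatusEvent method and the unused ch.qos.logback imports can simply be dropped):

import twitter4j._

// Implements twitter4j.StatusListener; each callback is required by the interface.
def simpleStatusListener: StatusListener = new StatusListener() {
  def onStatus(status: Status): Unit = println(status.getText)
  def onDeletionNotice(statusDeletionNotice: StatusDeletionNotice): Unit = {}
  def onTrackLimitationNotice(numberOfLimitedStatuses: Int): Unit = {}
  def onException(ex: Exception): Unit = ex.printStackTrace()
  def onScrubGeo(userId: Long, upToStatusId: Long): Unit = {}
  def onStallWarning(warning: StallWarning): Unit = {}
}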