ReactiveMongo findOne gives ambiguous implicit values - scala

My relevant imports are:
import play.api.libs.concurrent.Execution.Implicits._
import play.api.libs.json.Json
import play.modules.reactivemongo.json._
import play.modules.reactivemongo.ReactiveMongoApi
import play.modules.reactivemongo.json.collection.JSONCollection
import reactivemongo.api.commands.WriteResult
import reactivemongo.extensions.json.dao.JsonDao
import reactivemongo.extensions.json.dsl.JsonDsl._
The code which causes the problem is:
myCollection.find(Json.obj("email" -> email)).one
It gives:
ambiguous implicit values: both object BSONDoubleFormat in trait BSONFormats of type play.modules.reactivemongo.json.BSONDoubleFormat.type and object BSONStringFormat in trait BSONFormats of type play.modules.reactivemongo.json.BSONStringFormat.type match expected type play.api.libs.json.Reads[T]
myCollection.find(Json.obj("email" -> email)).one
As I understand it, I need to somehow define which format object should be used, but I don't understand how this can be done. The other problem is that I'm using JSON objects, not BSON, to store data in Mongo, so I don't understand why it is complaining about the BSONDoubleFormat and BSONStringFormat objects.

If you look at the documentation and examples, you can see that the function is .one[T], not .one.
As you don't indicate the result type T, it cannot compile.
myCollection.find(Json.obj("email" -> email)).one[T]
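For example, with a concrete type (a sketch: the User case class and its Reads are hypothetical, any type with a Reads in scope works, and the ExecutionContext comes from the imports above):
import play.api.libs.json.{Json, Reads}
import scala.concurrent.Future
case class User(email: String, name: String)
implicit val userReads: Reads[User] = Json.reads[User]
val result: Future[Option[User]] =
  myCollection.find(Json.obj("email" -> email)).one[User]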

Related

Scala, Tapir - implicit problem with oneOfVariantFromMatchType from tapir

I have created a few of my own errors:
sealed trait Error
case class FirstError(msg: String) extends Error
case class SecondError(msg: String) extends Error
I added them to tapir:
val firstError = oneOfVariantFromMatchType(StatusCode.BadRequest, jsonBody[FirstError].example(FirstError("first error")))
val secondError = oneOfVariantFromMatchType(StatusCode.Unauthorized, jsonBody[SecondError].example(SecondError("second error")))
And I use it like this:
def errors: EndpointOutput.OneOf[Error, Error] = oneOf(firstError, secondError)
Added to endpoint:
endpoint.get.in(...).errorOut(errors)
But when I tried to run code, I got an error:
could not find implicit value for evidence parameter of type sttp.tapir.typelevel.MatchType[FirstError]
could not find implicit value for evidence parameter of type sttp.tapir.typelevel.MatchType[SecondError]
I don't know how to fix it. I used docs from here - https://tapir.softwaremill.com/en/latest/endpoint/oneof.html
and my code is pretty much the same as in the docs. How should I fix this?
EDIT:
Imports:
import sttp.model.StatusCode
import sttp.tapir._
import sttp.tapir.generic.auto.SchemaDerivation
import sttp.tapir.json.circe.TapirJsonCirce
import io.circe.generic.auto._
When I used oneOfVariant I got an error for both error types:
Type FirstError is not the same as its erasure. Using a runtime-class-based check it won't be possible to verify that the input matches the desired type. Use other methods to match the input to the appropriate variant instead.
oneOfVariant(
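The oneOfVariant message itself points at a workaround: match the value explicitly instead of relying on a runtime-class check. A sketch of that approach, assuming tapir 1.x's oneOfVariantValueMatcher and the case classes above:
val firstError = oneOfVariantValueMatcher(StatusCode.BadRequest, jsonBody[FirstError].example(FirstError("first error"))) {
  case _: FirstError => true
}
val secondError = oneOfVariantValueMatcher(StatusCode.Unauthorized, jsonBody[SecondError].example(SecondError("second error"))) {
  case _: SecondError => true
}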

Providing implicit evidence for context bounds on Object

I'm trying to write some abstractions in some Spark Scala code, but I'm running into issues when using objects. I'm using Spark's Encoder, which converts case classes to database schemas, as an example here, but I think this question applies to any context bound.
Here is a minimal code example of what I'm trying to do:
package com.sample.myexample
import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]
object MyObject extends MyTrait[MySparkSchema]
Which fails with the following compilation error:
Unable to find encoder for type com.sample.myexample.MySparkSchema. An implicit Encoder[com.sample.myexample.MySparkSchema] is needed to store com.sample.myexample.MySparkSchema instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
I tried defining the implicit evidence in the object like so (the import statement was suggested by IntelliJ, but it looks a bit weird):
import com.sample.myexample.MyObject.encoder
object MyObject extends MyTrait[MySparkSchema] {
implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}
Which fails with the error message
MyTrait.scala:13:25: super constructor cannot be passed a self reference unless parameter is declared by-name
One other thing I tried is to convert the object to a class and provide implicit evidence to the constructor:
class MyObject(implicit evidence: Encoder[MySparkSchema]) extends MyTrait[MySparkSchema]
This compiles and works fine, but at the expense of MyObject now being a class instead.
Question: Is it possible to provide implicit evidence for the context bounds when extending a trait? Or does the implicit evidence force me to make a constructor and use class instead?
Your first error almost gives you the solution: you have to import spark.implicits._ for Product types.
You could do this:
val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._
Full Example
package com.sample.myexample
import org.apache.spark.sql.{Encoder, SparkSession}
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]
object Example {
  val spark: SparkSession = SparkSession.builder().getOrCreate()
  // spark.implicits._ must be in scope where MyObject is defined, so that the
  // Encoder[MySparkSchema] required by MyTrait's context bound can be found
  import spark.implicits._
  object MyObject extends MyTrait[MySparkSchema]
}
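If you would rather not need a SparkSession in scope at definition time, another option (a sketch, not part of the answer above) is to put the encoder in the companion object of the case class, where implicit search will find it:
import org.apache.spark.sql.{Encoder, Encoders}
object MySparkSchema {
  // implicit search looks in the companion object of MySparkSchema, so this
  // Encoder satisfies the MyTrait context bound wherever the type is used
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}
object MyObject extends MyTrait[MySparkSchema]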

How is the <> method resolved on a tuple by Slick

Linked from this question
I came across Slick's documentation and found that it mandates a def * method in the definition of a table to get a mapped projection.
So the line looks like this
def * = (name, id.?).<>(User.tupled,User.unapply)
Slick example here
I see the <> method is invoked on a tuple - in this case a Tuple2. The method is defined on the case class ShapedValue in Slick's code. How do I find out the implicit method that is doing the lookup?
Here are my imports:
import scala.concurrent.Await
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration
import slick.driver.H2Driver.api._
import slick.lifted.ShapedValue
import slick.lifted.ProvenShape
So I figured that one out for myself.
The Shape object extends three traits, namely ConstColumnShapeImplicits, AbstractTableShapeImplicits, and TupleShapeImplicits. These three traits handle the implicit conversions concerning Shapes in Slick.
TupleShapeImplicits houses all the implicit conversion methods required to convert a Tuple into a TupleShape.
Now in the line (name, id.?, salary.?).<>(User.tupled, User.unapply), what is happening is that the method <> has an implicit parameter of type Shape.
The Shape companion object thus comes into scope for the implicit conversion, and TupleShapeImplicits comes into scope as well.
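If you want the compiler itself to show which implicit it inserted, one option (an assumption about your build, not part of the answer above) is to print the trees after the typer phase; the expanded output spells out the Shape argument and the conversion applied to the tuple:
// in build.sbt: print the program after type checking, with implicit
// arguments and implicit conversions written out explicitly
scalacOptions += "-Xprint:typer"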

Scala, Casbah: MongoCollection.insert compilation errors

I am trying to write a function that writes data to MongoDB using the Casbah toolkit:
import com.mongodb.casbah.WriteConcern
import ...
def saveRecord(rec: MongoDBObject) {
mongoColl.insert(rec)
}
I get the following errors:
1) No implicit view available from Object => com.mongodb.casbah.Imports.DBObject.
2) not enough arguments for method insert: (implicit dbObjView: Object => com.mongodb.casbah.Imports.DBObject, implicit concern: com.mongodb.WriteConcern, implicit encoder: com.mongodb.casbah.Imports.DBEncoder)com.mongodb.casbah.Imports.WriteResult. Unspecified value parameter dbObjView.
What's wrong?
I'm not sure without the full imports, but try changing MongoDBObject to com.mongodb.DBObject, or add import com.mongodb.casbah.Imports._
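A minimal sketch of the second suggestion (mongoColl is assumed to be a com.mongodb.casbah.MongoCollection, as in the question):
import com.mongodb.casbah.Imports._
// Imports._ brings in the implicit view MongoDBObject => DBObject that
// insert's dbObjView parameter was missing
def saveRecord(rec: MongoDBObject): WriteResult =
  mongoColl.insert(rec)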

Scala, Casbah: Compilation errors. How to instantiate object from external library?

I am trying to write a function that writes data to MongoDB using the Casbah toolkit:
import com.mongodb.casbah.WriteConcern
import ...
def saveRecord(rec: MongoDBObject) {
mongoColl.insert(rec, WriteConcern)
}
Casbah defines WriteConcern as a Scala object. I get the following errors:
No implicit view available from Object => com.mongodb.casbah.Imports.DBObject.
not enough arguments for method insert: (implicit dbObjView: Object => com.mongodb.casbah.Imports.DBObject, implicit concern: com.mongodb.WriteConcern, implicit encoder: com.mongodb.casbah.Imports.DBEncoder)com.mongodb.casbah.Imports.WriteResult. Unspecified value parameter dbObjView.
Also, when I simply try to instantiate WriteConcern:
val wc:WriteConcern = WriteConcern
I get this error:
not found: type WriteConcern
I have import com.mongodb.casbah.WriteConcern in the first lines of my code, so why is it not found? How can I instantiate WriteConcern?
Thanks!
Add an import for the necessary implicit and insert like this:
import com.mongodb.casbah.Imports._
mongoColl.insert(rec)
On your question about not being able to instantiate WriteConcern: it's because there is a class named WriteConcern under com.mongodb and an enum-like object under com.mongodb.casbah. This will work:
import com.mongodb.casbah.WriteConcern
val wc: com.mongodb.WriteConcern = WriteConcern.Normal