Scala, Casbah: Compilation errors. How to instantiate an object from an external library?

I am trying to write a function that writes data to MongoDB using the Casbah toolkit:
import com.mongodb.casbah.WriteConcern
import ...
def saveRecord(rec: MongoDBObject) {
  mongoColl.insert(rec, WriteConcern)
}
Casbah defines WriteConcern as a Scala object. I get the following errors:
No implicit view available from Object => com.mongodb.casbah.Imports.DBObject.
not enough arguments for method insert: (implicit dbObjView: Object => com.mongodb.casbah.Imports.DBObject, implicit concern: com.mongodb.WriteConcern, implicit encoder: com.mongodb.casbah.Imports.DBEncoder)com.mongodb.casbah.Imports.WriteResult. Unspecified value parameter dbObjView.
Also, when I simply try to instantiate WriteConcern:
val wc:WriteConcern = WriteConcern
I get this error:
not found: type WriteConcern
I have import com.mongodb.casbah.WriteConcern in the first lines of my code, so why is it not found? How can I instantiate WriteConcern?
Thanks!

Add an import for the necessary implicit and insert like this:
import com.mongodb.casbah.Imports._
mongoColl.insert(rec)
As for not being able to instantiate WriteConcern: there is a class under com.mongodb and an enum-like object under com.mongodb.casbah. This will work:
import com.mongodb.casbah.WriteConcern
val wc: com.mongodb.WriteConcern = WriteConcern.Normal
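Putting the two pieces together, a minimal sketch of the corrected function (assuming mongoColl is an existing MongoCollection in scope):
import com.mongodb.casbah.Imports._
import com.mongodb.casbah.WriteConcern
def saveRecord(rec: MongoDBObject) {
  // Imports._ supplies the implicit MongoDBObject => DBObject view that insert needs
  mongoColl.insert(rec)
}
// The Casbah WriteConcern object is enum-like; its members are com.mongodb.WriteConcern values
val wc: com.mongodb.WriteConcern = WriteConcern.Normal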

Providing implicit evidence for context bounds on Object

I'm trying to write some abstractions in some Spark Scala code, but I'm running into issues when using objects. I'm using Spark's Encoder, which converts case classes to database schemas, as an example here, but I think this question applies to any context bound.
Here is a minimal code example of what I'm trying to do:
package com.sample.myexample
import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]
object MyObject extends MyTrait[MySparkSchema]
Which fails with the following compilation error:
Unable to find encoder for type com.sample.myexample.MySparkSchema. An implicit Encoder[com.sample.myexample.MySparkSchema] is needed to store com.sample.myexample.MySparkSchema instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
I tried defining the implicit evidence in the object like so (the import statement was suggested by IntelliJ, but it looks a bit weird):
import com.sample.myexample.MyObject.encoder
object MyObject extends MyTrait[MySparkSchema] {
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}
Which fails with the error message
MyTrait.scala:13:25: super constructor cannot be passed a self reference unless parameter is declared by-name
One other thing I tried is to convert the object to a class and provide implicit evidence to the constructor:
class MyObject(implicit evidence: Encoder[MySparkSchema]) extends MyTrait[MySparkSchema]
This compiles and works fine, but at the expense of MyObject now being a class instead.
Question: Is it possible to provide implicit evidence for the context bounds when extending a trait? Or does the implicit evidence force me to make a constructor and use class instead?
Your first error almost gives you the solution: you have to import spark.implicits._ for Product types.
You could do this:
val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._
Full Example
package com.sample.myexample
import org.apache.spark.sql.{Encoder, SparkSession}
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]
val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._
object MyObject extends MyTrait[MySparkSchema]
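Alternatively, if you would rather not have a SparkSession in scope where MyObject is defined, a sketch of another option (assuming Encoders.product is acceptable for this case class) is to put the Encoder in MySparkSchema's companion object, where the context bound's implicit search will find it:
package com.sample.myexample
import org.apache.spark.sql.{Encoder, Encoders}
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
object MySparkSchema {
  // Lives in the implicit scope of Encoder[MySparkSchema], so no spark.implicits._
  // import is needed at the point where MyObject extends MyTrait
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}
abstract class MyTrait[T: TypeTag: Encoder]
object MyObject extends MyTrait[MySparkSchema]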

How do I use Scala Meta to parse an object?

I am trying to use Scala Meta to write an annotation so I can generate another case class from an existing object.
But when I try to do this:
MyObject.parse[Source].show[Structure]
I got this error:
Error:(5, 20) not enough arguments for method parse: (implicit convert: scala.meta.common.Convert[domain.MyObject.type,scala.meta.inputs.Input], implicit parse: scala.meta.parsers.Parse[scala.meta.Source], implicit dialect: scala.meta.Dialect)scala.meta.parsers.Parsed[scala.meta.Source].
Unspecified value parameters convert, parse, dialect.
MyObject.parse[Source].show[Structure];}
^
I am very confused because, based on their tutorial, that's what I need to start with:
http://scalameta.org/tutorial/#.parse[T]
How can I reflect this object to loop through all properties?
Thanks
parse[Source] parses text. You may try the following:
import scala.meta._
"object MyObject".parse[Source].get.show[Syntax]
If you are creating an annotation, then it might look like this:
#MyAnnotation
object MyObject
And in another module:
import scala.meta._
class MyAnnotation extends StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn.show[Syntax]
    defn
  }
}

Problems with Salat methods in MongoDB: implicit view & not enough arguments

I'm new to Salat, Casbah and MongoDB. I've been trying to write a simple method that gets all users from the db:
import DAL.Instances.User.{UserDAO, User}
import com.novus.salat._
import com.novus.salat.global._
import com.novus.salat.annotations._
import com.novus.salat.dao._
import com.mongodb.casbah.Imports._
import com.mongodb.casbah.MongoConnection
object UserRepository {
  def getAllUsers() = {
    val userList = UserDAO.find()
    userList.isEmpty match {
      case true => throw new Exception("No users in your db")
      case false => userList
    }
  }
}
I'm faced with two errors:
Error:(29, 31) No implicit view available from Unit => com.mongodb.DBObject.
val userList= UserDAO.find()
^
Error:(29, 31) not enough arguments for method find: (implicit evidence$2: Unit => com.mongodb.DBObject)com.novus.salat.dao.SalatMongoCursor[DAL.Instances.User.User].
Unspecified value parameter evidence$2.
val userList= UserDAO.find()
^
Here is my User code:
object User {
  case class User(_id: ObjectId = new ObjectId, name: String, age: Int)
  object UserDAO extends SalatDAO[User, ObjectId](collection = MongoConnection()("fulltestdb")("user"))
}
I'm not sure what version of Salat you are using, but if you look at the signature for find, it'll give you a clue as to what the issue is:
def find[A <% DBObject](ref: A): SalatMongoCursor[ObjectType]
You need to call find with a parameter that has a view bound so that this parameter may be viewed as a DBObject. This means that an implicit conversion from A => DBObject is expected to be in scope.
In your case you aren't passing any parameter. This is being treated as Unit and so the compiler tries to find an implicit conversion from Unit => DBObject. This can't be found so compilation fails.
To fix this, your best bet is to pass in an empty DBObject; you can achieve this with MongoDBObject.empty from Casbah. You could add an implicit conversion from Unit => MongoDBObject, but I'd probably lean towards making it explicit where possible.
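For example, a minimal sketch of that fix, reusing the UserDAO from the question:
def getAllUsers() = {
  // Pass an empty query document explicitly instead of relying on a Unit => DBObject view
  val userList = UserDAO.find(MongoDBObject.empty)
  if (userList.isEmpty) throw new Exception("No users in your db")
  else userList
}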

How does Scala use explicit types when resolving implicits?

I have the following code which uses spray-json to deserialise some JSON into a case class, via the parseJson method.
Depending on where the implicit JsonFormat[MyCaseClass] is defined (in-line or imported from companion object), and whether there is an explicit type provided when it is defined, the code may not compile.
I don't understand why importing the implicit from the companion object requires it to have an explicit type where it is defined, while putting it inline does not. Why is that?
Interestingly, IntelliJ correctly locates the implicit parameters (via cmd-shift-p) in all cases.
I'm using Scala 2.11.7.
Broken Code - Wildcard import from companion object, inferred type:
import SampleApp._
import spray.json._
class SampleApp {
  import MyJsonProtocol._
  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}
object SampleApp {
  case class MyCaseClass(children: List[String])
  object MyJsonProtocol extends DefaultJsonProtocol {
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}
Results in:
Cannot find JsonReader or JsonFormat type class for SampleAppObject.MyCaseClass
Note that the same thing happens with an explicit import of the myCaseClassSchemaFormat implicit.
Working Code #1 - Wildcard import from companion object, explicit type:
Adding an explicit type to the JsonFormat in the companion object causes the code to compile:
import SampleApp._
import spray.json._
class SampleApp {
  import MyJsonProtocol._
  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}
object SampleApp {
  case class MyCaseClass(children: List[String])
  object MyJsonProtocol extends DefaultJsonProtocol {
    // Explicit type added here now
    implicit val myCaseClassSchemaFormat: JsonFormat[MyCaseClass] = jsonFormat1(MyCaseClass)
  }
}
Working Code #2 - Implicits inline, inferred type:
However, putting the implicit parameters in-line where they are used, without the explicit type, also works!
import SampleApp._
import spray.json._
class SampleApp {
  import DefaultJsonProtocol._
  // Now an in-line custom JsonFormat rather than an imported one
  implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}
object SampleApp {
  case class MyCaseClass(children: List[String])
}
After searching for the error message Huw mentioned in his comment, I was able to find this StackOverflow question from 2010: Why does this explicit call of a Scala method allow it to be implicitly resolved?
This led me to this Scala issue created in 2008, and closed in 2011: https://issues.scala-lang.org/browse/SI-801 ('require explicit result type for implicit conversions?')
Martin stated:
I have implemented a slightly more permissive rule: An implicit conversion without explicit result type is visible only in the text following its own definition. That way, we avoid the cyclic reference errors. I close for now, to see how this works. If we still have issues we might come back to this.
This holds: if I re-order the breaking code so that the companion object is declared first, the code compiles. (It's still a little weird!)
(I suspect I don't see the 'implicit method is not applicable here' message because I have an implicit value rather than a conversion - though I'm assuming here that the root cause is the same as the above).
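For illustration, a sketch of that re-ordering, with the companion object (and the untyped implicit inside it) moved ahead of the class that uses it:
import SampleApp._
import spray.json._
object SampleApp {
  case class MyCaseClass(children: List[String])
  object MyJsonProtocol extends DefaultJsonProtocol {
    // The inferred type is fine here because the definition now precedes the text that uses it
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}
class SampleApp {
  import MyJsonProtocol._
  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}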

Scala, Casbah: MongoCollection.insert compilation errors

I am trying to write a function that writes data to MongoDB using the Casbah toolkit:
import com.mongodb.casbah.WriteConcern
import ...
def saveRecord(rec: MongoDBObject) {
  mongoColl.insert(rec)
}
I get the following errors:
1) No implicit view available from Object => com.mongodb.casbah.Imports.DBObject.
2) not enough arguments for method insert: (implicit dbObjView: Object => com.mongodb.casbah.Imports.DBObject, implicit concern: com.mongodb.WriteConcern, implicit encoder: com.mongodb.casbah.Imports.DBEncoder)com.mongodb.casbah.Imports.WriteResult. Unspecified value parameter dbObjView.
What's wrong?
I'm not sure without the full imports, but try changing MongoDBObject to com.mongodb.DBObject, or add import com.mongodb.casbah.Imports._
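A sketch of the second suggestion, typing the parameter as the Java driver's DBObject so that no Casbah-specific view is required (again assuming mongoColl is a MongoCollection already in scope):
import com.mongodb.casbah.Imports._
def saveRecord(rec: com.mongodb.DBObject) {
  // A plain DBObject needs no conversion, and Imports._ supplies the remaining implicits
  mongoColl.insert(rec)
}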