I am trying to write a function that writes data to MongoDB using the Casbah toolkit:
import com.mongodb.casbah.WriteConcern
import ...
def saveRecord(rec: MongoDBObject) {
  mongoColl.insert(rec)
}
I get the following errors:
1) No implicit view available from Object => com.mongodb.casbah.Imports.DBObject.
2) not enough arguments for method insert: (implicit dbObjView: Object => com.mongodb.casbah.Imports.DBObject, implicit concern: com.mongodb.WriteConcern, implicit encoder: com.mongodb.casbah.Imports.DBEncoder)com.mongodb.casbah.Imports.WriteResult. Unspecified value parameter dbObjView.
What's wrong?
I'm not sure without seeing the full imports, but try changing MongoDBObject to com.mongodb.DBObject, or add import com.mongodb.casbah.Imports._.
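For example, a minimal sketch of the fixed method (assuming mongoColl is a Casbah MongoCollection):

import com.mongodb.casbah.Imports._

// Imports._ provides the implicit view MongoDBObject => DBObject that
// the "No implicit view available" error is asking for
def saveRecord(rec: MongoDBObject): WriteResult = mongoColl.insert(rec)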
I'm trying to write some abstractions in some Spark Scala code, but I'm running into issues when using objects. As an example I'm using Spark's Encoder, which is used to map case classes to database schemas, but I think this question applies to any context bound.
Here is a minimal code example of what I'm trying to do:
package com.sample.myexample
import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]
object MyObject extends MyTrait[MySparkSchema]
Which fails with the following compilation error:
Unable to find encoder for type com.sample.myexample.MySparkSchema. An implicit Encoder[com.sample.myexample.MySparkSchema] is needed to store com.sample.myexample.MySparkSchema instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
I tried defining the implicit evidence in the object, like so (the import statement was suggested by IntelliJ, but it looks a bit weird):
import com.sample.myexample.MyObject.encoder
object MyObject extends MyTrait[MySparkSchema] {
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}
Which fails with the following error message:
MyTrait.scala:13:25: super constructor cannot be passed a self reference unless parameter is declared by-name
One other thing I tried is to convert the object to a class and provide implicit evidence to the constructor:
class MyObject(implicit evidence: Encoder[MySparkSchema]) extends MyTrait[MySparkSchema]
This compiles and works fine, but at the expense of MyObject now being a class instead of an object.
Question: Is it possible to provide implicit evidence for the context bounds when extending a trait? Or does the implicit evidence force me to make a constructor and use class instead?
Your first error almost gives you the solution: you have to import spark.implicits._ for Product types.
You could do this:
val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._
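That import brings Spark's implicit Encoder instances for primitive and Product types (case classes) into scope, which is what satisfies the Encoder context bound on MyTrait.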
Full Example
package com.sample.myexample

import org.apache.spark.sql.{Encoder, SparkSession}
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]

// a val cannot sit at the top level of a file, so the session lives in
// an object; importing its implicits still brings the encoders into scope
object Spark {
  val spark: SparkSession = SparkSession.builder().getOrCreate()
}
import Spark.spark.implicits._

object MyObject extends MyTrait[MySparkSchema]
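If you'd rather not spin up a SparkSession just to get the implicits, a sketch of an alternative (assuming Encoders.product is sufficient for your schema) is to define the encoder in a separate object; unlike defining it inside MyObject itself, this avoids the "super constructor cannot be passed a self reference" error:

import org.apache.spark.sql.{Encoder, Encoders}

object MyImplicits {
  // defined outside MyObject, so the context bound can resolve it
  // without referring to the object under construction
  implicit val mySparkSchemaEncoder: Encoder[MySparkSchema] =
    Encoders.product[MySparkSchema]
}
import MyImplicits._

object MyObject extends MyTrait[MySparkSchema]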
I'm using ArgonautShapeless to define some JSON codecs.
When I provide the type for my codec I get a StackOverflowError, but if I leave the type off it works. How can I provide the type?
My understanding of the problem is that the implicit lookup in def of[A: DecodeJson] = implicitly[DecodeJson[A]] finds my definition on the same line, implicit def fooCodec: DecodeJson[Foo], and is thus recursive, so it breaks.
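To see the cycle without any Argonaut machinery, here is a self-contained sketch (hypothetical names) of the same shape of definition:

trait Codec[A]
object Codec {
  // mirrors DecodeJson.of: it just summons an implicit instance
  def of[A](implicit c: Codec[A]): Codec[A] = c
}
case class Foo(a: Int)
object Codecs {
  // of[Foo] resolves to fooCodec itself, so calling it recurses forever
  implicit def fooCodec: Codec[Foo] = Codec.of[Foo]
}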
Is there some other way that will allow me to provide the type? Ideally I want to have one object in my project where I define all of the codecs, and they may depend on each other.
import $ivy.`com.github.alexarchambault::argonaut-shapeless_6.2:1.2.0-M4`
import argonaut._, Argonaut._
case class Foo(a: Int)
object SomeCodecs {
  import ArgonautShapeless._

  // this doesn't work
  implicit def fooCodec: DecodeJson[Foo] = DecodeJson.of[Foo]
}
import SomeCodecs._
"""{"a":1}""".decode[Foo]
java.lang.StackOverflowError
ammonite.$sess.cmd3$SomeCodecs$.fooCodec(cmd3.sc:3)
It works if I leave the type off.
object SomeCodecs {
  import ArgonautShapeless._

  // this works
  implicit def fooCodec = DecodeJson.of[Foo]
}
import SomeCodecs._
"""{"a":1}""".decode[Foo]
res4: Either[Either[String, (String, CursorHistory)], Foo] = Right(Foo(1))
Thanks
I am trying to use Scala Meta to write an annotation so I can generate another case class from an existing object.
But when I try to do this:
MyObject.parse[Source].show[Structure]
I got this error:
Error:(5, 20) not enough arguments for method parse: (implicit convert: scala.meta.common.Convert[domain.MyObject.type,scala.meta.inputs.Input], implicit parse: scala.meta.parsers.Parse[scala.meta.Source], implicit dialect: scala.meta.Dialect)scala.meta.parsers.Parsed[scala.meta.Source].
Unspecified value parameters convert, parse, dialect.
MyObject.parse[Source].show[Structure];}
^
I am very confused because, based on their tutorial, that's how I'm supposed to start:
http://scalameta.org/tutorial/#.parse[T]
How can I reflect this object to loop through all properties?
Thanks
parse[Source] parses text. You may try the following:
import scala.meta._
"object MyObject".parse[Source].get.show[Syntax]
If you are creating an annotation, then it might look like this:
@MyAnnotation
object MyObject
And in another module:
import scala.meta._
class MyAnnotation extends StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    // print the annotated definition as source, then return it unchanged
    println(defn.show[Syntax])
    defn
  }
}
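Note that inline macro annotations like this need the scalameta paradise compiler plugin enabled in the module that uses the annotation, e.g. in sbt (the version shown is indicative; match it to your Scala version):

addCompilerPlugin("org.scalameta" % "paradise" % "3.0.0-M11" cross CrossVersion.full)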
I'm new to Salat, Casbah, and MongoDB. When I was trying to write a simple method to get all users from the db,
import DAL.Instances.User.{UserDAO, User}
import com.novus.salat._
import com.novus.salat.global._
import com.novus.salat.annotations._
import com.novus.salat.dao._
import com.mongodb.casbah.Imports._
import com.mongodb.casbah.MongoConnection
object UserRepository {
  def getAllUsers() = {
    val userList = UserDAO.find()
    userList.isEmpty match {
      case true => throw new Exception("No users in your db")
      case false => userList
    }
  }
}
I faced two errors:
Error:(29, 31) No implicit view available from Unit => com.mongodb.DBObject.
val userList= UserDAO.find()
^
Error:(29, 31) not enough arguments for method find: (implicit evidence$2: Unit => com.mongodb.DBObject)com.novus.salat.dao.SalatMongoCursor[DAL.Instances.User.User].
Unspecified value parameter evidence$2.
val userList= UserDAO.find()
^
Here is my User code:
object User {
  case class User(_id: ObjectId = new ObjectId, name: String, age: Int)
  object UserDAO extends SalatDAO[User, ObjectId](collection = MongoConnection()("fulltestdb")("user"))
}
I'm not sure what version of Salat you are using, but if you look at the signature for find, it'll give you a clue as to what the issue is:
def find[A <% DBObject](ref: A): SalatMongoCursor[ObjectType]
You need to call find with a parameter that has a view bound so that this parameter may be viewed as a DBObject. This means that an implicit conversion from A => DBObject is expected to be in scope.
In your case you aren't passing any parameter. This is being treated as Unit and so the compiler tries to find an implicit conversion from Unit => DBObject. This can't be found so compilation fails.
To fix this, your best bet is to pass in an empty DBObject; you can achieve this with MongoDBObject.empty from Casbah. You could add an implicit conversion from Unit => MongoDBObject, but I'd probably lean towards making it explicit where possible.
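A sketch of the fixed method (relying on the com.mongodb.casbah.Imports._ already in scope in the question's repository):

def getAllUsers() = {
  // MongoDBObject.empty is an empty query, i.e. "match all documents",
  // and satisfies find's A => DBObject view bound
  val userList = UserDAO.find(MongoDBObject.empty)
  userList.isEmpty match {
    case true => throw new Exception("No users in your db")
    case false => userList
  }
}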
I am trying to write a function that writes data to MongoDB using the Casbah toolkit:
import com.mongodb.casbah.WriteConcern
import ...
def saveRecord(rec: MongoDBObject) {
  mongoColl.insert(rec, WriteConcern)
}
Casbah defines WriteConcern as a Scala object. I get the following errors:
No implicit view available from Object => com.mongodb.casbah.Imports.DBObject.
not enough arguments for method insert: (implicit dbObjView: Object => com.mongodb.casbah.Imports.DBObject, implicit concern: com.mongodb.WriteConcern, implicit encoder: com.mongodb.casbah.Imports.DBEncoder)com.mongodb.casbah.Imports.WriteResult. Unspecified value parameter dbObjView.
Also, when I simply try to instantiate a WriteConcern:
val wc:WriteConcern = WriteConcern
I get this error:
not found: type WriteConcern
I have import com.mongodb.casbah.WriteConcern in the first lines of my code, so why is it not found? How can I instantiate a WriteConcern?
Thanks!
Add an import for the necessary implicit and insert like this:
import com.mongodb.casbah.Imports._
mongoColl.insert(rec)
On your question about not being able to instantiate WriteConcern: it's because there's a WriteConcern class under com.mongodb and an enum-like WriteConcern object under com.mongodb.casbah. This will work:
import com.mongodb.casbah.WriteConcern
val wc: com.mongodb.WriteConcern = WriteConcern.Normal
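And if the goal of the original insert(rec, WriteConcern) call was to control the write concern, a sketch of one way to do it (assuming Casbah's MongoCollection mirrors the Java driver's setWriteConcern) is to set it on the collection rather than passing it as a second argument, which insert would treat as another document:

import com.mongodb.casbah.Imports._
import com.mongodb.casbah.WriteConcern

// insert picks its write concern up implicitly / from the collection,
// so rec stays the only explicit argument
mongoColl.setWriteConcern(WriteConcern.Normal)
mongoColl.insert(rec)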