I'm writing a generic update method to simplify saving case class changes to MongoDB. My model trait for T has the following function:
def update(id: BSONObjectID, t: T)(implicit writer: OFormat[T]): Future[WriteResult] = {
collection.update(Json.obj("_id" -> id), t)
}
When I call it, it fails with the following error:
Caused by: reactivemongo.api.commands.UpdateWriteResult:
DatabaseException['The _id field cannot be changed from {_id: ObjectId('4ec58120cd6cad6afc000001')} to {_id: "4ec58120cd6cad6afc000001"}.' (code = 16837)]
Which makes sense, because MongoDB does not allow the document ID to be changed, even though it's the same value.
I'm wondering how I would remove the _id from my case class instance before updating it in MongoDB. I guess I have to transform the instance before it is converted to BSON, but I don't know how to do that. This is my example case class:
case class User(
  _id: BSONObjectID,
  email: String
)
Thanks
I agree with ipoteka; I would use the findAndModify command from ReactiveMongo. There is an example gist here that should help.
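Alternatively, if you want to keep your generic update method, one workaround is to drop the `_id` field from the serialized document before sending the update. This is only a sketch, assuming the Play JSON serialization pack and that your `OFormat[T]` writes the `_id` field; `collection`, `T`, and `WriteResult` are as in your trait:

```scala
import scala.concurrent.Future
import play.api.libs.json._
import reactivemongo.bson.BSONObjectID

// Sketch: serialize the entity, strip the immutable `_id` field, and
// update the document matched by the original id. Because `_id` is no
// longer present in the replacement document, MongoDB won't complain
// about it changing.
def update(id: BSONObjectID, t: T)(implicit writer: OFormat[T]): Future[WriteResult] = {
  val withoutId: JsObject = Json.toJson(t).as[JsObject] - "_id"
  collection.update(Json.obj("_id" -> id), withoutId)
}
```

Note that `JsObject - "_id"` simply returns a copy of the JSON object without that key, so the original instance is untouched.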
I am trying to return a User record from a database using doobie, http4s, and cats. I have been stymied by the type system, which produces the following error for the code below:
router:
val httpRoutes = HttpRoutes.of[IO] {
case GET -> Root / "second" / id =>
val intId : Integer = Integer.parseInt(id)
//if I make this ConnectionIO[Option[Unit]] it compiles, but returns a cats Free object
val userOption: ConnectionIO[Option[User]] = UserModel.findById(intId, transactor.transactor)
Ok(s"userOption is instance of: ${userOption.getClass} object: ${userOption.toString}")
}.orNotFound
model:
case class User(
id: Read[Integer],
username: Read[String],
email: Read[String],
passwordHash: Read[String], //PasswordHash[SCrypt],
isActive: Read[Boolean],
dob: Read[Date]
) {
// def verifyPassword(password: String) : VerificationStatus = SCrypt.checkpw[cats.Id](password, passwordHash)
}
object UserModel {
def findById[User: Read](id: Integer, transactor: Transactor[ConnectionIO]): ConnectionIO[Option[User]] = findBy(fr"id = ${id.toString}", transactor)
private def findBy[User: Read](by: Fragment, transactor: Transactor[ConnectionIO]): ConnectionIO[Option[User]] = {
(sql"SELECT id, username, email, password_hash, is_active, dob FROM public.user WHERE " ++ by)
.query[User]
.option
.transact(transactor)
}
}
Error:
Error:(35, 70) Cannot find or construct a Read instance for type:
core.model.User
This can happen for a few reasons, but the most common case is that a data
member somewhere within this type doesn't have a Get instance in scope. Here are
some debugging hints:
- For Option types, ensure that a Read instance is in scope for the non-Option
version.
- For types you expect to map to a single column ensure that a Get instance is
in scope.
- For case classes, HLists, and shapeless records ensure that each element
has a Read instance in scope.
- Lather, rinse, repeat, recursively until you find the problematic bit.
You can check that an instance exists for Read in the REPL or in your code:
scala> Read[Foo]
and similarly with Get:
scala> Get[Foo]
And find the missing instance and construct it as needed. Refer to Chapter 12
of the book of doobie for more information.
val userOption: ConnectionIO[Option[User]] = UserModel.findById(intId, transactor.transactor)
If I change the line from ConnectionIO[Option[User]] to ConnectionIO[Option[Unit]] it compiles and runs, but returns a Free(...) object from the cats library which I have not been able to figure out how to parse, and I don't see why I shouldn't be able to return my case class!
Also see the type declarations on the findBy and findById methods. Before I added those, there was a compile error saying it found a User but required a Read[User]. I tried applying the same type declaration to the invocation of findById in the router, but it gave the same error as above.
Thank you for your help in advance, and please be patient with my ignorance. I've never encountered a type system so much smarter than me!
There's a lot to unpack here...
You don't need to wrap fields in User in Read.
Parameterizing the functions with User is not necessary, since you know what type you are getting back.
Most of the time if you manually handle Read instances, you're doing something wrong. Building a Read instance is only useful for when the data you're reading doesn't directly map to your type.
Transactor is meant to be a conversion from ConnectionIO (some action over a JDBC connection) to some other monad (e.g. IO) by summoning a connection, performing the action in a transaction, and disposing of said action. Transactor[ConnectionIO] doesn't really make much sense with this, and can probably lead to deadlocks (since you will eventually try to summon a connection while you are holding onto one). Just write your DB logic in ConnectionIO, and transact the whole thing afterwards.
Integer is not used in Scala code other than to interop with Java, and Doobie doesn't have Get/Put instances for it.
In your routes you take a ConnectionIO[Option[User]] and call .toString on it. This doesn't do what you want: it just turns the action you've built into a useless string, without actually evaluating it. To actually get an Option[User] you need to evaluate the action.
Putting all of that together, we end up with a piece of code like this:
import java.util.Date
import cats.effect.IO
import doobie.{ConnectionIO, Fragment, Transactor}
import doobie.implicits._
import org.http4s.HttpRoutes
import org.http4s.dsl.io._
import org.http4s.syntax.kleisli._
def httpRoutes(transactor: Transactor[IO]) = HttpRoutes.of[IO] {
case GET -> Root / "second" / IntVar(intId) =>
UserModel.findById(intId)
.transact(transactor)
.flatMap { userOption =>
Ok(s"userOption is instance of: ${userOption.getClass} object: ${userOption.toString}")
}
}.orNotFound
final case class User(
id: Int,
username: String,
email: String,
passwordHash: String,
isActive: Boolean,
dob: Date
)
object UserModel {
def findById(id: Int): ConnectionIO[Option[User]] = findBy(fr"id = $id")
private def findBy(by: Fragment): ConnectionIO[Option[User]] =
(sql"SELECT id, username, email, password_hash, is_active, dob FROM public.user WHERE " ++ by)
.query[User]
.option
}
userOption here is Option[User].
I'm using ReactiveMongo in my Play Framework app, and I noticed that all documents represented as, for example,
{name: "Robert", age: 41 }
are stored in MongoDB as
{_id: { $oid:"574005977e356b7310bcdc8d"}, name: "Robert", age: 41 }
and that's fine. This is the method I use to save the documents:
// Scala code
def save(document: JsObject)
(implicit ec: ExecutionContext): Future[WriteResult] = {
collection.insert(document)
}
The latter representation is also what I get when I fetch the same document from the DB, using this method:
def find(query: JsObject, queryOptions: QueryOpts, projection: JsObject,
sort: JsObject, pageSize: Int)
(implicit ec: ExecutionContext): Future[List[JsObject]] = {
collection.find(query, projection)
.options(queryOptions)
.sort(sort)
.cursor[JsObject](ReadPreference.primaryPreferred)
.collect[List](pageSize)
}
but in this case I'd like to get a representation like
{_id: "574005977e356b7310bcdc8d", name: "Robert", age: 41 }
in order to send the documents to the requesting client via my RESTful API. How can I get this?
You can use JSON transformers: see "Case 6: Prune a branch from input JSON".
...
.collect[List](pageSize)
.map(JsArray)
.map(
_.transform(
Reads.list(
(__ \ "_id").json.prune
)
)
)
... then transform the JsResult to your needs
First, let me say that this representation is not what is stored in MongoDB (which uses BSON), but how the document is serialized using the JSON extended syntax (see the JSON documentation for ReactiveMongo).
Then, when using .cursor[T] on a query builder, you are free to provide a custom document reader (in the implicit scope).
When using the JSON serialization pack, it means providing the appropriate Reads[T].
I would also add that the functions .find and .save are essentially what the ReactiveMongo collection API already provides.
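For instance, a custom Reads that flattens the extended-JSON `_id` into a plain string might look like this. This is a sketch using standard Play JSON transformers, not tied to a specific ReactiveMongo version:

```scala
import play.api.libs.json._

// Rewrite {_id: {$oid: "..."}, ...} into {_id: "...", ...}.
// `update` merges the result of the inner Reads back into the document,
// and `copyFrom` pulls the string out of the nested "$oid" field.
val flattenId: Reads[JsObject] = __.json.update(
  (__ \ "_id").json.copyFrom((__ \ "_id" \ "$oid").json.pick)
)
```

Applying `doc.transform(flattenId)` then yields a `JsResult[JsObject]` where `_id` is the plain hex string the client expects.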
I am currently integrating part of our system with MongoDB and we decided to use the official scala driver for it.
We have a case class with joda.DateTime parameters:
case class Schema(templateId: Muid,
createdDate: DateTime,
updatedDate: DateTime,
columns: Seq[Column])
We also defined a format for it:
implicit lazy val checklistSchemaFormat : Format[Schema] = (
(__ \ "templateId").format[Muid] and
(__ \ "createdDate").format[DateTime] and
(__ \ "updatedDate").format[DateTime] and
(__ \ "columns").format[Seq[Column]]
)((Schema.apply _), unlift(Schema.unapply))
When I serialize this object to JSON and write it to Mongo, createdDate and updatedDate get converted to Long (which is technically fine). This is how we do it:
val bsonDoc = Document(Json.toJson(schema).toString())
collection(DbViewSchemasCollectionName).replaceOne(filter, bsonDoc, new UpdateOptions().upsert(true)).subscribe(new Observer[UpdateResult] {
override def onNext(result: UpdateResult): Unit = logger.info(s"Successfully updates checklist template schema with result: $result")
override def onError(e: Throwable): Unit = logger.info(s"Failed to update checklist template schema with error: $e")
override def onComplete(): Unit = {}
})
as a result Mongo has this type of object:
{
"_id": ObjectId("56fc4247eb3c740d31b04f05"),
"templateId": "gaQP3JIB3ppJtro9rO9BAw",
"createdDate": NumberLong(1459372615507),
"updatedDate": NumberLong(1459372615507),
"columns": [
...
]
}
Now, I am trying to read it like so:
collection(DbViewSchemasCollectionName).find(filter).first().head() map { document =>
ChecklistSchema.checklistSchemaFormat reads Json.parse(document.toJson()) match {
case JsSuccess(value, _) => {
Some(value)
}
case JsError(e) => throw JsResultException(e)
}
} recover { case e =>
logger.info("ERROR " + e)
None
}
And at this point the read always fails, since createdDate and updatedDate now look like this:
"createdDate" : { "$numberLong" : "1459372615507" }, "updatedDate" :
{"$numberLong" : "1459372615507" }
How do I deal with this situation? Is there an easier conversion between bson.Document and JsObject? Or am I digging in completely the wrong direction?
Thanks,
You can use the following approach to resolve your issue.
First, I used json4s for reading and writing JSON to case classes.
example:
case class User(_id: Option[Int], username: String, firstName: String, createdDate: DateTime , updatedDate: DateTime )
// A small wrapper to convert a case class to a Document
def toBson[A <: AnyRef](x: A): Document = {
  val json = write[A](x)
  Document(json)
}
def today() = DateTime.now
val user = User(Some(212),"binkabir","lacmalndl", today , today)
val bson = toBson(user)
usersCollection.insertOne(bson).subscribe((x: Completed) => println(x))
val f = usersCollection.find(equal("_id",212)).toFuture()
f.map(_.head.toJson).map( x => read[User](x)).foreach(println)
The code above will create a User case class instance, convert it to a Document, save it to MongoDB, query the DB, and print the returned User case class.
I hope this makes sense!
To answer your second question (bson.Document <-> JsObject): yes, this is a solved problem. Check out Play2-ReactiveMongo, which makes it seem like you're storing/retrieving JsObject instances - plus it's fully asynchronous and super-easy to get going in a Play application.
You can even go a step further and use a library like Mondrian (full disclosure: I wrote it!) to get the basic CRUD operations on top of ReactiveMongo for your Play-JSON domain case-classes.
Obviously I'm biased, but I think these solutions are a great fit if you've already defined your models as case classes in Play - you can forget about the whole BSONDocument family and stick to Json._ and JsObject etc that you already know well.
EDIT:
At the risk of further downvotes, I will demonstrate how I would go about storing and retrieving the OP's Schema object using Mondrian. I'm going to show pretty much everything for completeness; bear in mind you've actually already done most of this. Your final code will have fewer lines than your current code, as you'd expect when you use a library.
Model Objects
There's a Column class referenced here that's never defined; for simplicity, let's just say it's:
case class Column (name:String, width:Int)
Now we can get on with the Schema, which is just:
import com.themillhousegroup.mondrian._
case class Schema(_id: Option[MongoId],
createdDate: DateTime,
updatedDate: DateTime,
columns: Seq[Column]) extends MongoEntity
So far, we've just implemented the MongoEntity trait, which only required the templateId field to be renamed to _id and given the required type.
JSON Converters
import com.themillhousegroup.mondrian._
import play.api.libs.json._
import play.api.libs.functional.syntax._
object SchemaJson extends MongoJson {
implicit lazy val columnFormat = Json.format[Column]
// Pick one - easy:
implicit lazy val schemaFormat = Json.format[Schema]
// Pick one - backwards-compatible (uses "templateId"):
implicit lazy val checklistSchemaFormat : Format[Schema] = (
(__ \ "templateId").formatNullable[MongoId] and
(__ \ "createdDate").format[DateTime] and
(__ \ "updatedDate").format[DateTime] and
(__ \ "columns").format[Seq[Column]]
)((Schema.apply _), unlift(Schema.unapply))
}
The JSON converters are standard Play-JSON stuff; we pick up the MongoId Format by extending MongoJson. I've shown two different ways of defining the Format for Schema. If you have clients out in the wild using templateId (or if you prefer it) then use the second, more verbose declaration.
Service layer
For brevity I'll skip the application-configuration, you can read the Mondrian README.md for that. Let's define the SchemaService that is responsible for persistence operations on Schema instances:
import com.themillhousegroup.mondrian._
import SchemaJson._
class SchemaService extends TypedMongoService[Schema]("schemas")
That's it. We've linked the model object, the name of the MongoDB collection ("schemas") and (implicitly) the necessary converters.
Saving a Schema and finding a Schema based on some criteria
Now we start to realize the value of Mondrian. save and findOne are standard operations - we get them for free in our Service, which we inject into our controllers in the standard way:
class SchemaController @Inject() (schemaService: SchemaService) extends Controller {
...
// Returns a Future[Boolean]
schemaService.save(mySchema).map { saveOk =>
...
}
...
...
// Define a filter criteria using standard Play-JSON:
val targetDate = new DateTime()
val criteria = Json.obj("createdDate" -> Json.obj("$gt" ->targetDate.getMillis))
// Returns a Future[Option[Schema]]
schemaService.findOne(criteria).map { maybeFoundSchema =>
...
}
}
So there we go. No sign of the BSON family, just the Play JSON that, as you say, we all know and love. You'll only need to reach for the Mongo documentation when you need to construct a JSON query (that $gt stuff) although in some cases you can use Mondrian's overloaded findOne(example:Schema) method if you are just looking for a simple object match, and avoid even that :-)
I'm very new to Scala, Play Framework and Squeryl. I already understand the concepts of val and var, but I'm having a hard time trying to model my entities. As I've seen in the Squeryl documentation, sometimes they use var on the id and other times val. What is the best approach for the id and the other fields? (Sometimes they use var/val and other times Option; the latter is only for nullable fields on entities.)
Example 1
class Playlist(var id: Long,
var name: String,
var path: String) extends KeyedEntity[Long] {
}
Example 2
class Author(val id: Long,
val firstName: String,
val lastName: String,
val email: Option[String]) {
def this() = this(0,"","",Some(""))
}
And why sometimes they extend the KeyedEntity[T] and sometimes don't?
I really appreciate some help!
In Squeryl 0.9.5, all entities needed to extend KeyedEntity[T]; with 0.9.6, however, you can provide the KeyedEntityDef implicitly. See this for an example.
Option[T] is used when the field can contain null values. When the field is null, None is returned.
As for val vs. var, it is exactly as with any other class in Scala: var allows reassignment, whereas val is, more or less, read-only. If you are going to change values, a lot of people simply make the field a var. Alternatively, if you are using a case class, you can use copy to create a new object with updated values, or you can update the value via reflection.
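To illustrate the immutable alternative: with a case class of vals you never mutate a row object, you build an updated copy instead. The class mirrors Example 2 from the question; the concrete field values are made up:

```scala
// Immutable entity: all fields are vals; Option models a nullable column.
case class Author(
  id: Long,
  firstName: String,
  lastName: String,
  email: Option[String] = None
)

val a = Author(1L, "Ada", "Lovelace", None)

// copy() produces a new instance with one field changed; `a` is untouched.
val updated = a.copy(email = Some("ada@example.com"))
```

This style pairs naturally with Squeryl's update DSL, since you hand the freshly built copy to the update call rather than mutating a shared object.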
My database looks like
[
{
name: "domenic",
records: {
today: 5,
yesterday: 1.5
}
},
{
name: "bob",
records: { ... }
}
]
When I try queries like
val result: Option[DBObject] = myCollection.findOne(
  MongoDBObject("name" -> "domenic"),
  MongoDBObject("records" -> 1)
)
val records = result.get.getAs[BasicDBObject]("records").get
grater[Map[String, Number]].asObject(records)
it fails (at runtime!) with
GRATER GLITCH - unable to find or instantiate a grater using supplied path name
REASON: Class scala.collection.immutable.Map is an interface
Context: 'global'
Path from pickled Scala sig: 'scala.collection.immutable.Map'
I think I could make this work by creating a case class whose only field is a Map[String, Number] and then getting its property. Is that really necessary?
grater doesn't take a collection as its type argument, only a case class or a trait/abstract class whose concrete representations are case classes. Since you're just querying for a map, extract the values you need out of the DBObject using getAs[T].
Number may not be a supported type in Salat; I've certainly never tried it. If you need Number, you can write a custom transformer or send a pull request to add proper support to Salat.
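As a sketch of that extraction, assuming Casbah's usual imports and your `result` from the question (the `Number` cast is illustrative and only safe if the sub-document really holds numeric values):

```scala
import scala.collection.JavaConverters._
import com.mongodb.casbah.Imports._

// Pull the embedded "records" sub-document out of the query result and
// turn it into a plain Scala Map, without involving Salat's grater.
val records: Map[String, Number] =
  result
    .flatMap(_.getAs[BasicDBObject]("records"))
    .map { dbo =>
      dbo.keySet.asScala.map(k => k -> dbo.get(k).asInstanceOf[Number]).toMap
    }
    .getOrElse(Map.empty)
```

No intermediate case class is needed; the map is built directly from the keys of the sub-document.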