MongoDB Scala Async Driver - mongodb

I'm evaluating the MongoDB async driver:
https://github.com/mongodb/mongo-scala-driver
The API looks pretty simple, but how could I do the following transformation?
val collection: MongoCollection[Document] = database.getCollection("test")
I would like to apply an implicit transformation when I do CRUD operations on the database, but I could not find enough information in the reference documentation on how to do this.
Could anyone point me to any reference on how to do an implicit transformation?

The Scala driver is only a thin wrapper around the Java driver rather than a pure Scala implementation :( ... so you have to follow the Java driver's conventions and provide a codec. I don't have a working example, but you should be able to follow this: http://mongodb.github.io/mongo-java-driver/3.3/bson/codecs/ and register a codec for a Scala class.
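In practice that means implementing org.bson.codecs.Codec for your class and registering it alongside the driver's default registry. A minimal sketch of the idea (the Person case class and its fields are illustrative; a real codec would also need to handle the _id field and arbitrary field order):

import org.bson.{BsonReader, BsonWriter}
import org.bson.codecs.{Codec, DecoderContext, EncoderContext}
import org.bson.codecs.configuration.CodecRegistries
import org.mongodb.scala.{MongoClient, MongoCollection}

case class Person(name: String, age: Int)

// Hand-written codec: tells the driver how to map Person <-> BSON.
class PersonCodec extends Codec[Person] {
  override def encode(writer: BsonWriter, value: Person, ctx: EncoderContext): Unit = {
    writer.writeStartDocument()
    writer.writeString("name", value.name)
    writer.writeInt32("age", value.age)
    writer.writeEndDocument()
  }

  override def decode(reader: BsonReader, ctx: DecoderContext): Person = {
    reader.readStartDocument()
    val name = reader.readString("name")
    val age = reader.readInt32("age")
    reader.readEndDocument()
    Person(name, age)
  }

  override def getEncoderClass: Class[Person] = classOf[Person]
}

// Combine the custom codec with the driver defaults, then read/write Person directly.
val registry = CodecRegistries.fromRegistries(
  CodecRegistries.fromCodecs(new PersonCodec),
  MongoClient.DEFAULT_CODEC_REGISTRY)
val people: MongoCollection[Person] =
  database.withCodecRegistry(registry).getCollection[Person]("test")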

Related

How to use ReactiveMongo streaming with aggregation?

I'm using ReactiveMongo. I want to run a set of pipelines (i.e., use the Mongo aggregation framework) on a collection, and stream the results. I want to retrieve them as BSON documents.
I've seen examples that suggest something like:
coll.aggregatorContext[BSONDocument](pipelines.head, pipelines.tail)
  .prepared[AkkaStreamCursor]
  .cursor
  .documentSource()
This gets me a compilation error because I'm missing an implicit CursorProducer.Aux[BSONDocument, AkkaStreamCursor], but I have no idea how to import or construct one of these -- can't find examples. (Please don't refer me to the ReactiveMongo tests, as I can't see an example of this in them, or don't understand what I'm looking at if there is one.)
I should add that I'm using version 0.20.3 of ReactiveMongo.
If my issue is due to my choice of retrieving results as BSON documents, and there's another type that would find an existing implicit CursorProducer.Aux, I'd happily switch, if someone can tell me how to get this to compile.
So, IntelliJ is telling me that I'm missing an implicit for .prepared.
But, an sbt compile is telling me that my problem is that AkkaStreamCursor doesn't fulfill the type bounds of .prepared:
type arguments [reactivemongo.akkastream.AkkaStreamCursor] do not conform to method prepared's type parameter bounds [AC[_] <: reactivemongo.api.Cursor.WithOps[_]]
What ReactiveMongo type is available to use for this?
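For contrast, plain queries do stream fine for me with the producer import shown in the module's docs; it's only the aggregation path that won't compile. A stripped-down version of the working case (assuming coll: BSONCollection and an implicit Materializer in scope):

import scala.concurrent.Future
import akka.stream.scaladsl.Source
import reactivemongo.akkastream.{State, cursorProducer}
import reactivemongo.bson.BSONDocument

// Streaming the documents of a simple find() works with the implicit
// CursorProducer from the reactivemongo.akkastream package object.
val source: Source[BSONDocument, Future[State]] =
  coll.find(BSONDocument.empty).cursor[BSONDocument]().documentSource()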

using Calcite's ReflectiveSchema from scala

I'm experimenting with Calcite from Scala, and trying to pass a simple Scala class for creating a schema at runtime (using ReflectiveSchema), but I'm having some headaches.
For example, re-implementing the FoodMart JDBC example (which works well in Java), I'm calling it as simply as new ReflectiveSchema(new HR()), with the HR class rewritten in Scala as:
class HR {
  val emps: Array[Employee] = Array(new Employee(100, "Bill"))
}
I'm getting the error: ...SqlValidatorException: Object 'emps' not found within 'hr'. The problem seems to be that Scala compiles val fields as private fields behind accessor methods, while Calcite's ReflectiveSchema only discovers fields reachable through a class's .getFields() method, i.e. public ones.
So I suppose this direction requires a lot more hacking than a simple my_field.setAccessible(true) or similar.
Is there any other way to construct a schema through the API, avoiding reflection and the use of JSON? (See the sketch below for the kind of thing I mean.)
Thanks in advance for any suggestions.
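For concreteness, the kind of API-level construction I have in mind would look roughly like this (an untested sketch; table and field names are just illustrative): declare the row type and the rows explicitly via a ScannableTable instead of deriving them from class fields.

import org.apache.calcite.DataContext
import org.apache.calcite.linq4j.{Enumerable, Linq4j}
import org.apache.calcite.rel.`type`.{RelDataType, RelDataTypeFactory}
import org.apache.calcite.schema.{ScannableTable, Table}
import org.apache.calcite.schema.impl.{AbstractSchema, AbstractTable}
import org.apache.calcite.sql.`type`.SqlTypeName

// Row type and rows are declared explicitly, so no reflection over Scala fields.
class EmpsTable extends AbstractTable with ScannableTable {
  override def getRowType(typeFactory: RelDataTypeFactory): RelDataType =
    typeFactory.builder()
      .add("empid", SqlTypeName.INTEGER)
      .add("name", SqlTypeName.VARCHAR)
      .build()

  override def scan(root: DataContext): Enumerable[Array[AnyRef]] = {
    val rows = new java.util.ArrayList[Array[AnyRef]]()
    rows.add(Array[AnyRef](Int.box(100), "Bill"))
    Linq4j.asEnumerable(rows)
  }
}

class HrSchema extends AbstractSchema {
  override protected def getTableMap: java.util.Map[String, Table] =
    java.util.Collections.singletonMap[String, Table]("emps", new EmpsTable)
}

Registered with something like rootSchema.add("hr", new HrSchema), the emps table should then be visible to the validator without ReflectiveSchema at all.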

What is the alternative to the DataStax Cassandra core driver's DataType serialize/deserialize methods?

We are using Scala code to run jobs from Spark (1.5.2) which connect to Cassandra. The new spark-cassandra-connector (1.5) depends on cassandra-driver-core 2.2.0-RC3.
The DataType serialize/deserialize methods were removed in 2.2.0-RC3.
What is the alternative way to serialize/deserialize?
13: error: value serialize is not a member of com.datastax.driver.core.DataType.CollectionType
[ERROR] implicit def ListString2ByteBuffer(list: List[String]): ByteBuffer =
[ERROR]   DataType.list(DataType.text()).serialize(list.asJava, ProtocolVersion.NEWEST_SUPPORTED);
See: Upgrade guide
"DataType has no more references to TypeCodec, so methods that dealt with serialization and deserialization of data types have been removed... These methods must now be invoked on TypeCodec directly."
To obtain a TypeCodec you can use something like this:
CodecRegistry.DEFAULT_INSTANCE.codecFor(myDataType)
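Applied to the snippet from the question, the implicit conversion might be rewritten along these lines (a sketch; it assumes the default registry can provide a codec for list<text>):

import java.nio.ByteBuffer
import scala.collection.JavaConverters._
import com.datastax.driver.core.{CodecRegistry, DataType, ProtocolVersion, TypeCodec}

implicit def listString2ByteBuffer(list: List[String]): ByteBuffer = {
  // Look up the codec for list<text> and let it do the serialization.
  val codec: TypeCodec[java.util.List[String]] =
    CodecRegistry.DEFAULT_INSTANCE.codecFor(DataType.list(DataType.text()))
  codec.serialize(list.asJava, ProtocolVersion.NEWEST_SUPPORTED)
}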

Scala Casbah - Converting a command-line String filter to DSL filter

I have an application which streams Mongo documents, and I would like to add a feature which allows the end user to define custom filters for the outgoing documents. The application is written in Scala and uses the Casbah driver. Essentially I would like to pass the entire filter string through (e.g., "pop" $gt 1000) and convert it to a type that collection.find() can accept.
Basically, I'm looking for something like this:
val filter = """pop" $gt 1000""" //This is passed in from the command-line
val cast = ??? (need to convert the string filter into a DSL Object)
collection.find(cast) //cast should have the value "pop" $gt 1000
I've been poking around online and in the Casbah docs but I can't find a simple way to do this.
Thanks in advance.
Currently there is no way to do that in Casbah; you'd have to write your own parser.
The nearest thing I can think of is to use Jongo to parse Mongo-shell-style documents and convert them into DBObjects.
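In the same spirit, if a plain JSON filter string is acceptable instead of the Casbah DSL, the JSON parser bundled with the underlying Java driver can already produce something find() accepts. A sketch (com.mongodb.util.JSON ships with the classic Java driver that Casbah wraps):

import com.mongodb.DBObject
import com.mongodb.util.JSON

// Parse a JSON filter string into a DBObject that Casbah's find() accepts.
val filterString = """{ "pop": { "$gt": 1000 } }"""
val parsed = JSON.parse(filterString).asInstanceOf[DBObject]
collection.find(parsed)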
Alternatively, Scala 2.11 added JSR-223 scripting engine support, so you can do an evil eval:
import javax.script.ScriptEngineManager
val e = new ScriptEngineManager().getEngineByName("scala")
e.eval(""""pop" $gt 1000""")
If you are resorting to eval then make sure you are solving the issue in the correct way ;) And all the usual caveats apply - don't trust input from unknown sources.

Storing an object to a file

I want to save an object (an instance of a class) to a file, but I couldn't find any useful examples of this. Do I need to use serialization for it?
How do I do that?
UPDATE:
Here is how I tried to do it:
import scala.util.Marshal
import scala.io.Source
import scala.collection.immutable
import java.io._

object Example {
  class Foo(val message: String) extends scala.Serializable

  val foo = new Foo("qweqwe")
  val out = new FileOutputStream("out123.txt")
  out.write(Marshal.dump(foo))
  out.close()
}
First of all, out123.txt contains a lot of extra data, and it looks like it's in the wrong encoding (it is binary rather than readable text). My gut tells me there should be another, proper way to do this.
At the last ScalaDays, Heather introduced a new library which provides a cool new mechanism for serialization: pickling. I think it would be the idiomatic way to do serialization in Scala, and just what you want.
Check out the paper on this topic, the slides, and the talk from ScalaDays '13.
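A minimal sketch of what pickling usage looks like (adapted from the scala-pickling README; the case class is illustrative, and the import paths may differ between library versions):

import scala.pickling.Defaults._
import scala.pickling.json._

case class Foo(message: String)

// Pickle to a JSON representation and back.
val pickled = Foo("qweqwe").pickle
val restored = pickled.unpickle[Foo]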
It is also possible to serialize to and deserialize from JSON using Jackson. A nice wrapper that makes it Scala-friendly is Jacks.
JSON has the following advantages:
- it is simple, human-readable text
- it is a reasonably byte-efficient format
- it can be used directly by JavaScript
- it can even be natively stored and queried using a DB like MongoDB
(Edit) Example Usage
Serializing to JSON:
val json = JacksMapper.writeValueAsString[MyClass](instance)
... and deserializing
val obj = JacksMapper.readValue[MyClass](json)
Take a look at Twitter Chill to handle your serialization: https://github.com/twitter/chill. It's a Scala helper for the Kryo serialization library. The documentation/examples on the GitHub page look sufficient for your needs.
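A small sketch of chill's pooled usage (adapted from the README; the Foo class is illustrative):

import com.twitter.chill.ScalaKryoInstantiator

case class Foo(message: String)

// A pooled Kryo instance preconfigured for common Scala types.
val pool = ScalaKryoInstantiator.defaultPool
val bytes: Array[Byte] = pool.toBytesWithClass(Foo("qweqwe"))
val back = pool.fromBytes(bytes).asInstanceOf[Foo]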
Just adding my answer here for the convenience of someone like me.
The pickling library, mentioned by @4lex1v, only supports Scala 2.10/2.11, but I'm using Scala 2.12, so I'm not able to use it in my project.
Then I found BooPickle, which supports Scala 2.11 as well as 2.12!
Here's the example:
import boopickle.Default._
val data = Seq("Hello", "World!")
val buf = Pickle.intoBytes(data)
val helloWorld = Unpickle[Seq[String]].fromBytes(buf)
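Since the original question was about writing to a file, note that buf (a java.nio ByteBuffer) still has to be written to disk; a minimal sketch using java.nio (the file name is illustrative):

import java.nio.channels.FileChannel
import java.nio.file.{Paths, StandardOpenOption}

// Write the pickled ByteBuffer out through a file channel.
val channel = FileChannel.open(Paths.get("data.bin"),
  StandardOpenOption.CREATE, StandardOpenOption.WRITE)
try channel.write(buf) finally channel.close()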
For more details, please check here.