Casbah Scala MongoDB driver - a strange error

I am trying to use Casbah, and I get a strange error right at the beginning, on this line:
val mongoDB = MongoConnection("MyDatabase")
The error on MongoConnection says:
class file needed by MongoConnection is missing. reference type
MongoOptions of package com.mongodb refers to nonexisting symbol.
I do not know what to do with this. The jars that I have attached to my projects are:
casbah-commons_2.9.1-3.0.0-SNAPSHOT.jar
casbah-core_2.9.1-3.0.0-SNAPSHOT.jar
casbah-gridfs_2.9.1-3.0.0-SNAPSHOT.jar
casbah-query_2.9.1-3.0.0-SNAPSHOT.jar
casbah-util_2.9.1-3.0.0-SNAPSHOT.jar
which looks like a full setup of Casbah, and I do not understand what it might be yearning for. So, question number one: what do I have to do to resolve this problem?
Question number two: the Casbah tutorial says that I could import just one thing and get the mongoConn() method, which is also not true. The mongoConn() method simply does not get found if I follow the instructions. So, how can I get everything to work as in the tutorial?

I don't know the details of your setup, but it seems like you are not referencing the dependencies of the casbah-commons module.
According to the docs, those are:
mongo-java-driver, scalaj-collection, scalaj-time, JodaTime, slf4j-api
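If the project is built with sbt rather than hand-attached jars, the transitive dependencies come along automatically. A minimal sketch of a build.sbt, where the group id and snapshot repository are assumptions based on Casbah's published snapshots, and the version is taken from the jars listed above:

// build.sbt - a sketch; group id and repository URL are assumptions.
resolvers += "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"

// Pulls in casbah-commons/core/query/gridfs together with their
// documented dependencies (mongo-java-driver, scalaj-collection,
// scalaj-time, JodaTime, slf4j-api) instead of attaching jars by hand.
libraryDependencies += "com.mongodb.casbah" %% "casbah" % "3.0.0-SNAPSHOT"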

Related

Spark cannot find case class on classpath

I have an issue where Spark is failing to generate code for a case class. Here is the Spark error:
Caused by: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 52, Column 43: Identifier expected instead of '.'
Here is the referenced line in the generated code
/* 052 */ private com.avro.message.video.public.MetricObservation MapObjects_loopValue34;
It should be noted that com.avro.message.video.public.MetricObservation is a nested case class that is part of a larger hierarchy, and it is used without problems in other places in the code. It should also be noted that this pipeline works fine if I use the RDD API, but I want to use the Dataset API because I want to write the Dataset out as Parquet. Has anyone seen this issue before?
I'm using Scala 2.11 and Spark 2.1.0. I was able to upgrade to Spark 2.2.1 and the issue is still there.
Do you think that SI-7555 or something like it has any bearing on this? I have noticed in the past that Scala reflection has had issues generating TypeTags for statically nested classes. Do you think something like that is going on, or is this strictly a Catalyst issue in Spark? You might want to file a Spark ticket too.
So it turns out that changing the package name of the affected class "fixes" the problem (i.e., makes it go away). I really have no idea why this is, or even how to reproduce it in a small test case. What worked for me was just creating a higher-level package. Specifically, com.avro.message.video.public -> com.avro.message.publicVideo.
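One plausible explanation, though the thread does not confirm it: the failing file is generated Java source ('generated.java'), and public is a reserved word in Java, so a package segment named public is legal in Scala but can never appear in an identifier in generated Java code. A minimal sketch of the rename that made the error go away (field names are hypothetical):

// Before (legal Scala, but the segment `public` is a Java keyword, so
// the Java source Catalyst generates for this type cannot be parsed):
// package com.avro.message.video.public

// After: no Java keyword anywhere in the package path.
package com.avro.message.publicVideo

// Hypothetical fields for illustration; only the package rename matters.
case class MetricObservation(videoId: String, value: Double)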

Play! Framework Templating Engine issues importing long class names

I've got a List of classes I want to send down to a Scala template in Play! Framework 2.2.3; however, I ran into some issues while trying to do so.
The class I want the list to contain is an arbitrary class type that comes from a package outside of my workspace, but not natively from Java. See the picture below.
Note: I do not have a project/Build.scala file.
The above image represents the first line in my Scala template; I have tried to use @import as well (@import com.***.***.type._, @import com.***.***.type.Version, etc.), but to no avail.
This is the error message given to me by Play! Framework.
Is there an issue with the namespacing? Everything works fine when using classes located in my workspace.
The paths are correct; I've double-checked that. For certain reasons I cannot disclose more code in this region; if more information is required, please ask for it and I'll edit the post.
The problem is the package named type. That word is reserved in Scala as a language keyword, so you need to escape it with backticks, like this:
@import com.yourpackage.`type`.Version
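More generally, backticks let a reserved word be used as an identifier anywhere in Scala, both where it is declared and where it is referenced. A small self-contained sketch (all names hypothetical):

// A package segment named `type` must be backtick-escaped both where
// the package is declared and where it is imported.
package com.example.`type` {
  class Version(val number: Int)
}

object Demo extends App {
  import com.example.`type`.Version
  println(new Version(1).number) // prints 1
}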

Problems compiling routes after migrating to Play 2.1

After migrating to Play 2.1 I ran into a problem: the routes compiler stopped working for my routes file. It was completely fine with Play 2.0.4, but now I'm getting a build error and can't find any workaround for it.
In my project I'm using the cake pattern, so controller actions are visible not through <package>.<controller class>.<action> but through <package>.<component registry>.<controller instance>.<action>. The new Play routes compiler uses all action-path components except the last two to form the package name that will be used in managed sources (as far as I can tell from the code in https://github.com/playframework/Play20/blob/2.1.0/framework/src/routes-compiler/src/main/scala/play/router/RoutesCompiler.scala). In my case this leads to a situation where <package>.<component registry> is chosen as the package name, which results in an error during the build:
[error] server/target/scala-2.10/src_managed/main/com/grumpycats/mmmtg/componentsRegistry/routes.java:5: componentsRegistry is already defined as object componentsRegistry
[error] package com.grumpycats.mmmtg.componentsRegistry;
I made the sample project to demonstrate this problem: https://github.com/rmihael/play-2.1-routes-problem
Is it possible to work around this problem somehow without dropping the cake pattern for controllers? It's a pity that I can't proceed with Play 2.1 due to this problem.
Because of reputation I cannot post a comment.
The convention is that classes and objects start with upper case, and this convention applies to pattern matching as well. Looking at a string, there seems to be no difference between a package object and a normal object (apart from the case). I am not sure how Play 2.1 handles things; that's why this is not an answer but a comment.
You could try the new @ syntax in the router. That allows you to create an instance from the Global class. You would still specify <package>.<controller class>.<action>, but in the Global you get it from somewhere else (for example a component registry).
You can find a bit of extra information here, under 'Managed Controller classes instantiation': http://www.playframework.com/documentation/2.1.0/Highlights
This demo project shows its usage: https://github.com/guillaumebort/play20-spring-demo
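A rough sketch of that approach, assuming Play 2.1's GlobalSettings API; the registry below is a hypothetical stand-in for a cake-pattern component registry:

import play.api.GlobalSettings

// Hypothetical registry; a real cake-pattern setup would return the
// controller instance wired inside the registry.
object ComponentsRegistry {
  def controllerFor[A](controllerClass: Class[A]): A =
    controllerClass.newInstance()
}

object Global extends GlobalSettings {
  // Play 2.1 calls this for every route whose controller is prefixed
  // with @, e.g. `GET / @controllers.Application.index` in conf/routes.
  override def getControllerInstance[A](controllerClass: Class[A]): A =
    ComponentsRegistry.controllerFor(controllerClass)
}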

Unable to find Within method using Akka TestKit

I see lots of documents referencing the "Within" method on the TestKit.
There appears to be something wrong with my tooling because I'm not able to import that method.
Examples of usage:
http://doc.akka.io/docs/akka/2.0.2/scala/testing.html
It should be available as of at least Akka 2.0:
http://doc.akka.io/api/akka/2.0/akka/testkit/TestKit.html
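One thing worth checking: in the API the method is lowercase within, and it is an instance method on TestKit itself rather than something you import, which may be why the tooling cannot find it. A minimal sketch, assuming Akka 2.x with Scala 2.10+ (on Akka 2.0 the duration import was akka.util.duration._):

import akka.actor.ActorSystem
import akka.testkit.TestKit
import scala.concurrent.duration._

// `within` comes from TestKit, so the test class must extend it.
class WithinExample extends TestKit(ActorSystem("demo")) {
  def run(): Unit = within(500.millis) {
    testActor ! "ping" // testActor feeds the TestKit's own message queue
    expectMsg("ping")  // must arrive before the 500 ms window closes
  }
}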

MongoDB Java / Scala drivers - Missing methods

I'm trying to convert a persistence layer from a plain old database (using ScalaQuery) to MongoDB, and I'm running into an odd issue. I use the Casbah driver, which is a Scala wrapper around the official MongoDB Java driver. Both the Java and Scala drivers define - according to the docs and the overview of the .jar when I open it in Eclipse - a method findOneById that takes a single DBObject as a parameter (with an ID in it).
However, when I try to access it, I get a missing-method error from the Scala compiler, both in Eclipse and SBT (Scala version 2.9.0-1, SBT 0.10.1).
What might cause this? Is this perhaps a known SBT / Scala compiler bug?
I just removed my entire local repository so that all dependencies get downloaded fresh, but this didn't fix the problem.
Are you sure that you call findOneById on a MongoCollection instance?
Maybe it's the parameter type that is wrong. As far as I can see in the documentation (http://api.mongodb.org/scala/casbah/2.1.2/scaladoc/com/mongodb/casbah/MongoCollection.html), findOneByID should take an id of type AnyRef and optionally the fields to return.
You should try something like mongoCollection.findOneByID(1.asInstanceOf[Object]).
Regarding DBObject, it seems that it doesn't appear in the list of parameters (except as an implicit parameter used to convert the fields that you request into a DBObject). Maybe the signature of the method changed since a previous release.
Hope this will help.
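Putting the answer together, a minimal sketch against the Casbah 2.x API linked above (database and collection names are hypothetical); note the capital "ID" and the AnyRef parameter:

import com.mongodb.casbah.Imports._

object FindByIdDemo extends App {
  // Hypothetical database and collection names.
  val coll = MongoConnection()("mydb")("mycollection")

  // Casbah's method is findOneByID (capital ID): it takes the id itself
  // as AnyRef, not a DBObject, and returns an Option[DBObject].
  val doc: Option[DBObject] = coll.findOneByID(1.asInstanceOf[AnyRef])
  println(doc)
}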