Slick: why would I save "Unit" to my database?

I'm new to Scala and Slick and was surprised by something in the Slick documentation:
The following primitive types are supported out of the box for
JDBC-based databases in JdbcProfile
...
Unit
...
I don't get why this list contains Unit. From my understanding, Unit is similar to Java's void, something I can neither save to nor retrieve from my database. What is the intention behind it?
edit: you can find it here.

One way to look at Slick is as a way to run Scala code with your database as the execution engine. We are working on allowing more Scala code over time. An expression that contains a Unit, e.g. in a tuple, is a valid Scala expression and thus should be runnable by Slick unless there is a good reason why not. So we support Unit.
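For example, something like the following sketch (assuming Slick 3.x; the Users table and the profile import are made up for illustration) should compile and run, with the Unit simply carried through the projection:
import slick.jdbc.H2Profile.api._   // assumed profile; use whichever JdbcProfile you target

class Users(tag: Tag) extends Table[(Long, String)](tag, "USERS") {
  def id   = column[Long]("ID", O.PrimaryKey)
  def name = column[String]("NAME")
  def *    = (id, name)
}
val users = TableQuery[Users]

// Unit never becomes a real column; it is just a valid part of the projection,
// so running this query yields a Seq[(String, Unit)].
val q = users.map(u => (u.name, ()))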


eqs on Option[UUID] type? Phantom + Cassandra + Scala

I'm using the Phantom framework to work with Cassandra and I'm trying to do an eqs on an Option field, e.g.
Address.select.where(_.id eqs Some(uuid)).one()
Then I get "value eqs is not a member of object"
Is there a way to accomplish that? I can't figure it out.
The id field is an Option[UUID], because it must be null when I'm receiving a POST request in Play Framework, but I don't know how to express this comparison in Phantom.
I also opened an issue on github.
https://github.com/websudos/phantom/issues/173
Phantom relies on a series of implicit conversions to provide most of the functionality. A very simple way to fix most of the errors you get from compiling phantom tables is to make sure the relevant import is in scope.
Before phantom 1.7.0
import com.websudos.phantom.Implicits._
After 1.7.0
import com.websudos.phantom.dsl._
Beyond the implicit mechanism, phantom will also help you with aliases to a vast number of useful objects in Cassandra:
Phantom connectors
Cassandra Consistency Levels
Keyspaces
Using a potentially null value as part of a PRIMARY KEY in CQL is also wrong, as no part of a CQL primary key can be null. It's a much better idea to move processing logic outside of phantom.
Traditionally, a tables -> db service -> api controller -> api approach is the way to build modular applications with better separation of concerns. It's best to keep simple I/O at table level, app level consistency at db service level, and all processing logic at a higher level.
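Putting both points together, here is a rough sketch of a table (phantom 1.x style; the Address/Addresses names are placeholders, and you still need a connector/session to actually run queries):
import java.util.UUID
import com.websudos.phantom.dsl._

case class Address(id: UUID, street: String)

class Addresses extends CassandraTable[Addresses, Address] {
  object id     extends UUIDColumn(this) with PartitionKey[UUID]
  object street extends StringColumn(this)

  def fromRow(row: Row): Address = Address(id(row), street(row))
}

// With the dsl import in scope eqs resolves, and the partition key is a plain
// UUID rather than an Option[UUID]:
//   Addresses.select.where(_.id eqs someUuid).one()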
Hope this helps.
Using the
import com.websudos.phantom.Implicits._
works!!!

Slick string sanitation

I want to perform a query using the SQL LIKE operation with a string parameter.
Example:
coffee <- Coffees if coffee.name like s"%$queryString%"
Is it safe?
From the Slick documentation:
Slick’s key feature are type-safe, composable queries. Slick comes with a Scala-to-SQL compiler, which allows a (purely functional) sub-set of the Scala language to be compiled to SQL queries [...]
The fact that such queries are type-safe not only catches many mistakes early at compile time, but also eliminates the risk of SQL injection vulnerabilities
I did not try it myself, but I think you are safe even when using user-supplied parameters.
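Just as an illustration (a sketch assuming Slick 3.x lifted embedding; the Coffees table and profile import are simplified stand-ins), the query can also be written with an explicit bind variable if you want to be extra careful:
import slick.jdbc.H2Profile.api._   // assumed profile; adjust to your database

class Coffees(tag: Tag) extends Table[(String, Double)](tag, "COFFEES") {
  def name  = column[String]("COF_NAME")
  def price = column[Double]("PRICE")
  def *     = (name, price)
}
val coffees = TableQuery[Coffees]

// The interpolated string stays a plain Scala value; the comparison is built
// by Slick's query compiler, not by concatenating raw SQL text.
def search(queryString: String) =
  coffees.filter(_.name like s"%$queryString%")

// Forcing an explicit bind variable instead of a literal:
def searchBound(queryString: String) =
  coffees.filter(_.name like s"%$queryString%".bind)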

Integration tests in Scala when using companions with Play2? -> Cake pattern?

I'm working on my first Scala application, where we use an ActiveRecord style to retrieve data from MongoDB.
I have models like User and Category, which all have a companion object that uses the trait:
class MongoModel[T <: IdentifiableModel with CaseClass] extends ModelCompanion[T, ObjectId]
ModelCompanion is a Salat class which provides common MongoDB CRUD operations.
This makes it possible to retrieve data like this:
User.profile(userId)
I have never had any experience with this ActiveRecord query style, but I know Rails people are using it, and I think I saw it in the Play documentation (version 1.2?) for dealing with JPA.
For now it works fine, but I want to be able to run integration tests on my MongoDB.
I can run an "embedded" MongoDB with a library. The big problem is that my host/port configuration is currently hardcoded in the MongoModel class, which is extended by all the model companions.
I want to be able to specify a different host/port when I run integration tests (or any other "profile" I could create in the future).
I understand dependency injection well, having used Spring for many years in Java, and I see the drawbacks of all this static stuff in my application. I saw that there is now a Scala-friendly way to configure a Spring application, but I'm not sure using Spring is appropriate in Scala.
I have read some things about the Cake pattern and it seems to do what I want, being a kind of typesafe, compile-time-checked Spring context.
Should I definitely go to the Cake pattern, or is there any other elegant alternative in Scala?
Can I keep using an ActiveRecord style or is it a total anti-pattern for testability?
Thanks
No static references at all: with the Cake pattern you get two different classes for your two environments, each providing its own host/port resource. Create a trait containing your resources, implement it twice (with the actual host/port information for each environment), and mix the right implementation into the appropriate companion objects (one for prod, one for test). Inside MongoModel, add a self type referencing your new trait, and refactor all host/port references in MongoModel to go through that self type, as in the sketch below.
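A minimal sketch of that idea (MongoConfig, ProdMongoConfig and TestMongoConfig are made-up names; plug your actual connection/Salat plumbing in where indicated):
trait MongoConfig {
  def mongoHost: String
  def mongoPort: Int
}

trait ProdMongoConfig extends MongoConfig {
  val mongoHost = "prod-db.internal"   // hypothetical production host
  val mongoPort = 27017
}

trait TestMongoConfig extends MongoConfig {
  val mongoHost = "localhost"
  val mongoPort = 27018                // port of the embedded MongoDB used in tests
}

// MongoModel no longer hardcodes anything; the self type says it must be mixed
// together with some MongoConfig, and all references go through that trait.
abstract class MongoModel[T] { self: MongoConfig =>
  protected lazy val connectionUri = s"mongodb://$mongoHost:$mongoPort"
  // ... build the MongoDB connection / ModelCompanion plumbing from connectionUri ...
}

// Production companion:           object User     extends MongoModel[User] with ProdMongoConfig
// Test variant, in the test code: object TestUser extends MongoModel[User] with TestMongoConfig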
I'd definitely go with the Cake Pattern.
You can read the following article, which shows an example of how to use the Cake Pattern in a Play2 application:
http://julien.richard-foy.fr/blog/2011/11/26/dependency-injection-in-scala-with-play-2-it-s-free/

Scala, Morphia and Enumeration

I need to store a Scala class with Morphia. With annotations it works well unless I try to store a collection of _ <: Enumeration.
Morphia complains that it does not have serializers for that type, and I am wondering how to provide one. For now I changed the type of the collection to Seq[String] and fill it by invoking toString on every item in the collection.
That works well; however, I'm not sure if that is the right way.
This problem is common to several of the available layers of abstraction on top of MongoDB. It all comes back to one basic reason: there is no enum equivalent in JSON/BSON. Salat, for example, has the same problem.
In fact, the MongoDB Java driver does not support enums, as you can read in the discussion going on here: https://jira.mongodb.org/browse/JAVA-268, where you can see the problem is still open. Most of the frameworks I have seen for using MongoDB from Java do not implement low-level functionality such as this. I think this choice makes a lot of sense, because they leave you the choice of how to deal with data structures not handled by the low-level driver instead of imposing a way to do it.
In general I feel that the absence of support comes not from a technical limitation but rather from a design choice. For enums there are multiple ways to map them, each with its pros and cons, while for other data types the mapping is probably simpler. I don't know the MongoDB Java driver in detail, but I guess supporting multiple "modes" would have required some refactoring (maybe that's why they are talking about a new version of serialization?).
These are two strategies I am thinking about:
If you want to index on an enum and minimize space usage, map the enum to an integer (not the ordinal, please; Java lets you assign explicit values to enum constants).
If your concern is queryability in the mongo shell, because your data will be accessed by data scientists, you would rather store the enum as its string value.
To conclude, there is nothing wrong with adding an intermediate data structure between your native object and MongoDB. Salat supports it through CustomTransformers; with Morphia you may need to do the conversion explicitly. Go for it.
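For what it's worth, a small sketch of the explicit conversion (the Status enumeration and the helper names are made up):
object Status extends Enumeration {
  val Active, Suspended, Closed = Value
}

// Store the string form; recover the enum value when reading back.
def toDb(s: Status.Value): String     = s.toString
def fromDb(raw: String): Status.Value = Status.withName(raw)

val stored: String       = toDb(Status.Active)   // "Active"
val loaded: Status.Value = fromDb(stored)        // Status.Active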

casbah mongodb more typesafe way to access object parameters

In Casbah, MongoDBObject has two methods, .getAs and .getAsOrElse, which return the relevant field's value as the type given as the type parameter.
val dbo:MongoDBObject = ...
dbo.getAs[String](param)
This must be using type casting, because we can ask for a Long as a String by giving it as the type parameter, which might cause a type cast exception at runtime. Is there any other typesafe way to retrieve the original type in the result?
This must be possible because the type information of the element should be present in getAs's output.
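To illustrate what I mean (assuming the standard Casbah import; the field name is made up):
import com.mongodb.casbah.Imports._

val dbo: MongoDBObject = MongoDBObject("count" -> 42L)

// The type parameter only drives a cast; nothing checks the stored type.
val ok: Option[Long] = dbo.getAs[Long]("count")      // Some(42)

// Asking for the wrong type still compiles; because of erasure the cast is
// unchecked, so the mismatch typically only shows up when the value is used.
val wrong: Option[String] = dbo.getAs[String]("count")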
Check out this excellent presentation on Salat by its author. What you're looking for is Salat's grater, which can convert to and from DBObject.
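For reference, a small sketch of the grater round trip (Salat 1.x style imports; the Coffee case class is made up):
import com.novus.salat._
import com.novus.salat.global._
import com.mongodb.casbah.Imports._

case class Coffee(name: String, price: Double)

// The grater serializes the whole case class to a DBObject and back, so the
// field types come from the case class instead of per-field getAs casts.
val dbo: DBObject = grater[Coffee].asDBObject(Coffee("espresso", 2.5))
val back: Coffee  = grater[Coffee].asObject(dbo)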
Disclaimer: I am biased, as I'm the author of Subset.
I built this small library, "Subset", exactly to be able to work effectively with a DBObject's fields (both scalar and sub-documents) in a type-safe manner. Look through the examples and see if it fits your needs.
The problem is that MongoDB can store multiple types for a single field, so I'm not sure what you mean by making this typesafe. There's no way to enforce it on the database side, so were you hoping that there is a way to enforce it on the Casbah side? You could just do get("fieldName") and get an Object, to be safest--but that's hardly an improvement, in my opinion.
I've been happy using Salat + Casbah, and when my database record doesn't match my Salat case class, I get a runtime exception. I just know that I have to run migration scripts when I change the types in my model, or create a new model for the new types (multiple models can be stored in the same collection). At least the Salat grater/DAO methods make it less of a hassle (you don't have to specify types every time you access a variable).