I want to store java.util.Date (or Timestamp) as an int (or Long) in my db using Squeryl.
I'd like to control how the date is transformed to the number and vice versa.
How can I achieve this?
I'm pretty new to Scala/Squeryl, coming from Java/Hibernate.
Back in Java/Hibernate I could create user types and either register them globally or use them locally on a field with an annotation. Such a user type defined how to persist the object to the DB and how to load it back.
I read some of the Squeryl and Scala docs and noticed two things:
there are custom types
there is an implicit function mechanism that is called for conversions
I know one of these can help me, but I didn't find any good, full examples to understand how.
Any help is appreciated!
Please see this example:
https://github.com/max-l/squeryl-extended-field-types-example/blob/master/src/main/scala/example/MyCustomTypes.scala
In your case, you substitute TLong for TTimestamp (the backing JDBC type) and Date for DateTime (though you might want to consider using the Joda DateTime instead).
implicit val dateAsLongTEF = new NonPrimitiveJdbcMapper[Long, Date, TLong](longTEF, this) {
  // how a BIGINT read via JDBC becomes a java.util.Date
  def convertFromJdbc(v: Long) = new Date(v)
  // how a java.util.Date is written back as a BIGINT
  def convertToJdbc(v: Date) = v.getTime
}

implicit val optionDateAsLongTEF =
  new TypedExpressionFactory[Option[Date], TOptionLong]
    with DeOptionizer[Long, Date, TLong, Option[Date], TOptionLong] {
    val deOptionizer = dateAsLongTEF
  }
Note: the fact that you use TLong and TOptionLong means that you'll be able to compare a number column to a Long-backed column in the DSL.
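A hypothetical usage sketch (the entity, schema, and field names are made up; it assumes the implicits above live in your own PrimitiveTypeMode object, as in the linked example, and are imported where the schema is defined):

// createdAt is persisted as BIGINT but exposed as java.util.Date in the DSL
class Event(val id: Long, val createdAt: java.util.Date) extends KeyedEntity[Long]

object MySchema extends org.squeryl.Schema {
  val events = table[Event]
}

def eventsAfter(cutoff: java.util.Date) =
  from(MySchema.events)(e => where(e.createdAt > cutoff) select e)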
Update: there is a limitation that prevents re-registering a primitive type, so you'll need to have a wrapper type. I updated the example in the GitHub project.
I recently added a new field to my Scala case class, since I want to start keeping track of it in my MongoDB.
Let's say the case class looks like this:
case class MyItem(
  var existingField: String,
  var newlyAddedField: String
)
I use this to serialize/deserialize JSON and BSON to my object:
object MyItem {
  import play.api.libs.json.Json
  import reactivemongo.bson.Macros

  // generate readers and writers:
  implicit val jsonFormats = Json.format[MyItem]
  implicit val bsonFormats = Macros.handler[MyItem]
}
As all the existing data in my DB doesn't have newlyAddedField, I get a runtime exception:
reactivemongo.bson.exceptions.DocumentKeyNotFound: The key 'newlyAddedField' could not be found in this document or array
Could anyone help? I've read about writing my own serialization/deserialization, but I'm not sure how to do that, as I'm using Play Framework, whose syntax and conventions differ across versions. I hope there is a simpler way, as adding a field should be a common task with a NoSQL DB. By the way, I'm using Play Framework 2.5.
Thanks in advance!
AFAIU, Macros.handler is null-safe, i.e. it doesn't set the value to null if the field is missing. I think the simplest and cleanest fix in Scala is to declare your new field as Option[String], so that everyone (including the macro code generator) can see that the field might be absent. This also seems to be what the docs suggest.
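For instance, a minimal sketch of that change (the macros then deserialize a missing key as None):

case class MyItem(
  var existingField: String,
  // Option marks the field as possibly absent in older documents
  var newlyAddedField: Option[String]
)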
Scala Issue:
JSON data is extracted and stored into a case class. The time string needs to be converted to a java.sql.Timestamp for the Spark DataFrame, and to a Java/Joda Date for the Salat DAO/MongoDB store, and neither supports the other's format.
Currently we are using two case classes for the same data:
case class A(a: Int, b: String, time: java.sql.Timestamp)
case class B(a: Int, b: String, time: java.util.Date)
A JSON extractor method then populates one of the two case classes based on the store type (Spark/Mongo).
Is there a better way to handle this? (A composite class is one option, but it quickly gets too nested.)
Note that the case classes can even be nested (A containing C and D, which in turn can have time fields within them).
I would think about the application domain first. Timestamp or Date is an implementation detail depending on your data store.
My suggested solution would be:
case class MyDomainObject(a: Int, b: String, time: java.time.Instant)

object MyDomainObject {
  def fromMongoObject(o: MyDomainMongoObject): MyDomainObject = ???
  def fromSparkObject(o: MyDomainSparkObject): MyDomainObject = ???
}
(NOTE: I picked java.time.Instant as an example; you can choose whatever time representation you prefer.)
The classes/functions that deal with Mongo/Spark will extract objects into MyDomainMongoObject and MyDomainSparkObject respectively, which are then converted using the methods in the companion object. This way you keep your domain clean by thinking about only one representation of time, while every datastore adapter can use its own type.
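A minimal sketch of those conversions, assuming the store-specific classes hold java.sql.Timestamp and java.util.Date (both have a toInstant since Java 8):

import java.time.Instant

// hypothetical store-specific representations
case class MyDomainSparkObject(a: Int, b: String, time: java.sql.Timestamp)
case class MyDomainMongoObject(a: Int, b: String, time: java.util.Date)

case class MyDomainObject(a: Int, b: String, time: Instant)

object MyDomainObject {
  def fromMongoObject(o: MyDomainMongoObject): MyDomainObject =
    MyDomainObject(o.a, o.b, o.time.toInstant)

  def fromSparkObject(o: MyDomainSparkObject): MyDomainObject =
    MyDomainObject(o.a, o.b, o.time.toInstant)
}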
I have several objects that closely (but not perfectly) mirror other objects in Scala. For example, I have a PackagedPerson that has all of the same fields as the PersonModel object, plus some. (The PackagedPerson adds in several fields from other entities, things that are not on the PersonModel object).
Generally, the PackagedPerson is used for transmitting a "package" of person-related things over REST, or receiving changes back (again over REST).
When preparing these transactions, I have a pack method, such as:
def pack(p: PersonModel): PackagedPerson
After all the preamble is out of the way (for instance, loading optional, extra objects that will be included in the package), I create a PackagedPerson from the PersonModel and "everything else:"
new PackagedPerson(
  p.id, p.name,            // these (and more) from the model object
  x.profilePicture, y.etc  // these from elsewhere
)
In many cases, the model object has quite a few fields. My question is: how can I minimize repetitive code?
In a way it's like unapply and apply, except that there are "extra" parameters, so what I really want is something like this:
new PackagedPerson(p.unapply(), x.profilePicture, y.etc)
But obviously that won't work. Any ideas? What other approaches have you taken for this? I very much want to keep my REST-compatible "transport objects" separate from the model objects. Sometimes this "packaging" is not necessary, but sometimes there is too much delta between what goes over the wire, and what gets stored in the database. Trying to use a single object for both gets messy fast.
You could use LabelledGeneric from shapeless.
You can convert between a case class and its generic representation.
import shapeless.LabelledGeneric
import shapeless.record._
import shapeless.syntax.singleton._

case class Person(id: Int, name: String)
case class PackagedPerson(id: Int, name: String, age: Int)

def packagePerson(person: Person, age: Int): PackagedPerson = {
  val personGen = LabelledGeneric[Person]
  val packPersonGen = LabelledGeneric[PackagedPerson]
  // turn a Person into a generic record representation
  val rec = personGen.to(person)
  // add the age field to the record
  // and turn the updated record into a PackagedPerson
  packPersonGen.from(rec + ('age ->> age))
}
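Hypothetical usage:

val packaged = packagePerson(Person(1, "Ada"), 30)
// packaged == PackagedPerson(1, "Ada", 30)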
Probably the order of the fields of your two case classes won't correspond as nicely as in my simple example. If that's the case, shapeless can reorder your fields using Align. Look at this brilliant answer on another question.
You can try Java/Scala reflection. Create a method that accepts a person model, all other models and model-free parameters:
def pack(p: PersonModel, others: Seq[Model], freeParams: (String, Any)*): PackedPerson
In the method, you reflectively obtain PackedPerson's constructor and see what arguments go there. Then you (reflectively) iterate over the fields of PersonModel, the other models, and the free args: whenever a field's name and type match one of the constructor params, you save it. Finally, you invoke the PackedPerson constructor reflectively using the saved args.
Keep in mind, though, that prior to Scala 2.11 a case class could contain only up to 22 constructor params.
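A rough sketch of this approach with Scala 2 runtime reflection (all class names are placeholders; it matches constructor params by name only, skipping the type check for brevity):

import scala.reflect.runtime.universe._

// placeholder models for illustration
case class PersonModel(id: Long, name: String)
case class PackedPerson(id: Long, name: String, profilePicture: String)

def pack[T: TypeTag](sources: Seq[Any], freeParams: (String, Any)*): T = {
  val mirror = runtimeMirror(getClass.getClassLoader)

  // harvest (name -> value) pairs from every source's case accessors
  val harvested: Map[String, Any] = sources.flatMap { src =>
    val im = mirror.reflect(src)
    im.symbol.toType.members.collect {
      case m: MethodSymbol if m.isCaseAccessor =>
        m.name.toString -> im.reflectMethod(m).apply()
    }
  }.toMap ++ freeParams.toMap // free params win on name clashes

  // invoke T's primary constructor with args looked up by name
  val ctor = typeOf[T].decl(termNames.CONSTRUCTOR).asMethod
  val ctorMirror = mirror.reflectClass(typeOf[T].typeSymbol.asClass).reflectConstructor(ctor)
  val args = ctor.paramLists.flatten.map(p => harvested(p.name.toString))
  ctorMirror(args: _*).asInstanceOf[T]
}

// e.g. pack[PackedPerson](Seq(PersonModel(1L, "Ada")), "profilePicture" -> "ada.png")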
I would like to define my primary keys as specific types, not just Long or String. For example:

case class Project(
  var id: ProjectId = 0,
One advantage of this is that if I accidentally compare different kinds of keys, the compiler will pick it up.
Obviously this gives the compile error:
overriding method id in trait KeyedEntity of type => Long;
variable id has incompatible type
Are there any examples where this type of approach has been successfully implemented?
Appendix: a draft of what ProjectId could be:
trait SelfType[T] {
  val self: T
}

class Content_typeId(val self: Int) extends SelfType[Int]
class ProjectId(val self: Long) extends SelfType[Long]

object ProjectId {
  implicit def baseToType(self: Long) = new ProjectId(self)
  implicit def typeToBase(higherSelf: ProjectId): Long = higherSelf.self
}
Thanks
Brent
Yup, it can be done, but you are going to want to upgrade to Squeryl 0.9.6. The latest available is RC3 at the moment. There are two changes that you'll want to take advantage of:
You no longer need to extend KeyedEntity. Instead, you can define an implicit KeyedEntityDef that Squeryl will use to determine which field(s) of your object constitute the primary key (see the sketch after this list).
Squeryl 0.9.6 allows you to extend what types are supported using type classes.
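A minimal sketch of the first point, assuming 0.9.6's KeyedEntityDef trait (the entity and field names are made up, and using ProjectId as the key column additionally requires registering it via the type-class mechanism from the second point):

import org.squeryl.KeyedEntityDef

case class Project(id: ProjectId, name: String)

object ProjectSchema {
  // tells Squeryl which field is the primary key, without extending KeyedEntity
  implicit val projectKED: KeyedEntityDef[Project, ProjectId] =
    new KeyedEntityDef[Project, ProjectId] {
      def getId(p: Project) = p.id
      def isPersisted(p: Project) = p.id.self > 0
      def idPropertyName = "id"
    }
}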
RC3 is very stable, I'm using it myself in several production projects, but there is no official documentation for these features yet. You can find examples on the mailing list, where I see you've also posted this question.
I'd also suggest looking at both PrimitiveTypeMode (which shows the process of exposing a TEF for a type) and PrimitiveTypeSupport (which is where the TypedExpressionFactory instances are defined). KeyedEntity itself is supported with a KeyedEntityDef by Squeryl, and looking at that code may be helpful as well.
I'm trying to make an EnumListField in Lift/Record/Squeryl, similar to MappedEnumList in LiftMapper. The storage type should be Long/BIGINT. I understand that if I define:
def classOfPersistentField = classOf[Long]
Then Squeryl will know it should create a BIGINT column. And I know it uses setFromAny() to set the value, passing in the Long. The one piece I don't get is:
How will it read the field's value? If it uses valueBox, it will get a Seq[Enum#Value], and it won't know how to turn that into a Long.
How do I tell Squeryl to convert my Seq[Enum#Value] to a Long, or define a "getter" that returns a Long, and that doesn't conflict with the "normal" getter(s)?
You are implementing your validation logic incorrectly. The correct way to validate a Record field is to override
def validations: List[ValidationFunction]
where ValidationFunction is a type alias
type ValidationFunction = ValueType => List[FieldError]
and in your case ValueType == String.
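For example, a minimal sketch with a Lift Record StringField (the record, field, and message are made up):

import net.liftweb.http.FieldError
import net.liftweb.record.{MetaRecord, Record}
import net.liftweb.record.field.StringField
import scala.xml.Text

class Person private () extends Record[Person] {
  def meta = Person

  object name extends StringField(this, 100) {
    // a ValidationFunction here is a String => List[FieldError]
    private def nonEmpty(s: String): List[FieldError] =
      if (s.trim.isEmpty) List(FieldError(this, Text("Name must not be empty")))
      else Nil

    override def validations = nonEmpty _ :: super.validations
  }
}

object Person extends Person with MetaRecord[Person]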
The next issue is your Domain trait. Because your call to validate is inlined into the class definition, it will be called when your field is constructed.