How to create a String primary key in Lift's Mapper ORM?

How can we create a mapper with a String as its primary key in Lift's Mapper ORM?

As far as I know, this should work:
class StringCodes extends KeyedMapper[String, StringCodes] {
  def getSingleton = StringCodes
  def primaryKeyField = strCd

  object strCd extends MappedStringIndex(this, 5) {
    override def writePermission_? = true   // keep this true if you want to set the key from your code
    override def dbAutogenerated_? = false  // the database does not generate this key
    override def dbNotNull_? = true
    override def dbColumnName = "str_cd"
  }
  ....

From Lift documentation here:
Naturally Mapper also supports String primary keys, though your model class and companion object will need to mixin different traits and you’ll need to have a MappedStringIndex field.
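For completeness, a minimal sketch of the companion object that quote refers to, assuming KeyedMetaMapper is the trait in question (the table name override is illustrative):

object StringCodes extends StringCodes with KeyedMetaMapper[String, StringCodes] {
  override def dbTableName = "string_codes" // illustrative; defaults to the class name
}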

Related

Ignite Annotations in Scala.Dynamic

Currently I'm using GridGain/Ignite in my project and have run into some problems:
As you may know, GridGain can hold any serializable object in Cache, like this:
val mycache = ignite.getOrCreateCache[String,MyClass]("MyName")
This means that we can define our own class and extend it with the Dynamic trait - that's OK.
If we put the Ignite annotation (@QuerySqlField) on a specific class field, Ignite can run SQL queries against our classes, like this:
val sql = "select * from MyClass"
mycache.query(new SqlFieldsQuery(sql))
And now my question:
How can I set Ignite annotations on dynamic fields in dynamic classes in Scala? I've attached my dynamic class definition and hope for some help.
import scala.collection.mutable
import scala.language.dynamics

class DynamicType extends Dynamic with Serializable {
  // Backing store for all dynamically-added fields.
  private val fields = mutable.Map.empty[String, Any].withDefault { key => throw new NoSuchFieldError(key) }

  def selectDynamic(key: String) = fields(key)
  def updateDynamic(key: String)(value: Any) = fields(key) = value
  def applyDynamic(key: String)(args: Any*) = fields(key)
}
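For reference, instances are used through Scala's Dynamic hooks, along these lines:

val obj = new DynamicType
obj.name = "test"   // calls updateDynamic("name")("test")
println(obj.name)   // calls selectDynamic("name") and prints "test"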
As I understand it, your dynamic type implementation is essentially just a map of fields. In that case Ignite will serialize that map as a DynamicType instance field, so it's like any other object with a field of Map type. A Map's key/value pairs can't be annotated and can't be indexed by Ignite.
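The usual workaround is a statically-typed class whose fields carry the annotations, so Ignite can see and index them. A minimal sketch (the class and field names are illustrative):

import scala.annotation.meta.field
import org.apache.ignite.cache.query.annotations.QuerySqlField

// The (QuerySqlField @field) meta-annotation places the annotation on the
// underlying JVM field rather than on the generated getter, which is where
// Ignite looks for it.
case class MyClass(
  @(QuerySqlField @field)(index = true) id: String,
  @(QuerySqlField @field) payload: String
) extends Serializable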

How to override the toString() method on JS objects to use JSON.stringify()?

I'm tired of writing s"blah: ${JSON.stringify(target)}" when I deal with my DTO objects; I just want to write s"blah: $target".
My DTOs look like:
@js.native
trait AuthConnectionDetails extends js.Object {
  def clientId: String = js.native
  def accountHostname: String = js.native
}
These DTOs are used to parse the content of some REST API calls, like:
val response = js.JSON.parse(xmlHttpRequest.responseText).asInstanceOf[AuthConnectionDetails]
I don't mind changing how I define my DTO objects to do this (maybe I should be using case classes for my DTOs or something, instead of native js traits?), but I can't figure out how to do it.
I tried writing a trait that I could mix in, but that didn't work; I also tried writing an implicit extension method, but that didn't work either.
My implicit code that didn't seem to work for toString:
object JsonToString {
  implicit class JsObjectExtensions(val target: js.Object) extends AnyVal {
    override def toString: String = JSON.stringify(target)
    def json: String = JSON.stringify(target)
  }
}
So I can write s"blah: ${target.json}", which is better, but I'd especially like to get rid of those braces.
Is there any way to do this with scala.js?
No, there is no way to do this. That's because string interpolation will always use the toString() method of the object itself, no matter what is declared in its types or in implicit classes (this is a Scala thing in general).
The only way you could achieve this would be to actually modify the objects by patching them up with a custom toString() method every time you create one. That would include when you parse them from a JSON string. I'm pretty sure that would be worse than calling .json when you stringify them.
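For completeness, that patching approach would look roughly like the sketch below; withJsonToString is a hypothetical helper, not a Scala.js API:

import scala.scalajs.js

// Mutates the object so its JS-level toString() produces JSON; string
// interpolation on a js.Object delegates to that JS method.
def withJsonToString[A <: js.Object](obj: A): A = {
  val f: js.Function0[String] = () => js.JSON.stringify(obj)
  obj.asInstanceOf[js.Dynamic].updateDynamic("toString")(f)
  obj
}

// usage: val details = withJsonToString(response)
// s"details: $details" would then print the JSON form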
If you really want to, you could write a custom string interpolator:
implicit class JsonHelper(private val sc: StringContext) extends AnyVal {
  def dejson(args: Any*): String = {
    sc.checkLengths(args)
    sc.s(args.map(jsonify): _*)
  }

  private def jsonify(arg: Any): String = arg match {
    case obj: js.Object => JSON.stringify(obj)
    case _ => arg.toString
  }
}
You can now use it like this:
dejson"hello: $target, world: $target2"

How to model a nested class with the Phantom Cassandra driver

I have a case class that has a number of nested classes.
How do I model this with Phantom DSL?
Putting it all into one case class is not an option.
For example:
case class Car(age: Int, size: Int, door: Door)
case class Door(color: String, size: Int)
Thanks
Well, when modeling things in Cassandra, you should keep in mind that it does not work like a relational database, and phantom is not a kind of Hibernate.
One important thing when modeling is to consider the queries you want to run, but let's get to the point.
Phantom does allow you to model nested classes, using JSON columns.
Consider the following:
case class JsonTest(prop1: String, prop2: String)

case class JsonClass(
  id: UUID,
  name: String,
  json: JsonTest,
  jsonList: List[JsonTest],
  jsonSet: Set[JsonTest]
)
Inside JsonClass you have three columns whose type is the JsonTest case class.
When declaring your fields, you should do something like this:
// These columns store the case class as a JSON string. JsonParser and
// Extraction come from lift-json, which needs an implicit Formats in scope,
// e.g. implicit val formats: Formats = DefaultFormats

object json extends JsonColumn[ConcreteJsonTable, JsonClass, JsonTest](this) {
  override def fromJson(obj: String): JsonTest =
    JsonParser.parse(obj).extract[JsonTest]

  override def toJson(obj: JsonTest): String =
    compactRender(Extraction.decompose(obj))
}

object jsonList extends JsonListColumn[ConcreteJsonTable, JsonClass, JsonTest](this) {
  override def fromJson(obj: String): JsonTest =
    JsonParser.parse(obj).extract[JsonTest]

  override def toJson(obj: JsonTest): String =
    compactRender(Extraction.decompose(obj))
}

object jsonSet extends JsonSetColumn[ConcreteJsonTable, JsonClass, JsonTest](this) {
  override def fromJson(obj: String): JsonTest =
    JsonParser.parse(obj).extract[JsonTest]

  override def toJson(obj: JsonTest): String =
    compactRender(Extraction.decompose(obj))
}
Basically, what phantom does is save a JSON string representation inside a text column.
source: https://github.com/outworkers/phantom/blob/develop/phantom-dsl/src/test/scala/com/websudos/phantom/tables/JsonTable.scala
You can't really do that, because phantom is not Hibernate or anything like that. You need to reference the nested class by its ID, like this:
case class Car(age: Int, size: Int, doorId: UUID)
case class Door(id: UUID, color: String, size: Int)
Then just add a function to the case class that returns the Door object by calling a getById with that ID.
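A minimal sketch of that lookup, assuming a Doors query object you define yourself (getById here is illustrative, not a phantom built-in):

import java.util.UUID
import scala.concurrent.Future

case class Door(id: UUID, color: String, size: Int)

case class Car(age: Int, size: Int, doorId: UUID) {
  // Delegates to a query defined on your Doors table,
  // e.g. select.where(_.id eqs id).one()
  def door: Future[Option[Door]] = Doors.getById(doorId)
}

object Doors {
  // Stub standing in for the actual phantom query.
  def getById(id: UUID): Future[Option[Door]] = ???
}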
Try simpledba https://github.com/doolse/simpledba
It seems to define a relational view over columnar databases.

Convenient way of inserting values to Cassandra with Phantom

Does anyone know of a convenient way of inserting values into Cassandra via phantom-dsl? Currently I'm doing this:
case class Entry(id: UUID, firstName: String, lastName: String)

sealed class EntryTable extends CassandraTable[EntryTable, Entry] {
  override val tableName = "entries"

  object id extends UUIDColumn(this) with Index[UUID]
  object firstName extends StringColumn(this)
  object lastName extends StringColumn(this)

  override def fromRow(r: dsl.Row): Entry = {
    Entry(id(r), firstName(r), lastName(r))
  }
}

object EntryTable extends EntryTable {
  private val connector = CassandraConnector.apply(Set(InetAddress.getByName("localhost")))
  implicit val keySpace = KeySpace("keyspace")

  def insert(e: Entry) = {
    connector.withSessionDo(implicit session => {
      insert().value(_.id, e.id).value(_.firstName, e.firstName).value(_.lastName, e.lastName).future()
    })
  }
}
But I would like to do:
def insert(e: Entry) = {
  connector.withSessionDo(implicit session => {
    insert().value(e).future()
  })
}
Which would be way more convenient when the case class has many fields. Any ideas?
You are using the API slightly wrong, and we are in the process of publishing multiple tutorials to make the "new" way public. In the meantime, a basic version of it is available here, and this branch in the activator template also describes everything you need to know.
Specifically, the way to insert records is described here.
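Until those tutorials are out, one pattern that works with the API shown in the question is to keep the field-by-field mapping in a single store method on the table object, so every call site gets the one-argument form. A sketch using only the pieces already present above (newer phantom releases can derive this mapping automatically; the links in the answer describe that API):

object EntryTable extends EntryTable {
  private val connector = CassandraConnector.apply(Set(InetAddress.getByName("localhost")))
  implicit val keySpace = KeySpace("keyspace")

  // The Entry-to-columns mapping lives in exactly one place.
  def store(e: Entry) = {
    connector.withSessionDo(implicit session =>
      insert()
        .value(_.id, e.id)
        .value(_.firstName, e.firstName)
        .value(_.lastName, e.lastName)
        .future()
    )
  }
}

// usage: EntryTable.store(Entry(UUID.randomUUID(), "Jane", "Doe"))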

Can I override a scala class method with a method from a trait?

class PasswordCaseClass(val password: String)

trait PasswordTrait { self: PasswordCaseClass =>
  override def password = "blue"
}

val o = new PasswordCaseClass("flowers") with PasswordTrait
Is it possible to override PasswordCaseClass's password with what is provided in PasswordTrait? Right now, I receive this error:
e.scala:6: error: overriding value password in class PasswordCaseClass of type String;
 method password in trait PasswordTrait of type => java.lang.String needs to be a stable, immutable value
val o = new PasswordCaseClass("flowers") with PasswordTrait
        ^
one error found
I would like to be able to have something like this:
class User(val password: String)

trait EncryptedPassword { u: User =>
  def password = SomeCriptographyLibrary.encrypt(u.password)
}

val u = new User("random_password") with EncryptedPassword
println(u.password) // see the encrypted version here
You can override a def with a val, but you can't do it the other way around. A val implies a guarantee -- that its value is stable and immutable -- that a def does not.
This worked for me (with some modifications):
trait PasswordLike {
  val password: String
}

class PasswordCaseClass(val password: String) extends PasswordLike

trait PasswordTrait extends PasswordLike {
  override val password: String = "blue"
}
and then:
scala> val o = new PasswordCaseClass("flowers") with PasswordTrait
o: PasswordCaseClass with PasswordTrait = $anon$1@c2ccac
scala> o.password
res1: String = blue
You are trying to override a value with a method definition. That simply makes no sense: they have different semantics. Values are supposed to be computed once per object lifecycle (and stored in a final class attribute), while methods can be evaluated many times. So what you are trying to do would break the contract of the class in a number of ways.
That said, there is also some fault on the compiler's side: the error explanation is totally unclear.
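For the encrypted-password example from the question, one variant that does compile is to make password a def in the base class and override it with a def in the trait, so the rules above are respected. A sketch (SomeCriptographyLibrary is the question's placeholder; the stand-in implementation is illustrative):

// Stand-in for the question's placeholder library.
object SomeCriptographyLibrary {
  def encrypt(s: String): String = s.reverse // illustrative only
}

class User(plain: String) {
  def password: String = plain
}

trait EncryptedPassword extends User {
  // A def may override a def; super.password reaches the base implementation.
  override def password: String = SomeCriptographyLibrary.encrypt(super.password)
}

val u = new User("random_password") with EncryptedPassword
println(u.password) // prints the "encrypted" form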