Currently I'm using GridGain/Ignite in my project and have run into some problems:
As you may know, GridGain can hold any serializable object in a cache, like this:
val mycache = ignite.getOrCreateCache[String, MyClass]("MyName")
This means we can define our own class and mix in Scala's Dynamic trait - that's OK.
If we put the Ignite annotation (@QuerySqlField) on a specific class field, Ignite can run SQL queries against our classes like this:
val sql = "select * from MyClass"
mycache.query(new SqlFieldsQuery(sql))
And now my question:
How can I set Ignite annotations on dynamic fields of dynamic classes in Scala? I've attached my dynamic class definition and hope for some help.
import scala.collection.mutable
import scala.language.dynamics

class DynamicType extends Dynamic with Serializable {
  private val fields = mutable.Map.empty[String, Any]
    .withDefault { key => throw new NoSuchFieldError(key) }

  def selectDynamic(key: String) = fields(key)
  def updateDynamic(key: String)(value: Any) = fields(key) = value
  def applyDynamic(key: String)(args: Any*) = fields(key)
}
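For reference, an instance behaves like this (a minimal sketch; the field name is made up):

val row = new DynamicType
row.name = "Alice"   // desugars to row.updateDynamic("name")("Alice")
val n = row.name     // desugars to row.selectDynamic("name"), returns Any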
As I understand it, your dynamic type implementation is essentially just a map of fields. In that case Ignite will serialize the map as a single DynamicType instance field, so it looks like any object with one field of Map type. The map's key/value pairs can't be annotated and can't be indexed by Ignite.
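If you need SQL over dynamically defined fields, one route worth sketching is to declare the fields to Ignite at cache-configuration time with QueryEntity and build the values as BinaryObjects at runtime, instead of relying on annotations. This is only a sketch; the cache name, type name, and field list are illustrative:

import java.util.{Collections, LinkedHashMap => JLinkedHashMap}
import org.apache.ignite.binary.BinaryObject
import org.apache.ignite.cache.QueryEntity
import org.apache.ignite.configuration.CacheConfiguration

// Declare the queryable fields up front, in place of @QuerySqlField annotations.
val entity = new QueryEntity(classOf[String].getName, "DynamicType")
val qryFields = new JLinkedHashMap[String, String]()
qryFields.put("name", classOf[String].getName) // hypothetical dynamic field
entity.setFields(qryFields)

val cfg = new CacheConfiguration[String, BinaryObject]("MyName")
cfg.setQueryEntities(Collections.singletonList(entity))

// Build values as BinaryObjects so each field stays individually visible to SQL.
val cache = ignite.getOrCreateCache(cfg).withKeepBinary[String, BinaryObject]()
val value = ignite.binary().builder("DynamicType").setField("name", "Alice").build()
cache.put("k1", value)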
Related
I have a class like this:
class Cache(
    tableName: String,
    TTL: Int) {
  // Creates a cache
}
I have a companion object that returns different types of caches. It has functions that require a base table name and can construct the cache.
object Cache {
  def getOpsCache(baseTableName: String): Cache =
    new Cache(s"${baseTableName}_ops", OpsTTL)

  def getSnapshotCache(baseTableName: String): Cache =
    new Cache(s"${baseTableName}_snaps", SnapshotTTL)

  def getMetadataCache(baseTableName: String): Cache =
    new Cache(s"${baseTableName}_metadata", MetadataTTL)
}
The object does a few more things, and the Cache class has more parameters, which makes it useful to have a companion object to create the different types of caches. The baseTableName parameter is the same for all of the caches. Is there a way I can pass this parameter only once and then just call the functions to get the different types of caches?
An alternative is to create a factory class, pass the baseTableName parameter to its constructor, and then call the functions. But I am wondering whether it can be done with the companion object in some way.
The simplest way is to put your factory in a case class:
case class CacheFactory(baseTableName: String) {
  lazy val getOpsCache: Cache =
    Cache(s"${baseTableName}_ops", OpsTTL)
  lazy val getSnapshotCache: Cache =
    Cache(s"${baseTableName}_snaps", SnapshotTTL)
  lazy val getMetadataCache: Cache =
    Cache(s"${baseTableName}_metadata", MetadataTTL)
}
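Client code then supplies the base table name exactly once (the values here are just for illustration):

val factory = CacheFactory("baseName")
val ops = factory.getOpsCache        // Cache("baseName_ops", OpsTTL)
val snaps = factory.getSnapshotCache // Cache("baseName_snaps", SnapshotTTL)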
Since I like case classes, I changed your Cache to a case class as well:
case class Cache(tableName: String, TTL: Int)
As you can see, I also adjusted the Java-style code (semicolons, new) to idiomatic Scala.
If you want to put it in the companion object, you could use implicits, like:
object Cache {
  def getOpsCache(implicit baseTableName: String): Cache =
    Cache(s"${baseTableName}_ops", OpsTTL)
  def getSnapshotCache(implicit baseTableName: String): Cache =
    Cache(s"${baseTableName}_snaps", SnapshotTTL)
  def getMetadataCache(implicit baseTableName: String): Cache =
    Cache(s"${baseTableName}_metadata", MetadataTTL)
}
Then your client looks like:
implicit val baseTableName: String = "baseName"
Cache.getSnapshotCache
Cache.getMetadataCache
Consider creating an algebraic data type, like so:
sealed abstract class Cache(tablePostfix: String, ttl: Int) {
val tableName = s"baseTableName_$tablePostfix"
}
case object OpsCache extends Cache("ops", 60)
case object SnapshotCache extends Cache("snaps", 120)
case object MetadataCache extends Cache("metadata", 180)
OpsCache.tableName // res0: String = baseTableName_ops
I have a simple class-level annotation written in Java:
@Target({ElementType.TYPE})
@Retention(RetentionPolicy.RUNTIME)
public @interface Collection {
    String name();
}
used like:
@Collection(name="mytable")
case class Foo(...)
I need to introspect classes in Scala 2.11 to obtain the value of the name parameter. How can I get this info? I'm up to here:
import scala.reflect.runtime.currentMirror

val sym = currentMirror.classSymbol(Class.forName(fullName)) // fullName: the fully qualified class name
val anno = sym.annotations.head
val annoType = anno.tree.tpe         // I can get this... works
println(anno.tree.children.tail)     // prints List(name = "mytable")
I'm close! I can see my name parameter and its value, but this doesn't seem to be accessible like a Map or anything friendly. How can I get the value of my annotation's parameter?
The tree API implements Product, so as a somewhat hacky demonstration you can pull your element out positionally:
println(anno.tree.children.last.productElement(1)) // prints "mytable"
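If you'd rather not rely on positional access, you can pattern match on the argument trees instead. A sketch, assuming the Java annotation arguments show up as AssignOrNamedArg nodes (as they do in Scala 2.11):

import scala.reflect.runtime.universe._

// Collect the value of the `name` argument, if present.
val nameValue: Option[String] = anno.tree.children.tail.collectFirst {
  case AssignOrNamedArg(Ident(TermName("name")), Literal(Constant(v: String))) => v
}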
If you can handle using Jackson, then I'd reuse its annotation-processing functionality instead of Scala reflection.
import com.fasterxml.jackson.databind.introspect.{AnnotatedClass, JacksonAnnotationIntrospector}

object Test {
  @Collection(name="mytable")
case class Foo(bar: String)
def main(args: Array[String]): Unit = {
val introspector = new JacksonAnnotationIntrospector
val ac = AnnotatedClass.construct(classOf[Foo], introspector, null)
val annotation = ac.getAnnotations.get(classOf[Collection])
println(annotation.name())
}
}
If the class does not have the annotation then annotation is null.
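For what it's worth, since the annotation has RUNTIME retention, plain Java reflection can read it too, with no extra dependency:

val ann = classOf[Foo].getAnnotation(classOf[Collection])
if (ann != null) println(ann.name()) // ann is null when the annotation is absent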
I just can't figure it out. What I am using right now is:
abstract class DBEnumString extends Enumeration {
implicit val enumMapper = MappedJdbcType.base[Value, String](
_.toString(),
s => this.withName(s)
)
}
And then:
object SomeEnum extends DBEnumString {
type T = Value
val A1 = Value("A1")
val A2 = Value("A2")
}
The problem is, during insert/update JDBC driver for PostgreSQL complains about parameter type being "character varying" when column type is "some_enum", which is reasonable as I am converting SomeEnum to String.
How do I tell Slick to treat String as DB-defined "enum_type"? Or how to define some other Scala-type that will map to "enum_type"?
I had similar confusion when trying to get my PostgreSQL enums to work with Slick. Slick-pg allows you to use Scala enums with your database's enums, and the test suite shows how.
Below is an example.
Say we have this enumerated type in our database.
CREATE TYPE Dog AS ENUM ('Poodle', 'Labrador');
We want to be able to map these to Scala enums, so we can use them happily with Slick. We can do this with slick-pg, an extension for Slick.
First off, we make a Scala version of the above enum.
object Dogs extends Enumeration {
type Dog = Value
val Poodle, Labrador = Value
}
To get the extra functionality from slick-pg we extend the normal PostgresDriver and say we want to map our Scala enum to the PostgreSQL one (remember to change the slick driver in application.conf to the one you've created).
import slick.driver.PostgresDriver
import com.github.tminglei.slickpg.PgEnumSupport

object MyPostgresDriver extends PostgresDriver with PgEnumSupport {
override val api = new API with MyEnumImplicits {}
trait MyEnumImplicits {
implicit val dogTypeMapper = createEnumJdbcType("Dog", Dogs)
implicit val dogListTypeMapper = createEnumListJdbcType("Dog", Dogs)
implicit val dogColumnExtensionMethodsBuilder = createEnumColumnExtensionMethodsBuilder(Dogs)
implicit val dogOptionColumnExtensionMethodsBuilder = createEnumOptionColumnExtensionMethodsBuilder(Dogs)
}
}
Now when you want to make a new model case class, simply use the corresponding Scala enum.
case class User(favouriteDog: Dog)
And when you do the whole DAO table shenanigans, again you can just use it.
class Users(tag: Tag) extends Table[User](tag, "User") {
def favouriteDog = column[Dog]("favouriteDog")
def * = favouriteDog <> (User.apply, User.unapply)
}
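With the implicits from MyPostgresDriver.api in scope, the enum column can then be used in queries like any other column. A minimal sketch (the users query is assumed):

import MyPostgresDriver.api._
import Dogs._

val users = TableQuery[Users]
// Filter on the enum column; the implicit dogTypeMapper makes === available.
val poodleFans = users.filter(_.favouriteDog === (Poodle: Dog))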
Obviously you need the Scala Dog enum in scope wherever you use it.
Due to a bug in Slick, you currently can't dynamically link to a custom Slick driver in application.conf (it should work). This means you either need to run Play Framework with start (and forgo dynamic recompiling), or create a standalone sbt project with just the custom Slick driver in it and depend on it locally.
I'm new to Scala and can't get my head around how the Lift guys implemented the Record API. However, the question is less about this API and more about Scala in general. I'm interested in how the object-in-class pattern used in Lift works.
class MainDoc private() extends MongoRecord[MainDoc] with ObjectIdPk[MainDoc] {
def meta = MainDoc
object name extends StringField(this, 12)
object cnt extends IntField(this)
}
object MainDoc extends MainDoc with MongoMetaRecord[MainDoc]
In the upper snippet you can see how a record is defined in Lift. The interesting part is that the fields are defined as objects. The API allows you to create Instances like this:
val md1 = MainDoc.createRecord
.name("md1")
.cnt(5)
.save
This is probably done by using the apply method? But at the same time you are able to get the values by doing something like this:
val name = md1.name
How does this all work? Are the objects not that static when in the scope of a class? Or are they just constructor classes for some internal representation? How is it possible to iterate over all fields; do you use reflection?
Thanks,
Otto
Otto,
You are more or less on the right track. You actually don't need to define your fields as objects; you could have written your example as:
class MainDoc private() extends MongoRecord[MainDoc] with ObjectIdPk[MainDoc] {
def meta = MainDoc
val name = new StringField(this, 12)
val cnt = new IntField(this)
}
object MainDoc extends MainDoc with MongoMetaRecord[MainDoc]
The net.liftweb.record.Field trait does contain an apply method that is the equivalent to set. That's why you can assign the fields by name after instantiating the object.
The field reference you mentioned:
val name = md1.name
Would type name as a StringField. If what you were thinking was
val name: String = md1.name
that would fail to compile (unless there was an implicit in scope to convert Field[T] => T). The proper way to retrieve the String value of the field would be:
val name = md1.name.get
Record does use reflection to gather the fields. When you define an object within a class, the compiler will create a field to hold the object instance. From the standpoint of reflection, the object appears very similar to the alternate way to define a field that I mentioned before. Each of the definitions probably creates a subclass of the field type, but that's no different than
val name = new StringField(this, 12) {
override def label: NodeSeq = <span>My String Field</span>
}
You're right about it being the apply method. Record's Field base class defines a few apply methods.
def apply(in: Box[MyType]): OwnerType
def apply(in: MyType): OwnerType
By returning the OwnerType, you can chain invocations together.
Regarding the use of object to define fields, that confused me at first, too. The object identifier defines an object within a particular scope. Even though it's convenient to think of object as a shortcut for the singleton pattern, it's more flexible than that. According to the Scala Language Spec (section 5.4):
It is roughly equivalent to the following definition of a lazy value:
lazy val m = new sc with mt1 with ... with mtn { this: m.type => stats }
<snip/>
The expansion given above is not accurate for top-level objects. It cannot be because variable and method definition cannot appear on the top-level outside of a
package object (§9.3). Instead, top-level objects are translated to static fields.
Regarding iterating over all the fields, Record objects define an allFields method which returns a List[net.liftweb.record.Field[_, MyType]].
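So listing every field of a record is a one-liner; a sketch using the name and get members that Field exposes (get was already used above):

md1.allFields.foreach { f =>
  println(s"${f.name} = ${f.get}") // field name and current value
}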
Schema.org is a markup vocabulary (for the web) that defines a number of types in terms of properties (no methods). I am currently trying to model parts of that schema in Scala as internal model classes to be used in conjunction with a document-oriented database (MongoDB) and a web framework.
As can be seen in the definition of LocalBusiness, schema.org uses multiple inheritance to also include properties from the "Place" type. So my question is: How would you model such a schema in Scala?
I have come up with two solutions so far. The first one uses regular classes to model a single-inheritance tree and uses traits to mix in the additional properties.
trait ThingA {
var name: String = ""
var url: String = ""
}
trait OrganizationA {
var email: String = ""
}
trait PlaceA {
var x: String = ""
var y: String = ""
}
trait LocalBusinessA {
var priceRange: String = ""
}
class OrganizationClassA extends ThingA with OrganizationA {}
class LocalBusinessClassA extends OrganizationClassA with PlaceA with LocalBusinessA {}
The second version tries to use case classes. However, since case class inheritance is deprecated, I cannot model the main hierarchy so easily.
trait ThingB {
val name: String
}
trait OrganizationB {
val email: String
}
trait PlaceB {
val x: String
val y: String
}
trait LocalBusinessB {
val priceRange: String
}
case class OrganizationClassB(name: String, email: String) extends ThingB with OrganizationB

case class LocalBusinessClassB(name: String, email: String, x: String, y: String, priceRange: String) extends ThingB with OrganizationB with PlaceB with LocalBusinessB
Is there a better way to model this? I could use composition similar to
case class LocalBusinessClassC(thing: ThingClass, place: PlaceClass, ...)
but then of course, LocalBusiness cannot be used when a "Place" is expected, for example when I try to render something on Google Maps.
What works best for you depends greatly on how you want to map your objects to the underlying datastore.
Given the need for multiple inheritance, an approach that might be worth considering would be to just use traits. This gives you multiple inheritance with the least amount of code duplication or boilerplate.
trait Thing {
val name: String // required
val url: Option[String] = None // reasonable default
}
trait Organization extends Thing {
val email: Option[String] = None
}
trait Place extends Thing {
val x: String
val y: String
}
trait LocalBusiness extends Organization with Place {
val priceRange: String
}
Note that Organization extends Thing, as does Place, just as in schema.org.
To instantiate them, you create anonymous inner classes that specify the values of all attributes.
object UseIt extends App {
val home = new Place {
val name = "Home"
val x = "-86.586104"
val y = "34.730369"
}
val oz = new Place {
val name = "Oz"
val x = "151.206890"
val y = "-33.873651"
}
val paulis = new LocalBusiness {
val name = "Pauli's"
override val url = Some("http://www.paulisbarandgrill.com/")
val x = "-86.713660"
val y = "34.755092"
val priceRange = "$$$"
}
}
If any fields have a reasonable default value, you can specify the default value in the trait.
Optional fields are typed as Option[String] rather than left as empty strings, to make it explicit when a value is not set.
The downside of this approach is that the compiler generates an anonymous inner class every place you instantiate one of the traits. This could give you an explosion of .class files. More importantly, though, it means that different instances of the same trait will have different types.
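If the anonymous-class explosion becomes a problem, one mitigation is to give each concrete type a single named implementation and instantiate that instead. A sketch (LocalBusinessImpl is a made-up name):

case class LocalBusinessImpl(
    name: String,
    override val url: Option[String] = None,
    x: String,
    y: String,
    priceRange: String) extends LocalBusiness

val paulis = LocalBusinessImpl("Pauli's", Some("http://www.paulisbarandgrill.com/"), "-86.713660", "34.755092", "$$$")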
Edit:
In regards to how you would use this to load objects from the database, that depends greatly on how you access your database. If you use an object mapper, you'll want to structure your model objects in the way that the mapper expects them to be structured. If this sort of trick works with your object mapper, I'll be surprised.
If you're writing your own data access layer, then you can simply use a DAO or repository pattern for data access, putting the logic to build the anonymous inner classes in there.
This is just one way to structure these objects. It's not even the best way, but it demonstrates the point.
trait Database {
// treats objects as simple key/value pairs
def findObject(id: String): Option[Map[String, String]]
}
class ThingRepo(db: Database) {
def findThing(id: String): Option[Thing] = {
// Note that in this way, malformed objects (i.e. missing name) simply
// return None. Logging or other responses for malformed objects is left
// as an exercise :-)
for {
  fields <- db.findObject(id)       // load object from database
  thingName <- fields.get("name")   // extract required field
} yield {
  new Thing {
    val name = thingName            // note: `val name = name` would be a self-reference
    override val url = fields.get("url")
  }
}
}
}
There's a bit more to it than that (how you identify objects, how you store them in the database, how you wire up repository, how you'll handle polymorphic queries, etc.). But this should be a good start.