Scala: create random objects for tests in a functional style

I am trying to figure out what would be the most functional style for this situation.
I have an Image model:
case class Image(
id: Int,
name: String,
title: String,
permalink: String,
url: String
)
I have a TestHelper object that helps me when I write tests, because it allows me to create random Image objects:
package utils
import models.Image
import scala.util.Random
object TestHelper {
val random = new Random()
def randomId = random.nextInt(Integer.MAX_VALUE)
val nameList: List[String] = List("Joycelyn", "Shaunte", "Aurelio", "Jeane", "Carline", "Major", "Shawanna", "Hayden", "Benjamin", "Roxy", "Ardelia", "Yanira", "Tilda", "Claude", "Jonah", "Ilse", "Kieth", "Elmira", "Reid", "Bethann", "Catherine", "Yasuko", "Kia", "Merri", "Ethelyn", "Mallory", "Eustolia", "Matt", "Lynelle", "Christi", "Alane", "Miles", "Ressie", "Darryl", "Kathy", "Hiedi", "Kacy", "Cecila", "Tamar", "Dwayne", "Charlette", "Wanetta", "Sonja", "Celine", "Vina", "Teresa", "Dagny", "Debera", "Doreatha", "Wilda")
def randomImage: Image = {
var id = randomId
var name = nameList(random.nextInt(nameList.length))
var title = name
var permalink = name.toLowerCase
var logoUrl = s"https://www.images.com/${permalink}"
Image(id, name, title, permalink, logoUrl)
}
}
But I know that if I want to write in a functional style I should avoid using var. If I didn't need the name value several times, it would be enough to replace all the vars with defs, but since I need to reuse it, I am not sure how to write this in a functional style.

Take a look at one of our libs (shameless disclaimer):
util-samplers
https://github.com/outworkers/util/blob/develop/util-samplers
It uses macros to navigate the structure of your case classes and generate appropriate samples. It's not a magic bullet but it will deal with most things most of the time, and it will also generate meaningful data wherever possible.
E.g. if the field is called name, you will get a "Peter Smith"-style result. It's also fully compatible with ScalaCheck, but overall pretty basic, with a very simple macro. Its simplicity is guaranteed by having had me write it.
val imageGenerator = Sample.generator[Image]
implicit val imageArb = Sample.arbitrary[Image]
And you can plug that implicit straight into your property checks.
forAll { img: Image =>
  // ...
}
If you don't want ScalaCheck at all, just use the basics:
import com.outworkers.util.samplers._
import org.scalatest.FlatSpec

class MyTest extends FlatSpec {
  it should "upload an image to S3" in {
    val image = gen[Image]
    val images = genList[Image](25)
  }
}
If you cannot generate a type or the macro complains, simply write a sampler yourself. In most instances, you'd have something like a trait or object to hold all of them.
object ExtraSamples {
  implicit val cantAutomateThis: Sample[java.net.Bla] = new Sample[java.net.Bla] {
    override def sample: java.net.Bla = ??? // fill this in manually
  }
}
Then if you have a case class with a java.net.Bla field, you simply import ExtraSamples._ in the places where you call gen, and your manual implementation will be used when constructing the more complex types that contain it. That's how you can support anything not supported out of the box.
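As a concrete sketch of that workflow (java.net.InetAddress is just an assumed example of a type the macro can't build on its own):
import com.outworkers.util.samplers._

object ExtraSamples {
  // assuming java.net.InetAddress is not covered out of the box
  implicit val inetSample: Sample[java.net.InetAddress] = new Sample[java.net.InetAddress] {
    override def sample: java.net.InetAddress = java.net.InetAddress.getLoopbackAddress
  }
}

import ExtraSamples._

// hypothetical model containing the manually-sampled type
case class Server(host: java.net.InetAddress, image: Image)

val server = gen[Server] // the host field is filled in by ExtraSamples.inetSample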
scalacheck-shapeless
This is a different take on the same problem but instead of macros it uses automated typeclass instance derivation capabilities from shapeless. It's not wildly different in its approach from util-samplers, but the code might be slightly more complex, yet higher level.
https://github.com/alexarchambault/scalacheck-shapeless
import org.scalacheck.Arbitrary
import org.scalacheck.ScalacheckShapeless._
// If you defined:
// case class Foo(i: Int, s: String, blah: Boolean)
// case class Bar(foo: Foo, other: String)
// sealed trait Base
// case class BaseIntString(i: Int, s: String) extends Base
// case class BaseDoubleBoolean(d: Double, b: Boolean) extends Base
// then you can now do
implicitly[Arbitrary[Foo]]
implicitly[Arbitrary[Bar]]
implicitly[Arbitrary[Base]]
I've never done a side-by-side comparison, and they are not intended to compete with each other. The first one is extremely fast and lightweight, with minimal overhead, as it's just one macro; the shapeless one is more involved and comes with much higher compilation times, but it's likely more advanced in terms of what types it can auto-generate.

You can use ScalaCheck for this. ScalaCheck is a port of QuickCheck, a library from the functional language Haskell, and it lets you write random test-data generators in a functional style.
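For example, a generator for the Image above could look like this (a sketch that reuses the nameList from the question's TestHelper):
import org.scalacheck.Gen

val imageGen: Gen[Image] = for {
  id   <- Gen.posNum[Int]
  name <- Gen.oneOf(nameList)
  permalink = name.toLowerCase
} yield Image(id, name, name, permalink, s"https://www.images.com/$permalink")

// imageGen.sample returns an Option[Image]; ScalaCheck's forAll can also draw from it directly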

In this particular case, you can simply replace all the local vars with vals, because you are not mutating them anyway.
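For example:
def randomImage: Image = {
  val id = randomId
  val name = nameList(random.nextInt(nameList.length))
  val title = name
  val permalink = name.toLowerCase
  val logoUrl = s"https://www.images.com/$permalink"
  Image(id, name, title, permalink, logoUrl)
}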

Related

How to convert between two case classes with `mostly the same` fields using Scala Shapeless

Here I have two case classes which have mostly the same fields.
final case class Id(id: String) // Param Class
final case class Age(id: Id, age: Int) // Param Class
final case class A(id: Id, data: Map[String, Any], age: Age) extends Presentable[A, APre] // Main Class 1
final case class APre(id: String, data: Map[String, Any], age: Int) // Main Class 2
Here A and APre are my main classes.
Now I want to convert between this two class using Shapeless, so I write the following pseudo function:
trait Presentable[E, P] {
def makePresentation[ET <: HList, PT <: HList](entity: E)(func : ET => PT)(implicit entGen: LabelledGeneric.Aux[E, ET], preGen: LabelledGeneric.Aux[P, PT]): P = {
val entList = entGen.to(entity)
preGen.from(func(entList))
}
}
Here func is a mapper mapping the HList of A to HList of APre (or vice versa).
And I want to use the function like this:
val age = Age(Id("age_1"), 18)
val a = A(Id("id"), Map("tag1" -> "value1", "tag2" -> "value2"), age)
val pre = a.makePresentation { entList =>
entList.updateWith('id)((id: Id) => id.id).updateWith('age)((a: Age) => a.age)
}
Here I can supply the mapping function myself, so I can convert any two case classes.
So questions are:
1. How can I convert these two classes using shapeless?
2. In fact, I have tons of pairs of classes like A and APre, so I want a trait that extracts this conversion function generically. How do I write this function?
Disclaimer: I'm one of chimney's authors.
In earlier releases of chimney we implemented exactly what you are asking about - conversion between mostly identical case classes using shapeless.
I wouldn't recommend writing it by hand, as there are some corner cases to consider (providing values the target object needs but the source is missing, transforming fields that changed type or name, Java Beans, value classes, etc.), and then you have to come up with how to configure it. So if you need a shapeless-based solution, look at the code from 0.1.10.
However, since 0.2.0 we have rewritten the implementation using macros, because with bigger case classes (e.g. 12 fields or more) some derivations could take several minutes(!) to compile, with no hope of improvement unless we dropped some of the cases we support.
If you're just looking for a way to handle your transformations, use the newest chimney and call it a day.
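For reference, with a recent chimney version the A -> APre conversion from the question would look roughly like this (a sketch; the two withFieldComputed calls mirror the updateWith calls above):
import io.scalaland.chimney.dsl._

val age = Age(Id("age_1"), 18)
val a = A(Id("id"), Map("tag1" -> "value1", "tag2" -> "value2"), age)

val pre: APre = a.into[APre]
  .withFieldComputed(_.id, _.id.id)    // Id -> String
  .withFieldComputed(_.age, _.age.age) // Age -> Int
  .transform
// the data field has the same name and type on both sides, so it is copied automatically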

Type-safe generic case class updates in Scala

I'm attempting to write some code that tracks changes to a record and applies them at a later date. In a dynamic language I'd do this by simply keeping a log of List[(String, Any)] pairs, and then simply applying these as an update to the original record when I finally decide to commit the changes.
I need to be able to introspect over the updates, so a list of update functions isn't appropriate.
In Scala this is fairly trivial using reflection, however I'd like to implement a type-safe version.
My first attempt was to try with shapeless. This works well if we know specific types.
import shapeless._
import record._
import syntax.singleton._
case class Person(name:String, age:Int)
val bob = Person("Bob", 31)
val gen = LabelledGeneric[Person]
val updated = gen.from( gen.to(bob) + ('age ->> 32) )
// Result: Person("Bob", 32)
However I can't figure out how to make this work generically.
trait Record[T] {
  def update( ??? ): T
}
Given the way shapeless handles this, I'm not sure if this would even be possible?
If I accept a lot of boilerplate, as a poor man's version I could do something along the lines of the following.
object Contact {
sealed trait Field[T]
case object Name extends Field[String]
case object Age extends Field[Int]
}
// A typeclass would be cleaner, but too verbose for this simple example.
case class Contact(...) extends Record[Contact, Contact.Field] {
def update[T]( field:Contact.Field[T], value:T ) = field match {
case Contact.Name => contact.copy( name = value )
case Contact.Age => contact.copy( age = value )
}
}
However, this isn't particularly elegant and requires a lot of boilerplate. I could probably write my own macro to handle this, but it seems like a fairly common need - is there already a way to handle this with Shapeless or a similar macro library?
How about using the whole instance of the class as an update?
case class Contact(name: String, age: Int)
case class ContactUpdate(name: Option[String] = None, age: Option[Int] = None)
object Contact {
  def update(target: Contact, delta: ContactUpdate): Contact = Contact(
    delta.name.getOrElse(target.name),
    delta.age.getOrElse(target.age)
  )
}
// also, optionally this:
object ContactUpdate {
  def apply(name: String): ContactUpdate = ContactUpdate(name = Option(name))
  def apply(age: Int): ContactUpdate = ContactUpdate(age = Option(age))
}
I think, if you want a really type-safe solution, this is the cleanest and most readable, and also possibly the least painful to implement, as you don't need to deal with Records, lenses and individual field descriptors: ContactUpdate(name = "foo") creates an update, and folding the updates over the target, e.g. updates.foldLeft(target)(Contact.update), applies them all in sequence.
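A quick sketch of applying a list of updates in sequence with this approach:
val target = Contact("Russell", 30)
val updates = List(
  ContactUpdate(name = Some("Russ")),
  ContactUpdate(age = Some(31))
)

// fold the deltas over the original contact, applying them one after another
val result = updates.foldLeft(target)(Contact.update)
// result == Contact("Russ", 31)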

Custom Scala enum, most elegant version searched

For a project of mine I have implemented an Enum based upon
trait Enum[A] {
trait Value { self: A =>
_values :+= this
}
private var _values = List.empty[A]
def values = _values
}
sealed trait Currency extends Currency.Value
object Currency extends Enum[Currency] {
case object EUR extends Currency
case object GBP extends Currency
}
from Case objects vs Enumerations in Scala. It worked quite nicely, until I ran into the following problem: case objects are initialized lazily, so if I use Currency.values I might actually get an empty List. It would have been possible to touch all enum values on startup so that the values list would be populated, but that would rather defeat the point.
So I ventured into the dark and unknown places of Scala reflection and came up with this solution, based upon the following SO answers: Can I get a compile-time list of all of the case objects which derive from a sealed parent in Scala?
and How can I get the actual object referred to by Scala 2.10 reflection?
import scala.reflect.runtime.universe._
abstract class Enum[A: TypeTag] {
trait Value
private def sealedDescendants: Option[Set[Symbol]] = {
val symbol = typeOf[A].typeSymbol
val internal = symbol.asInstanceOf[scala.reflect.internal.Symbols#Symbol]
if (internal.isSealed)
Some(internal.sealedDescendants.map(_.asInstanceOf[Symbol]) - symbol)
else None
}
def values = (sealedDescendants getOrElse Set.empty).map(
symbol => symbol.owner.typeSignature.member(symbol.name.toTermName)).map(
module => reflect.runtime.currentMirror.reflectModule(module.asModule).instance).map(
obj => obj.asInstanceOf[A]
)
}
The amazing part is that it actually works, but it is ugly as hell, and I would be interested to know whether it is possible to make this simpler and more elegant, and to get rid of the asInstanceOf calls.
Here is a simple macro based implementation:
import scala.language.experimental.macros
import scala.reflect.macros.blackbox
abstract class Enum[E] {
def values: Seq[E] = macro Enum.caseObjectsSeqImpl[E]
}
object Enum {
def caseObjectsSeqImpl[A: c.WeakTypeTag](c: blackbox.Context) = {
import c.universe._
val typeSymbol = weakTypeOf[A].typeSymbol.asClass
require(typeSymbol.isSealed)
val subclasses = typeSymbol.knownDirectSubclasses
.filter(_.asClass.isCaseClass)
.map(s => Ident(s.companion))
.toList
val seqTSymbol = weakTypeOf[Seq[A]].typeSymbol.companion
c.Expr(Apply(Ident(seqTSymbol), subclasses))
}
}
With this you could then write:
sealed trait Currency
object Currency extends Enum[Currency] {
case object USD extends Currency
case object EUR extends Currency
}
so then
Currency.values == Seq(Currency.USD, Currency.EUR)
Since it's a macro, the Seq(Currency.USD, Currency.EUR) is generated at compile time, rather than runtime. Note, though, that since it's a macro, the definition of the class Enum must be in a separate project from where it is used (i.e. the concrete subclasses of Enum like Currency). This is a relatively simple implementation; you could do more complicated things like traverse multilevel class hierarchies to find more case objects at the cost of greater complexity, but hopefully this will get you started.
A late answer, but anyways...
As wallnuss said, knownDirectSubclasses is unreliable as of writing and has been for quite some time.
I created a small lib called Enumeratum (https://github.com/lloydmeta/enumeratum) that allows you to use case objects as enums in a similar way, but doesn't use knownDirectSubclasses and instead looks at the body that encloses the method call to find subclasses. It has proved to be reliable thus far.
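With Enumeratum the Currency example looks roughly like this (a sketch against the current Enumeratum API):
import enumeratum._

sealed trait Currency extends EnumEntry

object Currency extends Enum[Currency] {
  // findValues is a macro that collects the case objects declared in this body
  val values = findValues

  case object EUR extends Currency
  case object GBP extends Currency
}

Currency.values          // IndexedSeq(EUR, GBP)
Currency.withName("EUR") // EUR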
The article "“You don’t need a macro” Except when you do" by Max Afonov (maxaf) describes a nice way to use a macro for defining enums.
The end result of that implementation is visible at github.com/maxaf/numerato
Simply create a plain class, annotate it with @enum, and use the familiar val ... = Value declaration to define a few enum values.
The @enum annotation invokes a macro, which will:
Replace your Status class with a sealed Status class suitable for acting as a base type for enum values. Specifically, it'll grow a (val index: Int, val name: String) constructor. These parameters will be supplied by the macro, so you don't have to worry about it.
Generate a Status companion object, which will contain most of the pieces that now make Status an enumeration. This includes a values: List[Status], plus lookup methods.
Given the above Status enum, here's what the generated code looks like:
scala> @enum(debug = true) class Status {
| val Enabled, Disabled = Value
| }
{
sealed abstract class Status(val index: Int, val name: String)(implicit sealant: Status.Sealant);
object Status {
@scala.annotation.implicitNotFound(msg = "Enum types annotated with ".+("@enum can not be extended directly. To add another value to the enum, ").+("please adjust your `def ... = Value` declaration.")) sealed abstract protected class Sealant;
implicit protected object Sealant extends Sealant;
case object Enabled extends Status(0, "Enabled") with scala.Product with scala.Serializable;
case object Disabled extends Status(1, "Disabled") with scala.Product with scala.Serializable;
val values: List[Status] = List(Enabled, Disabled);
val fromIndex: _root_.scala.Function1[Int, Status] = Map(Enabled.index.->(Enabled), Disabled.index.->(Disabled));
val fromName: _root_.scala.Function1[String, Status] = Map(Enabled.name.->(Enabled), Disabled.name.->(Disabled));
def switch[A](pf: PartialFunction[Status, A]): _root_.scala.Function1[Status, A] = macro numerato.SwitchMacros.switch_impl[Status, A]
};
()
}
defined class Status
defined object Status
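Given the generated companion above, usage would be along these lines:
Status.values              // List(Enabled, Disabled)
Status.fromName("Enabled") // Status.Enabled
Status.fromIndex(1)        // Status.Disabled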

How to model schema.org in Scala?

Schema.org is a markup vocabulary (for the web) that defines a number of types in terms of properties (no methods). I am currently trying to model parts of that schema in Scala as internal model classes to be used in conjunction with a document-oriented database (MongoDB) and a web framework.
As can be seen in the definition of LocalBusiness, schema.org uses multiple inheritance to also include properties from the "Place" type. So my question is: How would you model such a schema in Scala?
I have come up with two solutions so far. The first one uses regular classes to model a single inheritance tree and uses traits to mix in those additional properties.
trait ThingA {
var name: String = ""
var url: String = ""
}
trait OrganizationA {
var email: String = ""
}
trait PlaceA {
var x: String = ""
var y: String = ""
}
trait LocalBusinessA {
var priceRange: String = ""
}
class OrganizationClassA extends ThingA with OrganizationA {}
class LocalBusinessClassA extends OrganizationClassA with PlaceA with LocalBusinessA {}
The second version tries to use case classes. However, since case class inheritance is deprecated, I cannot model the main hierarchy so easily.
trait ThingB {
val name: String
}
trait OrganizationB {
val email: String
}
trait PlaceB {
val x: String
val y: String
}
trait LocalBusinessB {
val priceRange: String
}
case class OrganizationClassB(val name: String, val email: String) extends ThingB with OrganizationB
case class LocalBusinessClassB(val name: String, val email: String, val x: String, val y: String, val priceRange: String) extends ThingB with OrganizationB with PlaceB with LocalBusinessB
Is there a better way to model this? I could use composition similar to
case class LocalBusinessClassC(val thing:ThingClass, val place: PlaceClass, ...)
but then of course, LocalBusiness cannot be used when a "Place" is expected, for example when I try to render something on Google Maps.
What works best for you depends greatly on how you want to map your objects to the underlying datastore.
Given the need for multiple inheritance, an approach that might be worth considering would be to just use traits. This gives you multiple inheritance with the least amount of code duplication or boilerplate.
trait Thing {
val name: String // required
val url: Option[String] = None // reasonable default
}
trait Organization extends Thing {
val email: Option[String] = None
}
trait Place extends Thing {
val x: String
val y: String
}
trait LocalBusiness extends Organization with Place {
val priceRange: String
}
Note that Organization extends Thing, as does Place, just as in schema.org.
To instantiate them, you create anonymous inner classes that specify the values of all attributes.
object UseIt extends App {
val home = new Place {
val name = "Home"
val x = "-86.586104"
val y = "34.730369"
}
val oz = new Place {
val name = "Oz"
val x = "151.206890"
val y = "-33.873651"
}
val paulis = new LocalBusiness {
val name = "Pauli's"
override val url = Some("http://www.paulisbarandgrill.com/")
val x = "-86.713660"
val y = "34.755092"
val priceRange = "$$$"
}
}
If any fields have a reasonable default value, you can specify the default value in the trait.
Fields without a value could be left as empty strings, but it makes more sense to model optional fields as Option[String], to better indicate that their value may not be set. You liked using Option, so I'm using Option.
The downside of this approach is that the compiler generates an anonymous inner class every place you instantiate one of the traits. This could give you an explosion of .class files. More importantly, though, it means that different instances of the same trait will have different types.
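A small illustration of that last point, using the home and oz values from above:
home.getClass == oz.getClass // false: each `new Place { ... }` gets its own anonymous class
home.isInstanceOf[Place]     // true: both still conform to the Place trait
oz.isInstanceOf[Place]       // true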
Edit:
In regards to how you would use this to load objects from the database, that depends greatly on how you access your database. If you use an object mapper, you'll want to structure your model objects in the way that the mapper expects them to be structured. If this sort of trick works with your object mapper, I'll be surprised.
If you're writing your own data access layer, then you can simply use a DAO or repository pattern for data access, putting the logic to build the anonymous inner classes in there.
This is just one way to structure these objects. It's not even the best way, but it demonstrates the point.
trait Database {
// treats objects as simple key/value pairs
def findObject(id: String): Option[Map[String, String]]
}
class ThingRepo(db: Database) {
  def findThing(id: String): Option[Thing] = {
    // Note that in this way, malformed objects (i.e. missing name) simply
    // return None. Logging or other responses for malformed objects is left
    // as an exercise :-)
    for {
      fields <- db.findObject(id)     // load object from database
      thingName <- fields.get("name") // extract required field
    } yield {
      new Thing {
        val name = thingName
        override val url = fields.get("url")
      }
    }
  }
}
There's a bit more to it than that (how you identify objects, how you store them in the database, how you wire up repository, how you'll handle polymorphic queries, etc.). But this should be a good start.

Define custom serialization with Casbah / Salat - or delegate serialization to member?

I'm in the process of learning Scala for a new project, having come from Rails. I've defined a type that is going to be used in a number of my models, which can basically be thought of as a collection of 'attributes'. It's basically just a wrapper for a hashmap that delegates most of its responsibilities to it:
case class Description(attributes: Map[String, String]) {
override def hashCode: Int = attributes.hashCode
override def equals(other: Any) = other match {
case that: Description => this.attributes == that.attributes
case _ => false
}
}
So I would then define a model class with a Description, something like:
case class Person(val name: String, val description: Description)
However, when I persist a Person with a SalatDAO I end up with a document that looks like this:
{
name : "Russell",
description:
{
attributes:
{
hair: "brown",
favourite_color: "blue"
}
}
}
When in actual fact I don't need the nesting of the attributes tag in the description tag - what I actually want is this:
{
name : "Russell",
description:
{
hair: "brown",
favourite_color: "blue"
}
}
I haven't tried, but I reckon I could get that to work if I made Description extend a Map rather than contain one, but I'd rather not, because a Description isn't a type of Map, it's something which has some of the behaviour of a Map as well as other behaviour of its own I'm going to add later. Composition over inheritance and so on.
So my question is, how can I tell Salat (or Casbah, I'm actually a bit unclear as to which is doing the conversion as I've only just started using them) how to serialize and deserialize the Description class? In the casbah tutorial here it says:
It is also possible to create your own custom type serializers and
deserializers. See Custom Serializers and Deserializers.
But this page doesn't seem to exist. Or am I going about it the wrong way? Is there actually a really simple way to indicate this is what I want to happen, an annotation or something? Or can I simply delegate the serialization to the attributes map in some way?
EDIT: After having a look at the source for the JodaTime conversion helper I've tried the following but have had no luck getting it to work yet:
import org.bson.{ BSON, Transformer }
import com.mongodb.casbah.commons.conversions.MongoConversionHelper
object RegisterCustomConversionHelpers extends Serializers
with Deserializers {
def apply() = {
super.register()
}
}
trait Serializers extends MongoConversionHelper
with DescriptionSerializer {
override def register() = {
super.register()
}
override def unregister() = {
super.unregister()
}
}
trait Deserializers extends MongoConversionHelper {
override def register() = {
super.register()
}
override def unregister() = {
super.unregister()
}
}
trait DescriptionSerializer extends MongoConversionHelper {
private val transformer = new Transformer {
def transform(o: AnyRef): AnyRef = o match {
case d: Description => d.attributes.asInstanceOf[AnyRef]
case _ => o
}
}
override def register() = {
BSON.addEncodingHook(classOf[Description], transformer)
super.register()
}
}
When I call RegisterCustomConversionHelpers() then save a Person I don't get any errors, it just has no effect, saving the document the same way as ever. This also seems like quite a lot to have to do for what I want.
Salat maintainer here.
I don't understand the value of Description as a wrapper here. It wraps a map of attributes, overrides the default equals and hashcode impl of a case class - which seems unnecessary since the impl is delegated to the map anyhow and that is exactly what the case class does anyway - and introduces an additional layer of indirection to the serialized object.
Have you considered just:
case class Person(val name: String, val description: Map[String, String])
This will do exactly what you want out of box.
In another situation I would recommend a simple type alias but unfortunately Salat can't support type aliases right now due to some issues with how they are depicted in pickled Scala signatures.
(You probably omitted this from your example from brevity, but it is best practice for your Mongo model to have an _id field - if you don't, the Mongo Java driver will supply one for you)
There is a working example of a custom BSON hook in the salat-core test package (it handles java.net.URL). It could be that your hook is not working simply because you are not registering it in the right place? But still, I would recommend getting rid of Description unless it is adding some value that is not evident from your example above.
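Putting those two recommendations together (a flat map plus an _id), the model might look like this (a sketch; the ObjectId default is just the usual convention):
import org.bson.types.ObjectId

case class Person(
  _id: ObjectId = new ObjectId,
  name: String,
  description: Map[String, String]
)

// construct with named arguments so the default _id kicks in:
// Person(name = "Russell", description = Map("hair" -> "brown"))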
Based on @prasinous's answer I decided this wasn't going to be that easy, so I've changed my design a bit to the following, which pretty much gets me what I want. Rather than persisting the Description as a field, I persist a vanilla map and then mix a Described trait into the model classes I want to have a description, which automatically converts the map to a Description when the object is created. I'd appreciate it if anyone can point out any obvious problems with this approach, or any suggestions for improvement.
class Description(val attributes: Map[String, String]){
//rest of class omitted
}
trait Described {
val attributes: Map[String, String]
val description = new Description(attributes)
}
case class Person(name: String, attributes: Map[String, String]) extends Described
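Usage then looks like this (a small sketch; the attribute keys are just examples):
val person = Person("Russell", Map("hair" -> "brown", "favourite_color" -> "blue"))

person.description.attributes("hair") // "brown"
// only name and attributes are constructor fields, so (assuming Salat serializes the
// case class constructor fields, as it does by default) the stored document stays flat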