Can I have nullable parameters in a Scala object, similar to Nullable in C# and its syntactic sugar:
public int? Age {get; set;}
public string Name {get; set;}
public bool? isAdmin {get; set;}
Can I create a Scala class where the object can have some of these attributes set while others are not?
I intend to have multiple constructors for different purposes, and I want to ensure that a null value implies the field is not set.
UPDATE
For example, I have the following case class:
case class Employee(age: Int, salary: Int, scaleNum: Int,
                    name: String, memberId: Int)
I am using GSON to serialize the class in JSON.
In some cases, however, I don't want to pass values for some of the parameters, so that the GSON serializer will only include the parameters that have a non-null value.
How can I achieve that with Options or otherwise?
A common declaration in Scala for your example would be:
case class Employee(
age: Option[Int], // For int? Age
name: String // For string Name
// your other decls ...
)
Then you can use the type easily:
val john = Employee( age = Some(10), name = "John" )
While Scala 2 allows null values for reference types (like String), this is slowly changing starting with Scala 3 and its explicit nulls (https://dotty.epfl.ch/docs/reference/other-new-features/explicit-nulls.html).
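To read such fields, prefer Option's combinators over null checks. A minimal sketch, reusing the Employee shape above (the second instance and the fallback values are illustrative):

```scala
case class Employee(age: Option[Int], name: String)

val john = Employee(age = Some(10), name = "John")
val jane = Employee(age = None, name = "Jane")

// Extract with a fallback instead of checking for null:
val johnAge = john.age.getOrElse(0) // 10
val janeAge = jane.age.getOrElse(0) // 0

// Or pattern match when both cases need handling:
val label = john.age match {
  case Some(a) => s"age $a"
  case None    => "age unknown"
}
```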
JSON support
Java libraries (like GSON) don't know anything about Scala, so you should consider using JSON libraries written for Scala:
circe
play json
jsoniter-scala
uPickle
etc
These libraries are not just aware of Option[...] in your class definitions; they also have improved support for Scala collections, implicits, default values, and other Scala language features.
It is really important to choose an appropriate library here, because with the Java JSON libs you will end up with Java-style classes and code, plus compatibility issues with other Scala libs.
With Circe, your example would be:
import io.circe._
import io.circe.syntax._
import io.circe.parser._
import io.circe.generic.auto._
val john = Employee( age = Some(10), name = "John" )
val johnAsJson = john.asJson.dropNullValues
decode[Employee]( johnAsJson.noSpaces ) match { // decode expects a String, so render the Json first
case Right(johnBack) => ??? // johnBack now is the same as john
case Left(err) => ??? // handle JSON parse exceptions
}
Null coalescing operator
Now you might be wondering where the null coalescing operator (?? in C#, ?: in Kotlin, ...) is in Scala.
The direct answer: there is none in the language itself. In Scala we work with Option (and other monadic structures, or ADTs) in an FP way.
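The closest idiomatic equivalents to ?? are Option.getOrElse (for a default value) and orElse (for a fallback Option). A small sketch (the values here are illustrative):

```scala
val nickname: Option[String] = None
val realName: Option[String] = Some("John")

// C#: nickname ?? "anonymous"
val display = nickname.getOrElse("anonymous")

// Chaining fallbacks, like a ?? b ?? c in C#:
val best = nickname.orElse(realName).getOrElse("anonymous")
```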
That means, for example:
case class Address(zip : Option[String])
case class Employee(
address: Option[Address]
)
val john = Employee( address = Some( Address( zip = Some("1111A") )) )
You should avoid this style:
if (john.address.isDefined) {
  if (john.address.get.zip.isDefined) {
    // ...
  }
}
You can use map/flatMap instead:
john.address.flatMap { address =>
address.zip.map { zip =>
// your zip and address value
???
}
}
// or if you need only zip in a short form:
john.address.flatMap(_.zip).map { zip =>
// your zip value
???
}
or the for-comprehension syntax (which desugars to the same flatMaps):
for {
address <- john.address
zip <- address.zip
}
yield {
// doing something with zip and address
???
}
The important part is that the idiomatic way to solve this in Scala is mostly based on patterns from FP.
Related
I need to check if all value is null in a Scala class (Test.class) defined as:
class Test extends Serializable {
  var a: String = _
  var b: String = _
  var c: String = _
}
I can't change this class (it is legacy code in my project).
How can I do this without a lot of ifs? The real class has 22 fields.
I've tried to use Java reflection to detect the set fields, but they are all always defined, and I can't access them directly because the fields are private (criteria.getClass.getDeclaredFields).
Thanks @Dmytro Mitin, it works for me. I think I will use this approach to count the non-null elements:
testClass.getClass.getDeclaredFields.flatMap { f =>
  f.setAccessible(true)
  Option(f.get(testClass))
}.length
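Building on that, here is a self-contained sketch of an "all fields are null" check using the same reflection trick (the class and field names are illustrative stand-ins for the legacy class):

```scala
class Test extends Serializable {
  var a: String = null
  var b: String = null
  var c: String = null
}

// True when every declared (non-synthetic) field of the instance is null.
def allFieldsNull(obj: AnyRef): Boolean =
  obj.getClass.getDeclaredFields
    .filterNot(_.isSynthetic) // skip compiler-generated fields
    .forall { f =>
      f.setAccessible(true)
      f.get(obj) == null
    }

val empty   = new Test
val partial = new Test
partial.b = "set"
```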
There are case classes which are already used in the project.
These classes are used in the Slick mapping too, and they extend some additional traits.
I don't want to generate all these classes from a *.proto description.
Is there a way to extend them in protobuf?
Or should I use wrappers for them, where the wrappers are described in *.proto and generated from it?
For proto definition
message PBPerson {
int64 id = 1;
string name = 2;
google.protobuf.StringValue phone = 3;
repeated string hobbies = 4;
}
and the Scala case class definition
case class Person(
id: Long,
name: String,
phone: Option[String],
hobbies: Seq[String])
you can use https://github.com/changvvb/scala-protobuf-java
import pbconverts.{ Protoable, Scalable }
val convertedPBPerson:PBPerson = Protoable[Person,PBPerson].toProto(person)
val convertedPerson:Person = Scalable[Person,PBPerson].toScala(pbPerson)
Additionally, this lib uses Scala macros to ensure the conversion is type-safe.
In Java, I do a lot of data integration work. One thing that comes up all the time is mapping data between multiple systems, so I'm constantly doing things like this:
public enum DataField {
    Field1("xmlField", "dbField", "system1Field");

    private String xml;
    private String db;
    private String sys;

    private DataField(String xml, String db, String sys) {
        this.xml = xml;
        this.db = db;
        this.sys = sys;
    }

    public String getXml() {
        return this.xml;
    }

    public static DataField valueOfXml(String xml) {
        for (DataField d : values()) {
            if (d.xml.equals(xml)) { return d; }
        }
        return null;
    }

    // etc.
}
What this allows me to do is put the DataField name in all my messaging and map what that field is called in each system. In my XML it may be firstname, in my database first_name, and in my external interface system it may be first. This pattern pulls all of that together nicely and makes messaging in these kinds of systems easy in a tight, type-safe way.
Now, I don't remember why Scala changed its enumeration implementation, but I remember it made sense when I read about it. The question is: what would I use in Scala to replace this design pattern? I'd hate to lose it, because it is very useful and fundamental to a lot of the systems I write on a given day.
thanks
I managed to come up with this kind of replacement for your case:
sealed class DataField(val xml: String, val db: String, val sys: String)
object DataField {
case object Field1 extends DataField("xmlField1", "dbField1", "system1Field")
case object Field2 extends DataField("xmlField2", "dbField2", "system2Field")
case object Field3 extends DataField("xmlField3", "dbField3", "system3Field")
val values = List(Field1, Field2, Field3)
def valueOfXml(xml: String) =
  values.find(_.xml == xml).get // throws NoSuchElementException if nothing matches
}
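One variant worth considering: precompute a reverse-lookup map per naming scheme and return Option, so unknown names yield None instead of an exception. A sketch following the same shape (two fields shown for brevity):

```scala
sealed class DataField(val xml: String, val db: String, val sys: String)

object DataField {
  case object Field1 extends DataField("xmlField1", "dbField1", "system1Field")
  case object Field2 extends DataField("xmlField2", "dbField2", "system2Field")

  val values = List(Field1, Field2)

  // Precomputed reverse index, one map per external naming scheme.
  private val byXml: Map[String, DataField] = values.map(v => v.xml -> v).toMap

  // Total function: unknown names yield None instead of throwing.
  def valueOfXml(xml: String): Option[DataField] = byXml.get(xml)
}
```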
The annoying thing is that we have to create the values list manually. In this case, however, we can do some macro hacking to reduce the boilerplate a bit:
import scala.language.experimental.macros
import scala.reflect.macros.Context
object Macros {
def caseObjectsFor[T]: List[T] = macro caseObjectsFor_impl[T]
def caseObjectsFor_impl[T: c.WeakTypeTag](c: Context): c.Expr[List[T]] = {
import c.universe._
val baseClassSymbol = weakTypeOf[T].typeSymbol.asClass
val caseObjectSymbols = baseClassSymbol.knownDirectSubclasses.toList.collect {
case s if s.isModuleClass && s.asClass.isCaseClass => s.asClass.module
}
val listObjectSym = typeOf[List.type].termSymbol
c.Expr[List[T]](Apply(Ident(listObjectSym), caseObjectSymbols.map(s => Ident(s))))
}
}
Then we can do this:
val values = Macros.caseObjectsFor[DataField]
instead of manually listing all the case objects.
For this to work, it is essential that the base class is declared as sealed.
You could always do what I do, and keep writing the enums in Java.
Out of 62 .java files in my source tree, 61 are enums, and the other one is a package-info.java.
I'd like to read a string as an instance of a case class. For example, if the function were named "read" it would let me do the following:
case class Person(name: String, age: Int)
val personString: String = "Person(Bob,42)"
val person: Person = read(personString)
This is the same behavior as the read typeclass in Haskell.
dflemstr's answer covers setting up the actual read method; I'll focus on the actual parsing.
My approach uses two objects that can be used in Scala's pattern matching blocks: AsInt lets you match against strings that represent Ints, and PersonString is the actual implementation of Person deserialization.
object AsInt {
def unapply(s: String) = try{ Some(s.toInt) } catch {
case e: NumberFormatException => None
}
}
val PersonRegex = "Person\\((.*),(\\d+)\\)".r
object PersonString {
def unapply(str: String): Option[Person] = str match {
case PersonRegex(name, AsInt(age)) => Some(Person(name, age))
case _ => None
}
}
The magic is in the unapply method, for which Scala has syntactic sugar. Using the PersonString object, you can do:
val person = PersonString.unapply("Person(Bob,42)")
// person will be Some(Person("Bob", 42))
or you could use a pattern matching block to do stuff with the person:
"Person(Bob,42)" match {
case PersonString(person) => println(person.name + " " + person.age)
case _ => println("Didn't get a person")
}
Scala does not have built-in type classes, and in this case you cannot even simulate the type class with an inherited trait: traits only express methods on an object, meaning that they have to be "owned" by a class, so you cannot put the definition of a "constructor that takes a string as the only argument" (which is what "read" might be called in OOP languages) in a trait.
Instead, you have to simulate type classes yourself. This is done like so (equivalent Haskell code in comments):
// class Read a where read :: String -> a
trait Read[A] { def read(s: String): A }
// instance Read Person where read = ... parser for Person ...
implicit object ReadPerson extends Read[Person] {
  def read(s: String): Person = ??? // parser for Person goes here
}
Then, when you have a method that depends on the type class, you have to specify it as an implicit context:
// readList :: Read a => [String] -> [a]
// readList ss = map read ss
def readList[A: Read] (ss: List[String]): List[A] = {
val r = implicitly[Read[A]] // Get the class instance of Read for type A
ss.map(r.read _)
}
The user would probably like a polymorphic method like this for ease of use:
object read {
def apply[A: Read](s: String): A = implicitly[Read[A]].read(s)
}
Then one can just write:
val person: Person = read[Person]("Person(Bob,42)")
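To make this concrete, here is a hedged sketch of one possible Read[Person] instance, built on the regex approach from the other answer (the parser and the helper names are illustrations, not a standard API):

```scala
case class Person(name: String, age: Int)

trait Read[A] { def read(s: String): A }

object Read {
  // Illustrative instance: parses the case-class toString format via regex.
  private val PersonRegex = """Person\((.*),(\d+)\)""".r

  implicit val readPerson: Read[Person] = new Read[Person] {
    def read(s: String): Person = s match {
      case PersonRegex(name, age) => Person(name, age.toInt)
      case _ => throw new IllegalArgumentException(s"cannot parse: $s")
    }
  }
}

// Polymorphic entry point, resolved via the implicit instance.
def read[A](s: String)(implicit r: Read[A]): A = r.read(s)

val bob = read[Person]("Person(Bob,42)")
```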
I am not aware of any standard implementation(s) for this type class, in particular.
Also, a disclaimer: I don't have a Scala compiler and haven't used the language for years, so I can't guarantee that this code compiles.
Starting with Scala 2.13, it's possible to pattern match Strings by unapplying a string interpolator:
// case class Person(name: String, age: Int)
"Person(Bob,42)" match { case s"Person($name,$age)" => Person(name, age.toInt) }
// Person("Bob", 42)
Note that you can also use regexes within the extractor. In this case, that helps to match on "Person(Bob, 42)" (age with a leading space) and to ensure age is an integer:
val Age = " *(\\d+)".r
"Person(Bob, 42)" match {
case s"Person($name,${Age(age)})" => Some(Person(name, age.toInt))
case _ => None
}
// Person = Some(Person(Bob,42))
The answers on this question are somewhat outdated. Scala has picked up some new features, notably type classes and macros, to make this more easily possible.
Using the Scala Pickling library, you can serialize/deserialize arbitrary classes to and from various serialization formats:
import scala.pickling._
import json._
case class Person(name: String, age: Int)
val person1 = Person("Bob", 42)
val str = person1.pickle.value // { tpe: "Person", name: "Bob", age: 42 }
val person2 = JSONPickle(str).unpickle[Person]
assert(person1 == person2) // Works!
The serializers/deserializers are automatically generated at compile time, so no reflection! If you need to parse case classes using a specific format (such as the case class toString format), you can extend this system with your own formats.
The uPickle library offers a solution for this problem.
Out of the box, Scala just uses Java's serialization machinery, which has no such String representation.
Schema.org is a markup vocabulary (for the web) that defines a number of types in terms of properties (no methods). I am currently trying to model parts of that schema in Scala as internal model classes to be used with a document-oriented database (MongoDB) and a web framework.
As can be seen in the definition of LocalBusiness, schema.org uses multiple inheritance to also include properties from the "Place" type. So my question is: How would you model such a schema in Scala?
I have come up with two solutions so far. The first one uses regular classes to model a single inheritance tree and uses traits to mix in the additional properties.
trait ThingA {
var name: String = ""
var url: String = ""
}
trait OrganizationA {
var email: String = ""
}
trait PlaceA {
var x: String = ""
var y: String = ""
}
trait LocalBusinessA {
var priceRange: String = ""
}
class OrganizationClassA extends ThingA with OrganizationA {}
class LocalBusinessClassA extends OrganizationClassA with PlaceA with LocalBusinessA {}
The second version tries to use case classes. However, since case class inheritance is deprecated, I cannot model the main hierarchy so easily.
trait ThingB {
val name: String
}
trait OrganizationB {
val email: String
}
trait PlaceB {
val x: String
val y: String
}
trait LocalBusinessB {
val priceRange: String
}
case class OrganizationClassB(val name: String, val email: String) extends ThingB with OrganizationB
case class LocalBusinessClassB(val name: String, val email: String, val x: String, val y: String, val priceRange: String) extends ThingB with OrganizationB with PlaceB with LocalBusinessB
Is there a better way to model this? I could use composition similar to
case class LocalBusinessClassC(val thing:ThingClass, val place: PlaceClass, ...)
but then of course, LocalBusiness cannot be used when a "Place" is expected, for example when I try to render something on Google Maps.
What works best for you depends greatly on how you want to map your objects to the underlying datastore.
Given the need for multiple inheritance, an approach worth considering would be to just use traits. This gives you multiple inheritance with the least amount of code duplication or boilerplate.
trait Thing {
val name: String // required
val url: Option[String] = None // reasonable default
}
trait Organization extends Thing {
val email: Option[String] = None
}
trait Place extends Thing {
val x: String
val y: String
}
trait LocalBusiness extends Organization with Place {
val priceRange: String
}
Note that Organization extends Thing, as does Place, just as in schema.org.
To instantiate them, you create anonymous inner classes that specify the values of all attributes.
object UseIt extends App {
val home = new Place {
val name = "Home"
val x = "-86.586104"
val y = "34.730369"
}
val oz = new Place {
val name = "Oz"
val x = "151.206890"
val y = "-33.873651"
}
val paulis = new LocalBusiness {
val name = "Pauli's"
override val url = Some("http://www.paulisbarandgrill.com/")
val x = "-86.713660"
val y = "34.755092"
val priceRange = "$$$"
}
}
If any fields have a reasonable default value, you can specify the default value in the trait.
Optional fields are declared as Option[String] with a default of None, to better indicate that their value may not be set; you liked using Option, so I'm using Option.
The downside of this approach is that the compiler generates an anonymous inner class every place you instantiate one of the traits. This could give you an explosion of .class files. More importantly, though, it means that different instances of the same trait will have different types.
Edit:
In regards to how you would use this to load objects from the database: that depends greatly on how you access your database. If you use an object mapper, you'll want to structure your model objects the way the mapper expects them to be structured, and I'll be surprised if this sort of trick works with your object mapper.
If you're writing your own data access layer, then you can simply use a DAO or repository pattern for data access, putting the logic to build the anonymous inner classes in there.
This is just one way to structure these objects. It's not even the best way, but it demonstrates the point.
trait Database {
// treats objects as simple key/value pairs
def findObject(id: String): Option[Map[String, String]]
}
class ThingRepo(db: Database) {
def findThing(id: String): Option[Thing] = {
// Note that in this way, malformed objects (i.e. missing name) simply
// return None. Logging or other responses for malformed objects is left
// as an exercise :-)
    for {
      fields <- db.findObject(id) // load object from database
      n      <- fields.get("name") // extract required field
    } yield {
      new Thing {
        val name = n
        override val url = fields.get("url")
      }
}
}
}
There's a bit more to it than that (how you identify objects, how you store them in the database, how you wire up the repository, how you'll handle polymorphic queries, etc.), but this should be a good start.
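To see the repository in action, here is a self-contained sketch with an in-memory stub Database (the stub class and the sample data are illustrative; the trait shapes follow the answer's code):

```scala
trait Thing {
  val name: String
  val url: Option[String] = None
}

trait Database {
  // treats objects as simple key/value pairs
  def findObject(id: String): Option[Map[String, String]]
}

// In-memory stub standing in for a real datastore.
class StubDatabase(data: Map[String, Map[String, String]]) extends Database {
  def findObject(id: String): Option[Map[String, String]] = data.get(id)
}

class ThingRepo(db: Database) {
  def findThing(id: String): Option[Thing] =
    for {
      fields <- db.findObject(id)
      n      <- fields.get("name") // required field: missing name yields None
    } yield new Thing {
      val name = n
      override val url = fields.get("url")
    }
}

val repo = new ThingRepo(new StubDatabase(Map(
  "1" -> Map("name" -> "Pauli's", "url" -> "http://www.paulisbarandgrill.com/"),
  "2" -> Map("other" -> "malformed: no name")
)))

val found   = repo.findThing("1")
val missing = repo.findThing("2")
```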