I am working on a backend server in Play Framework in Scala. However, I am calling an external library (written in Java) that returns a Java list (util.List). I created a Writes for the object contained in the List, but I don't know how to write the Writes for the actual List so that it is generic (no need to write a Writes for both List[A] and List[B], just the Writes for A and B).
I know I could use JavaConversions to convert the Java list to a Scala Seq (which already has a Writes implemented), but since speed is essential, I would like to avoid the extra conversion.
Here is a possible implementation
import java.util
import play.api.libs.json.{JsArray, JsValue, Json, Writes}
import scala.collection.JavaConverters._

implicit def jListWrites[A: Writes]: Writes[java.util.List[A]] = new Writes[java.util.List[A]] {
  override def writes(o: util.List[A]): JsValue = {
    JsArray(o.asScala.map(Json.toJson(_)))
  }
}
You don't create a single Writes but rather a method that can create them for any type that has Writes defined.
You said you want to avoid JavaConversions, but as you can see that is difficult: JsArray expects a Seq[JsValue] anyway, so you need to construct a Scala Seq one way or another.
What I've shown here is more or less equivalent to converting the Java List to a Scala mutable.Buffer using asScala and relying on the default Writes for Traversable.
Note that the conversions are probably not as expensive as you think: they just create a wrapper, with no copying involved.
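For reference, here is a minimal sketch of that conversion-based equivalent (the helper name is illustrative; it relies on Play's built-in Writes for Scala collections being in scope):

import play.api.libs.json.{JsValue, Json, Writes}
import scala.collection.JavaConverters._

// asScala only wraps the Java list in a Buffer view; no elements are copied
def toJsonViaConversion[A: Writes](javaList: java.util.List[A]): JsValue =
  Json.toJson(javaList.asScala)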
Here is the best I could come up with in terms of performance:
implicit def jListWrites[A: Writes]: Writes[java.util.List[A]] = new Writes[java.util.List[A]] {
  override def writes(o: util.List[A]): JsValue = {
    val buffer = new Array[JsValue](o.size)
    var i = 0
    while (i < o.size) {
      buffer(i) = Json.toJson(o.get(i))
      i += 1
    }
    JsArray(buffer)
  }
}
It takes 29 ms for 1000000 Ints compared to 39 ms for the straightforward implementation. Note that Int is easy to convert; if your objects are more complex, the speedup will be smaller.
Converting 20000 instances of case class C(num: Int, n2: Int, s: String) gives roughly equal results (the straightforward version is even faster, by 0.14 ms).
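The timings above presumably come from simple wall-clock measurements; here is a rough, illustrative sketch of how such a micro-benchmark could be set up (the timing helper and sizes are made up, and a proper harness such as JMH would give more trustworthy numbers):

import play.api.libs.json.Json

def time[T](label: String)(body: => T): T = {
  val start = System.nanoTime()
  val result = body
  println(s"$label: ${(System.nanoTime() - start) / 1e6} ms")
  result
}

val data: java.util.List[Int] = {
  val l = new java.util.ArrayList[Int](1000000)
  (0 until 1000000).foreach(i => l.add(i))
  l
}

// uses whichever implicit jListWrites is in scope
time("java.util.List[Int] -> JsValue")(Json.toJson(data))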
You can code a Writes that reuses the existing one for Scala List
import java.util.{ List => JList }
import play.api.libs.json.Writes
import scala.collection.JavaConverters._

implicit def jListWrites[T](implicit sw: Writes[List[T]]): Writes[JList[T]] = Writes[JList[T]] { jlist =>
  sw.writes(jlist.asScala.toList)
}
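A quick usage sketch (it relies on Play's default Writes for Scala collections to supply the Writes[List[Int]]):

import java.util.Arrays
import play.api.libs.json.Json

val javaList: JList[Int] = Arrays.asList(1, 2, 3)
val json = Json.toJson(javaList) // a JsArray of 1, 2, 3, produced via the implicit jListWrites above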
Related
I have a bunch of Scala objects, each with a def that does some processing:
Foo\CatProcessing (def processing)
Foo\DogProcessing (def processing)
Foo\BirdProcessing (def processing)
Then I have my main def that will call each of the individual Foo objects' processing, passing in common parameter values and such.
I am trying to put all of the objects into an Array or List, and then do a 'foreach' to loop through the list, passing in the parameter values, i.e.
foreach(object in objList){
  object.Processing(parameters)
}
Coming from C#, I could do this via binders or the like, so how would I manage this in Scala?
for (obj <- objList) {
  obj.processing(parameters) // `object` is a reserved keyword in Scala
}
or
objList.foreach(obj => obj.processing(parameters))
They are actually the same thing, the former being "syntactic sugar" for the latter.
In the second case, you can refer to the single parameter of the anonymous function passed to foreach as _, resulting in the following:
objList.foreach(_.processing(parameters))
for comprehensions in Scala can be quite expressive and go beyond simple iteration; if you're curious, you can read more about them here.
Since you are coming from C#, if by any chance you have had any exposure to LINQ you will find yourself at home with the Scala Collection API. The official documentation is quite extensive in this regard and you can read more about it here.
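As a small, self-contained illustration of how a LINQ-style query maps onto the Scala collection API (and onto a for comprehension); the data here is made up:

val numbers = List(1, 2, 3, 4, 5)

// roughly numbers.Where(n => n % 2 == 0).Select(n => n * 10) in LINQ terms
val viaMethods = numbers.filter(_ % 2 == 0).map(_ * 10) // List(20, 40)

// the same query written as a for comprehension
val viaFor = for {
  n <- numbers
  if n % 2 == 0
} yield n * 10 // List(20, 40)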
As it came up in the comments following my reply, the objects you want to iterate over also need to have a common type that exposes the processing method.
Alternatively, Scala allows you to use structural typing, but that relies on runtime reflection and is unlikely to be something you really need or want in this case.
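For completeness, a minimal sketch of what the structural-typing alternative could look like (the type alias and helper names are illustrative; note the reflectiveCalls import and the runtime-reflection cost):

import scala.language.reflectiveCalls

// any value with a `def processing(): Unit` member conforms to this type
type HasProcessing = { def processing(): Unit }

def runAll(items: Seq[HasProcessing]): Unit =
  items.foreach(_.processing())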
You can achieve it by having a common trait for your objects, as in the following example:
trait Processing {
  def processing(): Unit
}

final class CatProcessing extends Processing {
  def processing(): Unit = println("cat")
}

final class DogProcessing extends Processing {
  def processing(): Unit = println("dog")
}

final class BirdProcessing extends Processing {
  def processing(): Unit = println("bird")
}

val cat = new CatProcessing
val dog = new DogProcessing
val bird = new BirdProcessing

for (process <- List(cat, dog, bird)) {
  process.processing()
}
You can run the code above and play around with it here on Scastie.
Using a Map instead, you can do it like this (I wonder if this works with other types of lists):
val test = Map("foobar" -> new CatProcessing)
test.values.foreach(
  (processor) => processor.processing()
)
I have a quite big structure of case classes, and somewhere deep inside this structure are fields which I want to refine, for example, make lists non-empty. Is it possible to tell ScalaCheck to make those lists non-empty using automatic derivation from the scalacheck-magnolia project (without providing each field explicitly)?
Example:
import com.mrdziuban.ScalacheckMagnolia.deriveArbitrary
import org.scalacheck.Arbitrary
import org.scalacheck.Gen
case class A(b: B, c: C)
case class B(list: List[Long])
case class C(list: List[Long])
// I've tried:
def genNEL[T: Gen]: Gen[List[T]] = Gen.nonEmptyListOf(implicitly[Gen[T]])
implicit val deriveNEL = Arbitrary(genNEL)
implicit val deriveA = implicitly[Arbitrary[A]](deriveArbitrary)
But it didn't work out.
I'm not sure how to be generic, since I'm not familiar with getting automatic derivation for Arbitrary with scalacheck-magnolia. It seems like scalacheck-magnolia is good for deriving an Arbitrary for case classes, but maybe not for containers (lists, vectors, arrays, etc.).
If you want to use plain ScalaCheck, you could just define the implicit Arbitrary for A yourself. Doing it by hand is some extra boilerplate, but it has the benefit of giving you more control if you want to use different generators for different parts of your data structure.
Here's an example where an arbitrary list of longs is non-empty by default, but is allowed to be empty for B.
implicit val listOfLong: Arbitrary[List[Long]] =
  Arbitrary(Gen.nonEmptyListOf(Arbitrary.arbitrary[Long]))

implicit val arbC: Arbitrary[C] = Arbitrary {
  Gen.resultOf(C)
}

implicit val arbB: Arbitrary[B] = Arbitrary {
  // locally shadow the default so that B's list is allowed to be empty
  implicit val listOfLong: Arbitrary[List[Long]] =
    Arbitrary(Gen.listOf(Arbitrary.arbitrary[Long]))
  Gen.resultOf(B)
}

implicit val arbA: Arbitrary[A] = Arbitrary {
  Gen.resultOf(A)
}

property("arbitrary[A]") = {
  Prop.forAll { a: A =>
    a.b.list.size >= 0 && a.c.list.size > 0
  }
}
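For context, the property(...) = ... syntax assumes these definitions live inside a ScalaCheck Properties class; a minimal sketch of such a wrapper (the object name is made up):

import org.scalacheck.{Prop, Properties}

object ASpec extends Properties("A") {
  // the implicit Arbitrary instances and the property shown above would go in here
}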
This question already has answers here:
Scalaz pipe operator connected with a list method
I find myself writing Scala programs more often recently.
I like to program in a style that often uses long method chains, but sometimes the transformation you want to apply is not a method of the object you want to transform. So I find myself defining:
class Better[T](t: T) {
  def xform[U](func: T => U): U = func(t)
}

implicit def improve[T](t: T): Better[T] = new Better(t)
This allows me to write the chains I want, such as
val content = s3.getObject(bucket, key)
  .getObjectContent
  .xform(Source.fromInputStream)
  .mkString
  .toInt
Is there any similar facility already in the standard library? If so, how should I have discovered it without resorting to StackOverflow?
It's not the standard library, but it might be "standard enough": with Cats, you should be able to write something like
val content =
  s3
    .getObject(bucket, key)
    .getObjectContent
    .pure[Id].map(Source.fromInputStream)
    .mkString
    .toInt
where pure[Id] wraps the input value in the do-nothing Id monad, and map then passes it as an argument to Source.fromInputStream.
EDIT: This does not seem to work reliably. If the object already has a map method of its own, that method is called instead of the map on Id.
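A small sketch of that pitfall (using the same Cats imports as the smaller example below): since List already defines its own map, that map wins over the Id functor syntax, so a function over the whole value no longer fits:

import cats.Id
import cats.syntax.applicative._

def sum(xs: List[Int]): Int = xs.sum

val xs = List(1, 2, 3)
// xs.pure[Id] has the static type Id[List[Int]], which is just List[Int],
// so .map here would be List.map and would expect an Int => ?, not a List[Int] => Int:
// xs.pure[Id].map(sum) // does not compile; Id's map never gets a chance to run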
Smaller example (just to demonstrate the necessary imports):
import cats.Id
import cats.syntax.applicative._
import cats.syntax.functor._
object Main {
  def square(x: Int) = x * x

  def main(args: Array[String]): Unit = {
    println(42.pure[Id].map(square))
  }
}
However, writing either
val content =
  Source
    .fromInputStream(
      s3
        .getObject(bucket, key)
        .getObjectContent
    )
    .mkString
    .toInt
or
val content =
  Source
    .fromInputStream(s3.getObject(bucket, key).getObjectContent)
    .mkString
    .toInt
does not require any extra dependencies, and frees you both from the burden of defining otherwise useless case classes and from the burden of reindenting your code every time you rename either content or s3.
It also shows how the expressions are actually nested and what depends on what; there is a reason why the vast majority of mainstream programming languages of the past 50 years have a call stack.
I am trying to prevent empty values being inserted into my mongoDB collection. The field in question looks like this:
MongoDB Field
"stadiumArr" : [
"Old Trafford",
"El Calderon",
...
]
Sample of (mapped) case class
case class FormData(_id: Option[BSONObjectID], stadiumArr: Option[List[String]], ..)
Sample of Scala form
object MyForm {
  val form = Form(
    mapping(
      "_id" -> ignored(Option.empty[BSONObjectID]),
      "stadiumArr" -> optional(list(text)),
      ...
    )(FormData.apply)(FormData.unapply)
  )
}
I am also using the Repeated Values functionality in Play Framework like so:
Play Template
@(myForm: Form[models.db.FormData])(implicit request: RequestHeader, messagesProvider: MessagesProvider)

@import helper._

@repeatWithIndex(myForm("stadiumArr"), min = 5) { (stadium, idx) =>
  @inputText(stadium, '_label -> ("stadium #" + (idx + 1)))
}
This ensures that whether or not there are at least 5 values in the array, there will still be (at least) 5 input boxes created. However, if one (or more) of the input boxes is empty when the form is submitted, an empty string is still added as a value in the array, e.g.
"stadiumArr" : [
"Old Trafford",
"El Calderon",
"",
"",
""
]
Based on some other ways of converting types from/to the database, I've tried playing around with a few solutions, such as:
implicit val arrayWrite: Writes[List[String]] = new Writes[List[String]] {
  def writes(list: List[String]): JsValue = Json.arr(list.filterNot(_.isEmpty))
}
.. but this isn't working. Any ideas on how to prevent empty values being inserted into the database collection?
Without knowing the specific versions or libraries you're using it's hard to give you an answer, but since you linked to the Play 2.6 documentation I'll assume that's what you're using. The other assumption I'm going to make is that you're using the ReactiveMongo library. Whether or not you're using the Play plugin for that library is the reason why I'm giving you two different answers here:
In that library, with no plugin, you'll have defined a BSONDocumentReader and a BSONDocumentWriter for your case class. These might be auto-generated for you with macros or not, but regardless of how you get them, these two classes have useful methods you can use to transform the reads/writes you have into other ones. So, let's say I defined a reader and writer for you like this:
import reactivemongo.bson._

case class FormData(_id: Option[BSONObjectID], stadiumArr: Option[List[String]])

implicit val formDataReaderWriter = new BSONDocumentReader[FormData] with BSONDocumentWriter[FormData] {
  def read(bson: BSONDocument): FormData = {
    FormData(
      _id = bson.getAs[BSONObjectID]("_id"),
      stadiumArr = bson.getAs[List[String]]("stadiumArr").map(_.filterNot(_.isEmpty))
    )
  }

  def write(formData: FormData) = {
    BSONDocument(
      "_id" -> formData._id,
      "stadiumArr" -> formData.stadiumArr
    )
  }
}
Great, you say, that works! You can see that in the read I went ahead and filtered out any empty strings, so even if they are in the data, they get cleaned up. That's nice and all, but notice I didn't do the same for the write. I did that so I can show you how to use a useful method called afterWrite. So pretend the reader and writer weren't the same class and were separate; then I can do this:
val initialWriter = new BSONDocumentWriter[FormData] {
  def write(formData: FormData) = {
    BSONDocument(
      "_id" -> formData._id,
      "stadiumArr" -> formData.stadiumArr
    )
  }
}

implicit val cleanWriter = initialWriter.afterWrite { bsonDocument =>
  val fixedField = bsonDocument.getAs[List[String]]("stadiumArr").map(_.filterNot(_.isEmpty))
  bsonDocument.remove("stadiumArr") ++ BSONDocument("stadiumArr" -> fixedField)
}
Note that cleanWriter is the implicit one; that means when the insert call on the collection happens, it will be the one chosen to be used.
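A rough sketch of the call site, just to show where the implicit resolution happens (the collection value, its setup, and the exact insert signature for your ReactiveMongo version are assumed, not shown here):

// assuming `collection: BSONCollection` was obtained from the database elsewhere,
// and an ExecutionContext is in scope
val formData = FormData(None, Some(List("Old Trafford", "", "")))
collection.insert(formData) // resolves cleanWriter implicitly, so the empty strings are dropped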
Now, that's all a bunch of work. If you're using the plugin/module for Play that lets you use JSONCollections, then you can get by with just defining Play JSON Reads and Writes. If you look at the documentation you'll see that the Reads trait has a useful map function you can use to transform one Reads into another.
So, you'd have:
val jsonReads = Json.reads[FormData]
implicit val cleanReads = jsonReads.map(formData => formData.copy(stadiumArr = formData.stadiumArr.map(_.filterNot(_.isEmpty))))
And again, because only the clean Reads is implicit, the collection methods for mongo will use that.
NOW, all of that said, doing this at the database level is one thing, but really, I personally think you should be dealing with this at your Form level.
val form = Form(
  mapping(
    "_id" -> ignored(Option.empty[BSONObjectID]),
    "stadiumArr" -> optional(list(text)),
    ...
  )(FormData.apply)(FormData.unapply)
)
Mainly because, surprise surprise, Form has a way to deal with this: specifically, the Mapping class itself. If you look there you'll find a transform method you can use to filter out empty values easily. Just call it on the mapping you need to modify, for example:
"stadiumArr" -> optional(
list(text).transform(l => l.filter(_.nonEmpty), l => l.filter(_.nonEmpty))
)
To explain a little more about this method, in case you're not used to reading the signatures in the scaladoc:

def transform[B](f1: (T) => B, f2: (B) => T): Mapping[B]

says that by calling transform on some mapping of type Mapping[T] you can create a new mapping of type Mapping[B]. In order to do this you must provide functions that convert from one to the other. So the code above causes the list mapping (Mapping[List[String]]) to become a Mapping[List[String]] (the type did not change here), but in doing so it removes any empty elements. If I break this code down a little it might be clearer:
def convertFromTtoB(list: List[String]): List[String] = list.filter(_.nonEmpty)
def convertFromBtoT(list: List[String]): List[String] = list.filter(_.nonEmpty)
...
list(text).transform(convertFromTtoB, convertFromBtoT)
You might be wondering why you need to provide both. The reason is that when you call Form.fill and the form is populated with values, the second function is called so that the data goes into the format the Play form is expecting. This is more obvious if the type actually changes. For example, if you had a text area where people could enter CSV, but you wanted to map it to a form model that had a proper List[String], you might do something like:
def convertFromTtoB(raw: String): List[String] = raw.split(",").filter(_.nonEmpty)
def convertFromBtoT(list: List[String]): String = list.mkString(",")
...
text.transform(convertFromTtoB, convertFromBtoT)
Note that when I've done this in the past sometimes I've had to write a separate method and just pass it in if I didn't want to fully specify all the types, but you should be able to work from here given the documentation and type signature for the transform method on mapping.
The reason I suggest doing this in the form binding is that the form/controller should be the one concerned with dealing with your user data and cleaning it up, I think. But you can always have multiple layers of cleaning and whatnot; it's not bad to be safe!
I've gone for this (which always seems obvious when it's written and tested):
implicit val arrayWrite: Writes[List[String]] = new Writes[List[String]] {
  def writes(list: List[String]): JsValue = Json.toJson(list.filterNot(_.isEmpty).toIndexedSeq)
}
But I would be interested to know how to .map the existing Reads rather than redefining it from scratch, as @cchantep suggests.
Say I have a local method/function
def withExclamation(string: String) = string + "!"
Is there a way in Scala to transform an instance by supplying this method? Say I want to append an exclamation mark to a string. Something like:
val greeting = "Hello"
val loudGreeting = greeting.applyFunction(withExclamation) //result: "Hello!"
I would like to be able to invoke (local) functions when writing a chain transformation on an instance.
EDIT: Multiple answers show how to program this possibility, so it seems that this feature is not present on an arbitrary class. To me this feature seems incredibly powerful. Consider the case where in Java I want to execute a number of operations on a String:
appendExclamationMark(" Hello! ".trim().toUpperCase()); //"HELLO!"
The order of operations is not the same as how they read. The last operation, appendExclamationMark, is the first word that appears. Currently in Java I would sometimes do:
Function.<String>identity()
.andThen(String::trim)
.andThen(String::toUpperCase)
.andThen(this::appendExclamationMark)
.apply(" Hello "); //"HELLO!"
This reads better in terms of expressing a chain of operations on an instance, but it also contains a lot of noise, and it is not intuitive to have the String instance on the last line. I would want to write:
" Hello "
.applyFunction(String::trim)
.applyFunction(String::toUpperCase)
.applyFunction(this::withExclamation); //"HELLO!"
Obviously the name of the applyFunction function can be anything (shorter please). I thought backwards compatibility was the sole reason Java's Object does not have this.
Is there any technical reason why this was not added on, say, the Any or AnyRef classes?
You can do this with an implicit class which provides a way to extend an existing type with your own methods:
object StringOps {

  implicit class RichString(val s: String) extends AnyVal {
    def withExclamation: String = s"$s!"
  }

  def main(args: Array[String]): Unit = {
    val m = "hello"
    println(m.withExclamation)
  }
}
Yields:
hello!
If you want to apply any functions (anonymous, converted from methods, etc.) in this way, you can use a variation on Yuval Itzchakov's answer:
object Combinators {

  implicit class Combinators[A](val x: A) {
    def applyFunction[B](f: A => B): B = f(x)
  }
}
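A usage sketch along the lines of the question's example (withExclamation is assumed to be defined as in the question, and the import brings the implicit class into scope):

import Combinators._

def withExclamation(string: String) = string + "!"

val loudGreeting = " Hello "
  .applyFunction(_.trim)
  .applyFunction(_.toUpperCase)
  .applyFunction(withExclamation) // "HELLO!"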
A while after asking this question, I noticed that Kotlin has this built in:
inline fun <T, R> T.let(block: (T) -> R): R
Calls the specified function block with this value as its argument and returns
its result.
Many more quite useful variations of the above function are provided on all types, such as with, also, apply, etc.
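For a Scala counterpart of Kotlin's let, it may be worth noting (assuming you can use Scala 2.13 or later) that the standard library added scala.util.chaining, whose pipe behaves much like the applyFunction/xform shown above:

import scala.util.chaining._

def withExclamation(string: String) = string + "!"

val loudGreeting = " Hello "
  .pipe(_.trim)
  .pipe(_.toUpperCase)
  .pipe(withExclamation) // "HELLO!"

// tap is the side-effecting sibling: it applies the function and returns the original value
val logged = 42.tap(n => println(s"got $n"))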