Implement implicit Writes for result of join query - scala

Compilation error:
No Json serializer found for type Seq[(models.Account, models.Company)]. Try to implement an implicit Writes or Format for this type.
How can I define an implicit writes for the result of a join query?
Controller:
def someEndpoint = Action.async { implicit request =>
  val query = for {
    a <- accounts if a.id === 10
    c <- companies if a.companyID === c.id
  } yield (a, c)

  db.run(query.result).map(rows => Ok(Json.toJson(rows))) // Causes compilation error
}
Each of my models (Account and Company) has its own implicit Writes (here's the Company one):
case class Company(id: Int, name: String)

object Company {
  implicit val writes = new Writes[Company] {
    def writes(company: Company): JsValue = {
      Json.obj(
        "id" -> company.id,
        "name" -> company.name
      )
    }
  }
}
Is it possible to dynamically handle serializations for joins? I have a lot of things I will be joining together... Do I need to explicitly define a writes for each combination?

Writes.seq will help you. A small writer
val w = (
  (__ \ "account").write[Account] and
  (__ \ "company").write[Company]
).tupled
lets you transform a Seq[(models.Account, models.Company)] to a JsValue with
Writes.seq(w).writes(rows)
so the last line becomes
db.run(query.result).map(rows => Ok(Writes.seq(w).writes(rows)))
or a clearer variant:
db.run(query.result)
  .map(
    _.map {
      case (a, c) => Json.obj("account" -> a, "company" -> c)
    }
  )
  .map(rows =>
    Ok(JsArray(rows))
  )
It's the same thing, but you create the object for every row yourself.
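Putting it all together, a minimal sketch of the endpoint (assuming the implicit Account and Company Writes from the question are in scope; the functional syntax import is what provides and/.tupled):
import play.api.libs.json._
import play.api.libs.functional.syntax._ // provides `and` and `.tupled`

def someEndpoint = Action.async { implicit request =>
  // Combine the two existing Writes into a single Writes for the tuple
  val w: Writes[(Account, Company)] = (
    (__ \ "account").write[Account] and
    (__ \ "company").write[Company]
  ).tupled

  val query = for {
    a <- accounts if a.id === 10
    c <- companies if a.companyID === c.id
  } yield (a, c)

  // Writes.seq(w) lifts w to a Writes[Seq[(Account, Company)]]
  db.run(query.result).map(rows => Ok(Json.toJson(rows)(Writes.seq(w))))
}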

I think you expect the response of your query in JSON to be something like
[
  {
    "account" : { "number": "123", "companyID" : 1 },
    "company" : { "id" : 1, "name" : "My company" }
  },
  ...
]
The problem is that the result of the query is just a tuple, so the "account" and "company" keys do not come for free.
Instead of a tuple you could create a new case class with the joined data, but I understand you want to avoid that. In that case, instead of a tuple, you can use a Map, which converts to JSON automatically.
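For example, a small sketch of that approach (assuming the implicit Writes for Account and Company from the question are in scope):
db.run(query.result).map { rows =>
  val jsonRows = rows.map { case (a, c) =>
    // each row becomes a Map[String, JsValue], which Play serializes out of the box
    Map("account" -> Json.toJson(a), "company" -> Json.toJson(c))
  }
  Ok(Json.toJson(jsonRows))
}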
Extra: Creating writers for case classes is very simple
import play.api.libs.json._
implicit val personWrites = Json.writes[Person]
Reference: https://www.playframework.com/documentation/2.4.x/ScalaJsonInception

Related

Serializing a tree in Play for Scala

I have the following class that's a tree node:
case class Node[A](
  id: Int,
  data: A,
  var children: Option[Seq[Node[A]]],
  var parent: Option[Node[A]]
)
where id is the node id number, data represents information stored in the node, children is the list of children nodes, and parent is the parent node.
I want to generate a json with the tree, so I wrote the following implicit val:
implicit val nodeWrite: Writes[Node[Data]] = (
  (JsPath \ "sk").write[Int] and
  (JsPath \ "dat").write[Data] and
  (JsPath \ "ch").write[Option[Seq[Node[Data]]]] and
  (JsPath \ "par").write[Option[Node[Data]]]
)(unlift(Node[Data].unapply))
However the compiler complains:
missing argument list for method apply in object Node
Unapplied methods are only converted to functions when a function type is expected.
You can make this conversion explicit by writing `apply _` or `apply(_,_,_,_)` instead of `apply`.
How to fix this?
UPDATE
Data is defined as:
case class Data (descrip: String)
UPDATE 2
Since I needed a tree with N roots, I created a Tree class containing a sequence of nodes.
case class Tree[A] ( var nodes: Option[Seq[Node[A]]] )
However I have a problem with serializing the tree:
implicit val treeWrite: Writes[Tree[Data]] = new Writes[Tree[Data]] {
  def writes(x: Tree[Data]) = {
    Json.obj(
      "nodes" -> x.nodes.map(_.map(n => writes(n)))
    )
  }
}
it throws
type mismatch; found : Option[Nothing] required:
play.api.libs.json.Json.JsValueWrapper
in the x.nodes.map line.
I don't have a complete answer, but you can help the compiler by specifying the types:
(unlift[Node[Data],(Int, Data, Option[Seq[Node[Data]]], Option[Node[Data]])]
(Node.unapply[Data](_)))
But that alone doesn't help you, since you have to deal with recursive types via lazyWrite. I would suggest using a more explicit approach here:
implicit val nodeWrite: Writes[Node[Data]] = new Writes[Node[Data]] {
  def writes(x: Node[Data]) = {
    Json.obj(
      "id" -> x.id,
      "data" -> x.data,
      "children" -> x.children.map(_.map(n => writes(n))),
      "parent" -> x.parent.map(n => writes(n))
    )
  }
}
val child = Node(1, "child", None, None)
val node = Node(1, "data", Some(List(child)), None)
Json toJson node
res0: play.api.libs.json.JsValue = {"id":1,"data":"data",
"children":[{"id":1,"data":"child","children":null,"parent":null}],"parent":null}
Add null handling and you will be fine.
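For the Tree from UPDATE 2, the same explicit style works; here is a sketch (assuming the nodeWrite above is in scope) that converts the Option into a JsValue by hand, which sidesteps the type-mismatch error:
implicit val treeWrite: Writes[Tree[Data]] = new Writes[Tree[Data]] {
  def writes(t: Tree[Data]): JsValue = {
    // JsArray when nodes are present, JsNull otherwise
    val nodesJson: JsValue =
      t.nodes.map(ns => JsArray(ns.map(n => Json.toJson(n)))).getOrElse(JsNull)
    Json.obj("nodes" -> nodesJson)
  }
}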

Scala/Play/Squeryl Retrieve multiple params

I have the following URL: http://localhost/api/books/?bookId=21&bookId=62&authorId=2
I want to retrieve all the bookId values with Scala and then use Squeryl to fetch from the database.
I'm using the Play Framework as the web server, so here's my code:
val params = request.queryString.map { case (k, v) => k -> v(0) } // Retrieve only the first occurrence of a param
So params.get("bookId") will only get a single bookId value (e.g. 62).
To retrieve all my bookId params I tried this:
val params = request.queryString.map { case (k, v) => k -> v } so I can get a Seq[String], but what about the authorId, which is not a Seq[String]?
In the end I want to fetch the bookIds and authorId from my DB using Squeryl:
(a.author_id === params.get("authorId").?) and
(params.get("bookId").map(bookIds: Seq[String] => b.bookId in bookIds))
In my controller I get the params and open the DB connection:
val params = request.queryString.map { case (k, v) => k -> v(0) }
DB.withTransaction() { where(Library.whereHelper(params)) }
In my model I use the queries:
def whereHelper(params : Map[String,String]) = {
(a.author_id === params.get("authorId").?) and
(params.get("bookId").map{bookIds: Seq[String] => b.bookId in bookIds})
}
Since bookIds is a list, I need to use the Seq[String]. Is there a way to use request.queryString.map { case (k, v) => k -> v } for both a string (authorId) and a list of strings (bookIds)?
Thanks,
If I understand correctly what you are trying to do, you want to know how to get the parameters from the query string. This is pretty simple; you can do the following in your controller:
def myAction = Action { request =>
  // Get all the values of the parameter named bookId and
  // transform them to Long. Maybe you don't want the map,
  // and then you can just remove it.
  val bookIds: Seq[Long] = request.queryString("bookId").map(_.toLong)

  // Notice that now I'm using getQueryString, which is a helper
  // method to access a queryString parameter. It returns an
  // Option[String], which we are mapping to an Option[Long].
  // Again, if you don't need the mapping, just remove it.
  val authorId: Option[Long] = request.getQueryString("authorId").map(_.toLong)

  DB.withTransaction() { where(Library.whereHelper(authorId, bookIds)) }
  // Do something with the result
}
In your model you will have:
def whereHelper(authorId: Option[Long], bookIds: Seq[Long]) = authorId match {
  case Some(author_id) =>
    (a.author_id === author_id) and
    (b.bookId in bookIds)
  case None =>
    (b.bookId in bookIds)
}
I've left the types explicit to help you understand what is happening. Now, since you have both values, you can just use them in your query.
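If you would rather avoid the pattern match, Squeryl's optional-clause operator .? (which the question already uses) should let you pass the Option straight through; a sketch, assuming the same a and b aliases as in the question:
def whereHelper(authorId: Option[Long], bookIds: Seq[Long]) =
  // the author_id comparison is inhibited (dropped) when authorId is None
  (a.author_id === authorId.?) and (b.bookId in bookIds)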
Edit after chat:
But since you want to receive a params: Map[String, Seq[String]] in your model and are just having trouble getting the authorId, here is what you can do:
def whereHelper(params: Map[String, Seq[String]]) = {
  // Here I'm being defensive about the fact that maybe there is no
  // "booksIds" key in the map. So, if there is not, a Seq.empty
  // will be returned. The map method will run only if there is
  // something in the Seq.
  val booksIds = params.getOrElse("booksIds", Seq.empty).map(_.toLong)

  // The same defensive approach is used here, also taking
  // the head as an Option, so if the Seq is empty a None will be
  // returned. Again, the map will be executed only if the Option
  // is a Some, returning another Some with the value as a Long.
  val authorId = params.getOrElse("authorId", Seq.empty).headOption

  authorId.map(_.toLong) match {
    case Some(author_id) =>
      (a.author_id === author_id) and
      (b.bookId in booksIds)
    case None =>
      (b.bookId in booksIds)
  }
}
Of course, the more parameters you have, the more complicated this method will be.
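To keep that growth under control, you could factor the defensive lookups into small helpers; a sketch with hypothetical helper names (not part of the original answer):
import scala.util.Try

// all values of a repeated parameter, parsed as Long, silently dropping bad input
def longs(params: Map[String, Seq[String]], key: String): Seq[Long] =
  params.getOrElse(key, Seq.empty).flatMap(s => Try(s.toLong).toOption)

// the first value of a parameter, if present and numeric
def firstLong(params: Map[String, Seq[String]], key: String): Option[Long] =
  longs(params, key).headOption

// usage: val bookIds = longs(params, "bookId"); val authorId = firstLong(params, "authorId")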

What `JObject(rec) <- someJArray` means inside for-comprehension

I'm learning Json4s library.
I have a json fragment like this:
{
  "records": [
    {
      "name": "John Derp",
      "address": "Jem Street 21"
    },
    {
      "name": "Scala Jo",
      "address": "in my sweet dream"
    }
  ]
}
And, I have Scala code, which converts a json string into a List of Maps, like this:
import org.json4s._
import org.json4s.JsonAST._
import org.json4s.native.JsonParser
val json = JsonParser.parse( """{"records":[{"name":"John Derp","address":"Jem Street 21"},{"name":"Scala Jo","address":"in my sweet dream"}]}""")
val records: List[Map[String, Any]] = for {
  JObject(rec) <- json \ "records"
  JField("name", JString(name)) <- rec
  JField("address", JString(address)) <- rec
} yield Map("name" -> name, "address" -> address)
println(records)
The output of records to screen gives this:
List(Map(name -> John Derp, address -> Jem Street 21), Map(name ->
Scala Jo, address -> in my sweet dream))
I want to understand what the lines inside the for loop mean. For example, what is the meaning of this line:
JObject(rec) <- json \ "records"
I understand that json \ "records" produces a JArray object, but why is it fetched as JObject(rec) on the left of <-? What is the meaning of the JObject(rec) syntax? Where does the rec variable come from? Does JObject(rec) mean instantiating a new JObject class from the rec input?
BTW, I have a Java programming background, so it would also be helpful if you could show me the Java equivalent code for the loop above.
You have the following type hierarchy:
sealed abstract class JValue {
  def \(nameToFind: String): JValue = ???
  def filter(p: (JValue) => Boolean): List[JValue] = ???
}

case class JObject(val obj: List[JField]) extends JValue
case class JField(val name: String, val value: JValue) extends JValue
case class JString(val s: String) extends JValue

case class JArray(val arr: List[JValue]) extends JValue {
  override def filter(p: (JValue) => Boolean): List[JValue] =
    arr.filter(p)
}
Your JSON parser returns the following object:
object JsonParser {
  def parse(s: String): JValue = {
    new JValue {
      override def \(nameToFind: String): JValue =
        JArray(List(
          JObject(List(
            JField("name", JString("John Derp")),
            JField("address", JString("Jem Street 21")))),
          JObject(List(
            JField("name", JString("Scala Jo")),
            JField("address", JString("in my sweet dream"))))))
    }
  }
}
val json = JsonParser.parse("Your JSON")
Under the hood the Scala compiler generates the following:
val res = (json \ "records")
  .filter(_.isInstanceOf[JObject])
  .flatMap { x =>
    x match {
      case JObject(obj) =>
        obj
          .withFilter(f => f match {
            case JField("name", _) => true
            case _ => false
          })
          .flatMap(n => obj.withFilter(f => f match {
            case JField("address", _) => true
            case _ => false
          }).map(a => Map(
            "name" -> (n.value match { case JString(name) => name }),
            "address" -> (a.value match { case JString(address) => address }))))
    }
  }
The first line, JObject(rec) <- json \ "records", is possible because JArray.filter returns List[JValue] (a List[JObject] here). Each value of the List[JValue] is then matched against JObject(rec) with pattern matching.
The rest of the calls are a series of flatMap and map (this is how Scala for comprehensions work) combined with pattern matching.
I used Scala 2.11.4.
Of course, the match expressions above are implemented using a series of type checks and casts.
UPDATE:
When you use the Json4s library there is an implicit conversion from JValue to org.json4s.MonadicJValue. See the json4s package object:
implicit def jvalue2monadic(jv: JValue) = new MonadicJValue(jv)
This conversion is used here: JObject(rec) <- json \ "records". First, json is converted to MonadicJValue, then def \("records") is applied, then def filter is used on the result of def \ (which is a JValue), so it is again implicitly converted to MonadicJValue and def filter of MonadicJValue is used. The result of MonadicJValue.filter is List[JValue]. After that, the steps described above are performed.
You are using a Scala for comprehension and I believe much of the confusion is about how for comprehensions work. This is Scala syntax for accessing the map, flatMap and filter methods of a monad in a concise way for iterating over collections. You will need some understanding of monads and for comprehensions in order to fully comprehend this. The Scala documentation can help, and so will a search for "scala for comprehension". You will also need to understand about extractors in Scala.
You asked about the meaning of this line:
JObject(rec) <- json \ "records"
This is part of the for comprehension.
Your statement:
I understand that the json \ "records" produces a JArray object,
is slightly incorrect. The \ function extracts a List[JObject] from the parser result, json.
but why is it fetched as JObject(rec) on the left of <-?
The json \ "records" uses the json4s extractor \ to select the "records" member of the Json data and yield a List[JObject]. The <- can be read as "is taken from" and implies that you are iterating over the list. The elements of the list have type JObject and the construct JObject(rec) applies an extractor to create a value, rec, that holds the content of the JObject (its fields).
how come it's fetched as JObject(rec) on the left of <-?
That is the Scala syntax for iterating over a collection. For example, we could also write:
for (x <- 1 to 10)
which would simply give us the values of 1 through 10 in x. In your example, we're using a similar kind of iteration but over the content of a list of JObjects.
What is the meaning of the JObject(rec)?
This is a Scala extractor. If you look in the json4s code you will find that JObject is defined like this:
case class JObject(obj: List[JField]) extends JValue
When we have a case class in Scala there are two methods defined automatically: apply and unapply. The meaning of JObject(rec) then is to invoke the unapply method and produce a value, rec, that corresponds to the value obj in the JObject constructor (apply method). So, rec will have the type List[JField].
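Roughly speaking, the companion object the compiler generates looks like this sketch:
object JObject {
  def apply(obj: List[JField]): JObject = new JObject(obj)
  // this is what a JObject(rec) pattern calls under the hood
  def unapply(jo: JObject): Option[List[JField]] = Some(jo.obj)
}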
Where does the rec variable come from?
It comes from simply using it and is declared as a placeholder for the obj parameter to JObject's apply method.
Does JObject(rec) mean instantiating new JObject class from rec input?
No, it doesn't. It comes about because the JArray resulting from json \ "records" contains only JObject values.
So, to interpret this:
JObject(rec) <- json \ "records"
we could write the following pseudo-code in english:
Find the "records" in the parsed json as a JArray and iterate over them. The elements of the JArray should be of type JObject. Pull the "obj" field of each JObject as a list of JField and assign it to a value named "rec".
Hopefully that makes all this a bit clearer?
it's also helpful if you can show me the Java equivalent code for the loop above.
That could be done, of course, but it is far more work than I'm willing to contribute here. One thing you could do is compile the code with Scala, find the associated .class files, and decompile them as Java. That might be quite instructive for you to learn how much Scala simplifies programming over Java. :)
Why can't I do this: for ( rec <- json \ "records" ), so rec becomes a JObject? What is the reason for JObject(rec) on the left of <-?
You could! However, you'd then need to get the contents of the JObject. You could write the for comprehension this way:
val records: List[Map[String, Any]] = for {
  obj: JObject <- json \ "records"
  rec = obj.obj
  JField("name", JString(name)) <- rec
  JField("address", JString(address)) <- rec
} yield Map("name" -> name, "address" -> address)
It would have the same meaning, but it is longer.
I just want to understand what the N(x) pattern means, because I have only ever seen the for (x <- y) pattern before.
As explained above, this is an extractor which is simply the use of the unapply method which is automatically created for case classes. A similar thing is done in a case statement in Scala.
UPDATE:
The code you provided does not compile for me against version 3.2.11 of json4s-native. This import:
import org.json4s.JsonAST._
is redundant with this import:
import org.json4s._
such that JObject is defined twice. If I remove the JsonAST import then it compiles just fine.
To test this out a little further, I put your code in a scala file like this:
package example
import org.json4s._
// import org.json4s.JsonAST._
import org.json4s.native.JsonParser
class ForComprehension {
  val json = JsonParser.parse(
    """{
      |"records":[
      |{"name":"John Derp","address":"Jem Street 21"},
      |{"name":"Scala Jo","address":"in my sweet dream"}
      |]}""".stripMargin
  )

  val records: List[Map[String, Any]] = for {
    JObject(rec) <- json \ "records"
    JField("name", JString(name)) <- rec
    JField("address", JString(address)) <- rec
  } yield Map("name" -> name, "address" -> address)

  println(records)
}
and then started a Scala REPL session to investigate:
scala> import example.ForComprehension
import example.ForComprehension
scala> val x = new ForComprehension
List(Map(name -> John Derp, address -> Jem Street 21), Map(name -> Scala Jo, address -> in my sweet dream))
x: example.ForComprehension = example.ForComprehension#5f9cbb71
scala> val obj = x.json \ "records"
obj: org.json4s.JValue = JArray(List(JObject(List((name,JString(John Derp)), (address,JString(Jem Street 21)))), JObject(List((name,JString(Scala Jo)), (address,JString(in my sweet dream))))))
scala> for (a <- obj) yield { a }
res1: org.json4s.JValue = JArray(List(JObject(List((name,JString(John Derp)), (address,JString(Jem Street 21)))), JObject(List((name,JString(Scala Jo)), (address,JString(in my sweet dream))))))
scala> import org.json4s.JsonAST.JObject
import org.json4s.JsonAST.JObject

scala> for ( JObject(rec) <- obj ) yield { rec }
res2: List[List[org.json4s.JsonAST.JField]] = List(List((name,JString(John Derp)), (address,JString(Jem Street 21))), List((name,JString(Scala Jo)), (address,JString(in my sweet dream))))
So:
- You are correct, the result of the \ operator is a JArray
- The "iteration" over the JArray just treats the entire array as the only value in the list
- There must be an implicit conversion from JArray to JObject that permits the extractor to yield the contents of the JArray as a List[JField]
- Once everything is a List, the for comprehension proceeds as normal
Hope that helps with your understanding of this.
For more on pattern matching within assignments, try this blog
UPDATE #2:
I dug around a little more to discover the implicit conversion at play here. The culprit is the \ operator. To understand how json \ "records" turns into a monadic iterable thing, you have to look at this code:
org.json4s package object: This line declares an implicit conversion from JValue to MonadicJValue. So what's a MonadicJValue?
org.json4s.MonadicJValue: This defines all the things that make JValues iterable in a for comprehension: filter, map, flatMap and also provides the \ and \\ XPath-like operators
So, essentially, the use of the \ operator results in the following sequence of actions:
- implicitly convert the json (JValue) into MonadicJValue
- Apply the \ operator in MonadicJValue to yield a JArray (the "records")
- implicitly convert the JArray into MonadicJValue
- Use the MonadicJValue.filter and MonadicJValue.map methods to implement the for comprehension
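Spelled out in code, those steps look roughly like this sketch (writing the jvalue2monadic conversion explicitly):
import org.json4s._

val monadic: MonadicJValue = jvalue2monadic(json)   // implicit conversion made explicit
val records: JValue = monadic \ "records"           // the JArray
val objects: List[JValue] = jvalue2monadic(records).filter(_.isInstanceOf[JObject])
// the for comprehension then pattern-matches and maps over `objects`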
Just a simplified example of how the for-comprehension works here:
scala> trait A
defined trait A
scala> case class A2(value: Int) extends A
defined class A2
scala> case class A3(value: Int) extends A
defined class A3
scala> val a = List(1,2,3)
a: List[Int] = List(1, 2, 3)
scala> val a: List[A] = List(A2(1),A3(2),A2(3))
a: List[A] = List(A2(1), A3(2), A2(3))
So here is just:
scala> for(A2(rec) <- a) yield rec //will return and unapply only A2 instances
res34: List[Int] = List(1, 3)
Which is equivalent to:
scala> a.collect{case A2(rec) => rec}
res35: List[Int] = List(1, 3)
collect is based on filter, so it's enough to have a filter method, which JValue has.
P.S. There is no foreach on JValue, so for(rec <- json \ "records") rec won't work. But there is map, so for(rec <- json \ "records") yield rec will.
If you need your for without pattern matching:
for {
  rec <- (json \ "records").filter(_.isInstanceOf[JObject]).map(_.asInstanceOf[JObject])
  rcobj = rec.obj
  name <- rcobj if name._1 == "name"
  address <- rcobj if address._1 == "address"
  nm = name._2.asInstanceOf[JString].s
  vl = address._2.asInstanceOf[JString].s
} yield Map("name" -> nm, "address" -> vl)
res27: List[scala.collection.immutable.Map[String,String]] = List(Map(name -> John Derp, address -> Jem Street 21), Map(name -> Scala Jo, address -> in my sweet dream))

Jerkson. Serializing map to JsValue

I am using the built-in Jerkson with Play Framework 2, and all I want is to serialize a map containing values of different types:
object AppWriter extends Writes[Application] {
  def writes(app: Application): JsValue = {
    Json.toJson(Map(
      "id" -> app.getId.toString,
      "name" -> app.getName,
      "users" -> Seq(1, 2, 3)
    ))
  }
}
In this case I have:
No Json deserializer found for type scala.collection.immutable.Map[java.lang.String,java.lang.Object].
Try to implement an implicit Writes or Format for this type
Navigating through the framework code shows that there is a serializer for Map[String, V] types (implicit def mapWrites[V] ...), but I cannot understand why it isn't applied.
Can anybody help me?
UPD: I found a simple workaround:
object AppWriter extends Writes[Application] {
  def writes(app: Application): JsValue = {
    Json.toJson(Map[String, JsValue](
      "id" -> JsNumber(BigDecimal(app.getId)),
      "name" -> JsString(app.getName),
      "users" -> JsArray(Seq(1, 2, 3).map(x => JsNumber(x)))
    ))
  }
}
but this is not so elegant...
The standard way to do this is by creating a JsObject from the individual key-value pairs for the fields—not by putting the pairs into a map. For example, assuming your Application looks like this:
case class Application(getId: Int, getName: String)
You could write:
import play.api.libs.json._, Json.toJson
implicit object AppWriter extends Writes[Application] {
  def writes(app: Application): JsValue = JsObject(
    Seq(
      "id" -> JsNumber(app.getId),
      "name" -> JsString(app.getName),
      "users" -> toJson(Seq(1, 2, 3))
    )
  )
}
And then:
scala> toJson(Application(1, "test"))
res1: play.api.libs.json.JsValue = {"id":1,"name":"test","users":[1,2,3]}
Note that you don't need to spell out how to serialize the Seq[Int]—the default Format instances will do the work for you.
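If your Play version has the Json.obj helper (later 2.x releases do), a slightly shorter sketch of the same Writes is possible; each value goes through its own implicit Writes, which is exactly what the heterogeneous Map[String, Object] could not provide:
import play.api.libs.json._

implicit object AppWriter extends Writes[Application] {
  def writes(app: Application): JsValue = Json.obj(
    "id" -> app.getId,      // Writes[Int]
    "name" -> app.getName,  // Writes[String]
    "users" -> Seq(1, 2, 3) // Writes[Seq[Int]]
  )
}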

Play! framework 2.0: Validate field in forms using other fields

In the Play framework, using Scala, say that I have a form such as the following:
import play.api.data._
import play.api.data.Forms._
import play.api.data.validation.Constraints._
case class User(someStringField: String, someIntField: Int)
val userForm = Form(
  mapping(
    "someStringField" -> text,
    "someIntField" -> number verifying (x => SomeMethodThatReceivesAnIntAndReturnsABoolean(x))
  )(User.apply)(User.unapply)
)
where SomeMethodThatReceivesAnIntAndReturnsABoolean is a method that performs some logic on the int to validate it.
However, I would like to be able to consider the value of someStringField when validating someIntField. Is there a way to achieve this in Play framework's forms? I know that I can do something like:
val userForm = Form(
  mapping(
    "someStringField" -> text,
    "someIntField" -> number
  )(User.apply)(User.unapply)
    .verifying(x => SomeFunctionThatReceivesAnUserAndReturnsABoolean(x))
)
and then I would have the entire user instance passed to the validation function. The problem with that approach is that the resulting error would be associated with the entire form instead of being associated with the someIntField field.
Is there a way to get both things: validate a field using another field, and keep the error associated with the specific field I wish to validate instead of with the entire form?
I have the same requirement of adding validation to fields depending on the value of other fields. I'm not sure how this is done in idiomatic Play 2.2.1, but I came up with the following solution. In this usage I'm degrading the built-in "mapping" into a simple type converter and applying my "advanced inter-field" validation in the "validateForm" method. The mapping:
val userForm = Form(
  mapping(
    "id" -> optional(longNumber),
    "surename" -> text,
    "forename" -> text,
    "username" -> text,
    "age" -> number
  )(User.apply)(User.unapply)
)
private def validateForm(form: Form[User]) = {
  if (form("username").value.get == "tom" || form("age").value.get == "38") {
    form
      .withError("forename", "tom - forename error")
      .withError("surename", "tom - surename error")
  } else {
    form
  }
}
def update = Action { implicit request =>
  userForm.bindFromRequest.fold({ formWithErrors =>
    BadRequest(users.edit(validateForm(formWithErrors)))
  }, { user =>
    val theForm = validateForm(userForm.fill(user))
    if (theForm.hasErrors) {
      BadRequest(users.edit(theForm))
    } else {
      Users.update(user)
      Redirect(routes.UsersController.index).flashing("notice" -> s"${user.forename} updated!")
    }
  })
}
Even though it works, I'm urgently searching for a more idiomatic version...
EDIT: Use a custom play.api.data.format.Formatter in idiomatic Play (more at http://workwithplay.com/blog/2013/07/10/advanced-forms-techniques/); this lets you programmatically add errors to a form. My Formatter looks like this:
val usernameFormatter = new Formatter[String] {
  override def bind(key: String, data: Map[String, String]): Either[Seq[FormError], String] = {
    // "data" lets you access all form data values
    val age = data.get("age").get
    val username = data.get("username").get
    if (age == "66") {
      Left(List(FormError("username", "invalid"), FormError("forename", "invalid")))
    } else {
      Right(username)
    }
  }

  override def unbind(key: String, value: String): Map[String, String] = {
    Map(key -> value)
  }
}
Registered in the form mapping like this:
mapping(
[...]
"username" -> of(usernameFormatter),
[....]
I believe what you're looking for is play.api.data.validation.Constraint.
Say you have a RegisterForm with a list of predefined cities and an otherCity field and you need either the cities or otherCity to be supplied, i.e., otherCity should be validated if cities is not provided:
case class RegisterForm(
  email: String,
  password: String,
  cities: Option[List[String]],
  otherCity: Option[String]
)
You can write a custom Constraint around this:
val citiesCheckConstraint: Constraint[RegisterForm] = Constraint("constraints.citiescheck")({ registerForm =>
  // you have access to all the fields in the form here and can
  // write complex logic here
  if (registerForm.cities.isDefined || registerForm.otherCity.isDefined) {
    Valid
  } else {
    Invalid(Seq(ValidationError("City must be selected")))
  }
})
And your form definition becomes:
val registerForm = Form(
  mapping(
    "email" -> nonEmptyText.verifying(emailCheckConstraint),
    "password" -> nonEmptyText.verifying(passwordCheckConstraint),
    "cities" -> optional(list(text)),
    "other_city" -> optional(text)
  )(RegisterForm.apply)(RegisterForm.unapply).verifying(citiesCheckConstraint)
)
In this example emailCheckConstraint and passwordCheckConstraint are additional custom constraints that I defined similar to citiesCheckConstraint. This works in Play 2.2.x.
UPDATE:
Works on Play 2.3.8 as well.
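For completeness, emailCheckConstraint and passwordCheckConstraint are not shown above; one of them could look something like this sketch, built the same way as citiesCheckConstraint (the actual rule is up to you):
val emailCheckConstraint: Constraint[String] = Constraint("constraints.emailcheck")({ email =>
  // purely illustrative rule
  if (email.contains("@")) Valid
  else Invalid(Seq(ValidationError("A valid email address is required")))
})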
If you don't mind having a prefix for your params you can group the related params:
val aForm = Form(
  mapping(
    "prefix" -> tuple(
      "someStringField" -> text,
      "someIntField" -> number
    ) verifying (tup => your verification)
  )(tup => User.apply(tup._1, tup._2)(User.unapply...)
I use something similar just without the surrounding mapping.
You will have to adjust the apply/unapply a little and pass the arguments manually for it to compile.
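A sketch of what that adjustment might look like, reusing the User class and the verification function from the question:
val aForm = Form(
  mapping(
    "prefix" -> tuple(
      "someStringField" -> text,
      "someIntField" -> number
    ).verifying("inter-field check failed",
      tup => SomeFunctionThatReceivesAnUserAndReturnsABoolean(User(tup._1, tup._2)))
  )(tup => User(tup._1, tup._2))(user => Some((user.someStringField, user.someIntField)))
)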
The error will be registered to the "prefix" group.
I also find it weird that you cannot register errors on any field you'd like using FormError when verifying the form...
Thanks to Tom Myer, here is what I used:
class MatchConstraint[A](val targetField: String, val map: (String, Map[String, String]) => A, val unmap: A => String) extends Formatter[A] {
  override def bind(key: String, data: Map[String, String]): Either[Seq[FormError], A] = {
    val first = data.getOrElse(key, "")
    val second = data.getOrElse(targetField, "")
    if (first == "" || !first.equals(second)) {
      Left(List(FormError(key, "Not Match!")))
    } else {
      Right(map(key, data))
    }
  }

  override def unbind(key: String, value: A): Map[String, String] = Map(key -> unmap(value))
}
And here is what my form looks like:
val registerForm = Form(
  mapping(
    "email" -> email.verifying(minLength(6)),
    "password" -> text(minLength = 6),
    "passwordConfirmation" -> of(new MatchConstraint[String]("password", (key, data) => data.getOrElse(key, ""), str => str))
  )(RegisterData.apply)(RegisterData.unapply)
)
I guess that they map the Scala code to JSR validation, where it's definitely not possible. There are some arguments for this, mainly that validation should be simple and not involve complex logic. However, I still miss this too. OVal from Play 1 was better for me.
In the documentation (Play Framework Documentation) you can see the following code:
val userFormConstraintsAdHoc = Form(
  mapping(
    "name" -> text,
    "age" -> number
  )(UserData.apply)(UserData.unapply) verifying("Failed form constraints!", fields => fields match {
    case userData => validate(userData.name, userData.age).isDefined
  })
)
Basically, just add verifying after the unapply and you have all the fields mapped, so you can do a more complete validation.
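The validate helper there is defined elsewhere in the documentation; purely as an illustration, it might be something along these lines:
// hypothetical stand-in: return Some when the combination of fields is acceptable, None otherwise
def validate(name: String, age: Int): Option[UserData] =
  if (name.nonEmpty && age >= 18) Some(UserData(name, age)) else None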