I have the following Input Objects:
import sangria.schema._

val BusinessInputType = InputObjectType[BusinessInput]("BusinessInput", List(
  InputField("userId", StringType),
  InputField("name", StringType),
  InputField("address", OptionInputType(StringType)),
  InputField("phonenumber", OptionInputType(StringType)),
  InputField("email", OptionInputType(StringType)),
  InputField("hours", ListInputType(BusinessHoursInputType))
))

val BusinessHoursInputType = InputObjectType[BusinessHoursInput]("hours", List(
  InputField("weekDay", IntType),
  InputField("startTime", StringType),
  InputField("endTime", StringType)
))
And here are my models with custom Marshalling defined:
import java.sql.Time

import sangria.marshalling.{CoercedScalaResultMarshaller, FromInput}

case class BusinessInput(userId: String, name: String, address: Option[String], phonenumber: Option[String], email: Option[String], hours: Seq[BusinessHoursInput])

object BusinessInput {
  implicit val manual = new FromInput[BusinessInput] {
    val marshaller = CoercedScalaResultMarshaller.default
    def fromResult(node: marshaller.Node) = {
      val ad = node.asInstanceOf[Map[String, Any]]
      System.out.println(ad)
      BusinessInput(
        userId = ad("userId").asInstanceOf[String],
        name = ad("name").asInstanceOf[String],
        address = ad.get("address").flatMap(_.asInstanceOf[Option[String]]),
        phonenumber = ad.get("phonenumber").flatMap(_.asInstanceOf[Option[String]]),
        email = ad.get("email").flatMap(_.asInstanceOf[Option[String]]),
        hours = ad("hours").asInstanceOf[Seq[BusinessHoursInput]]
      )
    }
  }
}

case class BusinessHoursInput(weekDay: Int, startTime: Time, endTime: Time)

object BusinessHoursInput {
  implicit val manual = new FromInput[BusinessHoursInput] {
    val marshaller = CoercedScalaResultMarshaller.default
    def fromResult(node: marshaller.Node) = {
      val ad = node.asInstanceOf[Map[String, Any]]
      System.out.println("HEY")
      BusinessHoursInput(
        weekDay = ad("weekDay").asInstanceOf[Int],
        startTime = Time.valueOf(ad("startTime").asInstanceOf[String]),
        endTime = Time.valueOf(ad("endTime").asInstanceOf[String])
      )
    }
  }
}
My question is: when I have a nested InputObject with custom marshalling, I don't see the marshalling of BusinessHoursInput being invoked before BusinessInput is marshalled. I noticed this because the "HEY" print statement is never executed before the print of "ad" in BusinessInput. This causes problems later when I try to insert the hours field of BusinessInput into the DB, because it cannot be cast to a BusinessHoursInput object. In Sangria, is it not possible to custom-marshal nested objects before the parent object is marshalled?
You are probably using BusinessInput as an argument type. The actual implicit lookup takes place at Argument definition time, and only for the BusinessInput type.
Since FromInput is type-class-based deserialization, you need to explicitly define the dependency between the deserializers of nested objects. For example, you can rewrite the deserializer like this:
import sangria.marshalling.{CoercedScalaResultMarshaller, FromInput}

case class BusinessInput(userId: String, name: String, address: Option[String], phonenumber: Option[String], email: Option[String], hours: Seq[BusinessHoursInput])

object BusinessInput {
  implicit def manual(implicit hoursFromInput: FromInput[BusinessHoursInput]) = new FromInput[BusinessInput] {
    val marshaller = CoercedScalaResultMarshaller.default
    def fromResult(node: marshaller.Node) = {
      val ad = node.asInstanceOf[Map[String, Any]]
      BusinessInput(
        userId = ad("userId").asInstanceOf[String],
        name = ad("name").asInstanceOf[String],
        address = ad.get("address").flatMap(_.asInstanceOf[Option[String]]),
        phonenumber = ad.get("phonenumber").flatMap(_.asInstanceOf[Option[String]]),
        email = ad.get("email").flatMap(_.asInstanceOf[Option[String]]),
        // Deserialize every element of the raw list with the nested FromInput
        hours = ad("hours").asInstanceOf[Seq[Any]].map(h =>
          hoursFromInput.fromResult(h.asInstanceOf[hoursFromInput.marshaller.Node]))
      )
    }
  }
}
In this version, I'm taking advantage of the existing FromInput[BusinessHoursInput] to deserialize each BusinessHoursInput from the raw input.
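The implicit chain is resolved where the argument is defined, so both companion objects just need to be visible at that point. A minimal sketch (the argument name "business" is my own example):

import sangria.schema.Argument

// Resolving FromInput[BusinessInput] here pulls in FromInput[BusinessHoursInput] as a dependency.
val BusinessArg = Argument("business", BusinessInputType)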
As an alternative, you can avoid defining manual FromInput deserializers altogether by taking advantage of existing JSON-based deserializers. In most cases, circe's automatic derivation works just fine. You just need these two imports (in the file where you define the arguments):
import sangria.marshalling.circe._
import io.circe.generic.auto._
Those imports put the appropriate FromInput instances into scope. These instances take advantage of circe's own deserialization mechanism.
import io.circe.Decoder
import io.circe.generic.semiauto.deriveDecoder
import sangria.macros.derive.deriveInputObjectType
import sangria.marshalling.circe._
import sangria.schema.{Argument, InputObjectType}

object XXX {
  // When you have a Decoder for all the field types in the case class (Int, String here), you can derive it.
  case class BusinessHoursInput(weekDay: Int, startTime: String, endTime: String)

  object BusinessHoursInput {
    implicit val decoder: Decoder[BusinessHoursInput] = deriveDecoder
    implicit val inputType: InputObjectType[BusinessHoursInput] = deriveInputObjectType[BusinessHoursInput]()
  }

  // The same here: you also need the InputObjectType for BusinessHoursInput in scope.
  case class BusinessInput(userId: String, name: String, address: Option[String], phonenumber: Option[String], email: Option[String], hours: Seq[BusinessHoursInput])

  object BusinessInput {
    implicit val decoder: Decoder[BusinessInput] = deriveDecoder
    implicit val inputType: InputObjectType[BusinessInput] = deriveInputObjectType[BusinessInput]()
  }

  // For this to work you need an InputObjectType[BusinessInput] and a FromInput[BusinessInput] in scope.
  // The FromInput comes for free from the Decoder plus the import sangria.marshalling.circe._
  private val businessInputArg = Argument("businessInput", BusinessInput.inputType)
}
If you do not use circe but a different JSON library, you will of course need that library's type classes and the corresponding marshalling import in scope.
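For instance, with play-json the equivalent would look roughly like this (a sketch, assuming the sangria-play-json marshalling module is on the classpath and the String-based case classes from the snippet above):

import play.api.libs.json.{Json, Reads}
import sangria.marshalling.playJson._

// Reads play the role of circe's Decoders; the import above turns them into FromInput instances.
implicit val hoursReads: Reads[BusinessHoursInput] = Json.reads[BusinessHoursInput]
implicit val businessReads: Reads[BusinessInput] = Json.reads[BusinessInput]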
Related
New to Scala and Play and running into a problem where I have the following:
case class Media(
  name: String,
  id: Id,
  `type`: String,
  properties: PodcastProperties
)

object Media {
  implicit val format: OFormat[Media] = Json.format[Media]
}

case class PodcastProperties(
  x: Int,
  y: DateTime,
  z: String
)

object PodcastProperties {
  implicit val format: OFormat[PodcastProperties] = Json.format[PodcastProperties]
}
Say I want to define Media to accept different media types. Let's say I have a JSON Media object whose type is "newspaper", and its properties should be parsed using NewspaperProperties:
case class NewspaperProperties(
  Title: String,
  Publisher: String
)

object NewspaperProperties {
  implicit val format: OFormat[NewspaperProperties] = Json.format[NewspaperProperties]
}
How can I define Media so that it can parse the "type" field and then read the "properties" field using the right JSON parser?
You need to define the media properties as a sealed family.
import play.api.libs.json._
import java.time.OffsetDateTime

sealed trait MediaProperties

case class NewspaperProperties(
  title: String,     // prefer 'title' over 'Title'; initial capitals are unidiomatic for field names
  publisher: String  // ... likewise 'publisher' rather than 'Publisher'
) extends MediaProperties

object NewspaperProperties {
  implicit val format: OFormat[NewspaperProperties] = Json.format[NewspaperProperties]
}

case class PodcastProperties(
  x: Int,
  y: OffsetDateTime,
  z: String
) extends MediaProperties

object PodcastProperties {
  implicit val format: OFormat[PodcastProperties] = Json.format[PodcastProperties]
}
Then an OFormat can be materialized for MediaProperties.
implicit val mediaPropertiesFormat: OFormat[MediaProperties] = Json.format
This manages a discriminator in the JSON representation (by default the _type field; the naming can be configured).
val props1: MediaProperties = PodcastProperties(1, OffsetDateTime.now(), "z")

val obj1 = Json.toJson(props1)
// > JsValue = {"_type":"PodcastProperties","x":1,"y":"2020-11-23T22:53:35.301603+01:00","z":"z"}

obj1.validate[MediaProperties]
// > JsResult[MediaProperties] = JsSuccess(PodcastProperties(1,2020-11-23T23:02:24.752063+01:00,z),)
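If the default _type name does not suit you, the discriminator can be configured with a JsonConfiguration defined before the format is materialized (a sketch, assuming play-json 2.7+; "mediaType" is just an example name):

implicit val jsonConfig = JsonConfiguration(discriminator = "mediaType")

implicit val mediaPropertiesFormat: OFormat[MediaProperties] = Json.format
// The discriminator field is now written as "mediaType" instead of "_type".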
The implicit format for MediaProperties should probably be defined in the companion object MediaProperties.
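That is:

object MediaProperties {
  implicit val format: OFormat[MediaProperties] = Json.format
}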
Then the format for Media can be materialized automatically.
final class Id(val value: String) extends AnyVal

object Id {
  implicit val format: Format[Id] = Json.valueFormat
}

case class Media(
  name: String,
  id: Id,
  // type: String, -- not needed for the JSON representation
  properties: MediaProperties
)

object Media {
  implicit val format: OFormat[Media] = Json.format[Media] // <--- HERE
}
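A quick round-trip to illustrate (the sample values are my own):

val newspaper = Media("The Morning Post", new Id("media-1"), NewspaperProperties("Front page", "Some publisher"))

val json = Json.toJson(newspaper)
// roughly: {"name":"The Morning Post","id":"media-1","properties":{"_type":"NewspaperProperties","title":"Front page","publisher":"Some publisher"}}

json.validate[Media]
// JsSuccess(Media(...),) as long as the discriminator and field names line up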
I'm implementing a client for the Keystone API of Openstack. For the Users I have the following classes:
import java.time.OffsetDateTime

import io.circe.derivation.{deriveDecoder, deriveEncoder, renaming}
import io.circe.{Decoder, Encoder}

object User {
  object Update {
    implicit val encoder: Encoder[Update] = deriveEncoder(renaming.snakeCase)
  }

  case class Update(
    name: Option[String] = None,
    password: Option[String] = None,
    defaultProjectId: Option[String] = None,
    enabled: Option[Boolean] = None,
  )

  implicit val decoder: Decoder[User] = deriveDecoder(renaming.snakeCase)
}

final case class User(
  id: String,
  name: String,
  domainId: String,
  defaultProjectId: Option[String] = None,
  passwordExpiresAt: Option[OffsetDateTime] = None,
  enabled: Boolean = true,
)
User.Update contains the possible fields to update a user. Updates are done using PATCH; in other words, they are partial. The encoders are used in a class that has the methods to create, update, delete, get, and list users. This service class uses the encoders in an http4s EntityEncoder with:
import io.circe.{Encoder, Printer}
import org.http4s.{EntityEncoder, circe}

val jsonPrinter: Printer = Printer.noSpaces.copy(dropNullValues = true)

implicit def jsonEncoder[A: Encoder]: EntityEncoder[F, A] = circe.jsonEncoderWithPrinterOf(jsonPrinter)
My problem is how to implement the update for defaultProjectId. In the final JSON sent to the server the following cases are possible:
Keep the current value of defaultProjectId (the JSON object does not contain the field default_project_id):
{
(...)
}
Change the defaultProjectId to an-id:
{
(...),
"default_project_id: "an-id",
(...)
}
Unset the defaultProjectId:
{
(...),
"default_project_id: null,
(...)
}
The current implementation (defaultProjectId: Option[String] = None plus dropNullValues in the printer) models cases 1 and 2 correctly, but makes case 3 impossible.
Ideally I would have an ADT like:
sealed trait Updatable[+T]
case object KeepExistingValue extends Updatable[Nothing]
case object Unset extends Updatable[Nothing]
case class ChangeTo[T](value: T) extends Updatable[T]
Usage example (probably in the future all fields would be Updatables):
case class Update(
  name: Option[String] = None,
  password: Option[String] = None,
  defaultProjectId: Updatable[String] = KeepExistingValue,
  enabled: Option[Boolean] = None,
)
But I can't find a clean way to encode this ADT. Attempted solutions and their problems (they all require not using the printer with dropNullValues in the update method):
Unset is special:
// The generic implementation of Updatable
implicit def updatableEncoder[T](implicit valueEncoder: Encoder[T]): Encoder[Updatable[T]] = {
  case KeepExistingValue => Json.Null
  case Unset => Json.fromString(Unset.getClass.getName) // or another arbitrary marker value
  case ChangeTo(value) => valueEncoder(value)
}

// In the service class
def nullifyUnsets(obj: JsonObject): JsonObject = obj.mapValues {
  case json if json.asString.contains(Unset.getClass.getName) => Json.Null
  case json => json
}

def update(id: String, update: Update): F[Model] = {
  // updateEncoder is of type Encoder[Update]
  updateEncoder(update).dropNullValues.mapObject(nullifyUnsets)
  (...)
}
Pros:
Using dropNullValues nicely handles the KeepExistingValue case.
If the caller applies dropNullValues to the encoded JSON, the code still works.
Cons:
Because of dropNullValues, handling the Unset case is awkward.
We iterate twice over the JSON object's fields/values: once for dropNullValues and once for mapObject.
Json.fromString(Unset.getClass.getName) is arbitrary and can collide with a legitimate value for T, although that is very unlikely.
KeepExistingValue is special:
// The generic implementation of Updatable
implicit def updatableEncoder[T](implicit valueEncoder: Encoder[T]): Encoder[Updatable[T]] = {
  case KeepExistingValue => Json.fromString(Unset.getClass.getName) // or another arbitrary marker value
  case Unset => Json.Null
  case ChangeTo(value) => valueEncoder(value)
}

// In the service class
def dropKeepExistingValues(obj: JsonObject): JsonObject = obj.filter {
  case (_, json) => !json.asString.contains(Unset.getClass.getName)
}

def update(id: String, update: Update): F[Model] = {
  // updateEncoder is of type Encoder[Update]
  updateEncoder(update).mapObject(dropKeepExistingValues)
  (...)
}
Pros:
Simpler implementation; updatableEncoder maps more directly to the needed JSON.
Just one pass over the JSON object's fields/values.
Cons:
If the caller applies dropNullValues to the encoded JSON, the code stops working.
Json.fromString(Unset.getClass.getName) is arbitrary and can collide with a legitimate value for T, although that is very unlikely.
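For comparison, a hand-written Encoder.AsObject[Update] would avoid the marker string entirely, at the cost of giving up generic derivation (and it still assumes dropNullValues is not applied afterwards). A sketch:

import io.circe.{Encoder, Json, JsonObject}
import io.circe.syntax._

// Fields set to KeepExistingValue are simply never added to the object, so no marker is needed.
def updatableField[T: Encoder](name: String, u: Updatable[T]): List[(String, Json)] = u match {
  case KeepExistingValue => Nil
  case Unset             => List(name -> Json.Null)
  case ChangeTo(value)   => List(name -> value.asJson)
}

implicit val updateEncoder: Encoder.AsObject[Update] = Encoder.AsObject.instance { u =>
  JsonObject.fromIterable(
    u.name.map(n => "name" -> n.asJson).toList ++
    u.password.map(p => "password" -> p.asJson).toList ++
    updatableField("default_project_id", u.defaultProjectId) ++
    u.enabled.map(e => "enabled" -> e.asJson).toList
  )
}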
I'm sure I'm not the first one to hit this problem, but I haven't managed to find anything by searching; the best I got is this comment.
For one of my routes I have an optional parameter, i.e. birthDate: Option[String], and can do this:
GET /rest/api/findSomeone/:firstName/:lastName controllers.PeopleController.findSomeone(firstName: String, lastName: String, birthDate: Option[String])
However, to be more strict with the birthDate optional parameter it would be helpful to specify a regex like this:
$birthDate<([12]\d{3}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01]))>
But since this is an optional parameter I can't find a way to do that. Is this covered in Play 2.7.x? I'm faced with the dilemma of making the birthDate parameter non-optional or leaving it unchecked.
As a side note, I had been trying to integrate routes binding for Joda time, e.g. org.joda.time.LocalDate, by adding the following dependency: https://github.com/tototoshi/play-joda-routes-binder "com.github.tototoshi" %% "play-joda-routes-binder" % "1.3.0", but it didn't work in my project as I get compilation errors after integrating it, so I stashed that approach away for the time being.
For parsing a date, I wouldn't recommend using a regex-based validator at all. Instead you could, for instance, use a custom case class with a query string binder which does type-safe parsing of the incoming parameter:
package models

import java.time.LocalDate
import java.time.format.{DateTimeFormatter, DateTimeParseException}

import play.api.mvc.QueryStringBindable

case class BirthDate(date: LocalDate)

object BirthDate {
  private val dateFormatter: DateTimeFormatter = DateTimeFormatter.ISO_DATE // or whatever date format you're using

  implicit val queryStringBindable = new QueryStringBindable[BirthDate] {
    override def bind(key: String, params: Map[String, Seq[String]]): Option[Either[String, BirthDate]] = {
      params.get(key).flatMap(_.headOption).map { value =>
        try {
          Right(BirthDate(LocalDate.parse(value, dateFormatter)))
        } catch {
          case _: DateTimeParseException => Left(s"$value cannot be parsed as a date!")
        }
      }
    }

    override def unbind(key: String, value: BirthDate): String = {
      s"$key=${value.date.format(dateFormatter)}"
    }
  }
}
Now if you change your routes config so birthDate is a parameter of type Option[BirthDate], you'll get the behaviour you want.
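The route would then look something like this (assuming BirthDate lives in the models package):
GET /rest/api/findSomeone/:firstName/:lastName controllers.PeopleController.findSomeone(firstName: String, lastName: String, birthDate: Option[models.BirthDate])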
If you're insistent on using regexes, you could use a regex-based parser in place of the date formatter and have BirthDate wrap a String instead of a LocalDate, but for the use case presented I really don't see what the advantage of that would be.
EDIT: just for completeness, the regex-based variant:
case class BirthDate(date: String)

object BirthDate {
  private val regex = "([12]\\d{3}-(0[1-9]|1[0-2])-(0[1-9]|[12]\\d|3[01]))".r

  implicit val queryStringBindable = new QueryStringBindable[BirthDate] {
    override def bind(key: String, params: Map[String, Seq[String]]): Option[Either[String, BirthDate]] = {
      params.get(key).flatMap(_.headOption).map { value =>
        regex.findFirstIn(value).map(BirthDate.apply).toRight(s"$value cannot be parsed as a date!")
      }
    }

    override def unbind(key: String, value: BirthDate): String = {
      s"$key=${value.date}"
    }
  }
}
Is it possible to add functionality before calling the primary constructor from an auxiliary constructor in Scala?
Let's say I have a class User and want to take one string, split it into attributes, and send them to the constructor:
class User(val name: String, val age: Int) {
  def this(line: String) = {
    val attrs = line.split(",") // this line causes an error - what can I do instead?
    this(attrs(0), attrs(1).toInt)
  }
}
So I know I'm not able to add a line before the call to this(...), because every auxiliary constructor must call another constructor as its first statement.
Then what can I do instead?
Edit:
I have a long list of attributes, so I don't want to repeat line.split(",")
I think this is a place where a companion object and an apply() method come nicely into play:
object User {
  def apply(line: String): User = {
    val attrs = line.split(",")
    new User(attrs(0), attrs(1).toInt)
  }
}

class User(val name: String, val age: Int)
Then you just create your object the following way:
val u1 = User("Zorro,33")
Also, since you're exposing name and age anyway, you might consider using a case class instead of a standard class, which gives a consistent way of constructing User objects (without the new keyword):
object User {
  def apply(line: String): User = {
    val attrs = line.split(",")
    new User(attrs(0), attrs(1).toInt)
  }
}

case class User(name: String, age: Int)

val u1 = User("Zorro,33")
val u2 = User("Zorro", 33)
Ugly, but working solution #1:
class User(val name: String, val age: Int) {
  def this(line: String) = {
    this(line.split(",")(0), line.split(",")(1).toInt)
  }
}
Ugly, but working solution #2:
class User(val name: String, val age: Int)

object User {
  def fromString(line: String) = {
    val attrs = line.split(",")
    new User(attrs(0), attrs(1).toInt)
  }
}
Which can be used as:
val johny = User.fromString("johny,35")
You could use apply in place of fromString, but that would lead to confusion (in one case you have to use new, in the other you have to drop it), so I prefer to use a different name.
Another ugly solution:
class User(line: String) {
  def this(name: String, age: Int) = this(s"$name,$age")

  val (name, age) = {
    val Array(nameStr, ageStr) = line.split(",")
    (nameStr, ageStr.toInt)
  }
}
But using a method of the companion object is probably better.
In order to implement a RESTful API stack, I need to convert data extracted from a DB to JSON. I think the best way is to extract the data from the DB and then convert the row set to JSON using Json.toJson(), passing a case class as the argument after having defined an implicit serializer (Writes).
Here's my case class and companion object:
package deals.db.interf.slick2

import scala.slick.driver.MySQLDriver.simple._
import play.api.libs.json.Json

case class PartnerInfo(
  id: Int,
  name: String,
  site: String,
  largeLogo: String,
  smallLogo: String,
  publicationSite: String
)

object PartnerInfo {
  def toCaseClass( ?? ) = { // what type are the arguments to be passed?
    PartnerInfo( fx(??) ) // how to transform the input types (Slick) to Scala types?
  }

  // Notice I'm using Slick 2.0.0 RC1
  class PartnerInfoTable(tag: Tag) extends Table[(Int, String, String, String, String, String)](tag, "PARTNER") {
    def id = column[Int]("id")
    def name = column[String]("name")
    def site = column[String]("site")
    def largeLogo = column[String]("large_logo")
    def smallLogo = column[String]("small_logo")
    def publicationSite = column[String]("publication_site")
    def * = (id, name, site, largeLogo, smallLogo, publicationSite)
  }

  val partnerInfos = TableQuery[PartnerInfoTable]

  def qPartnerInfosForPuglisher(publicationSite: String) = {
    for (
      pi <- partnerInfos if ( pi.publicationSite == publicationSite )
    ) yield toCaseClass( _ ) // pass all the table columns to toCaseClass()
  }

  implicit val partnerInfoWrites = Json.writes[PartnerInfo]
}
What I cannot figure out is how to implement the toCaseClass() method in order to transform the column types from Slick 2 into Scala types - note that the function fx() in the body of toCaseClass() is only there for emphasis.
I'm wondering whether it is possible to get the Scala type from the Slick column type, because it is clearly present in the table definition, but I cannot find out how to get it.
Any idea?
I believe the simplest method here would be to map PartnerInfo in the table schema:
class PartnerInfoTable(tag: Tag) extends Table[PartnerInfo](tag, "PARTNER") {
  def id = column[Int]("id")
  def name = column[String]("name")
  def site = column[String]("site")
  def largeLogo = column[String]("large_logo")
  def smallLogo = column[String]("small_logo")
  def publicationSite = column[String]("publication_site")
  def * = (id, name, site, largeLogo, smallLogo, publicationSite) <> (PartnerInfo.tupled, PartnerInfo.unapply)
}

val partnerInfos = TableQuery[PartnerInfoTable]

def qPartnerInfosForPuglisher(publicationSite: String) = {
  for (
    pi <- partnerInfos if ( pi.publicationSite === publicationSite ) // use Slick's === for column comparison
  ) yield pi
}
Otherwise PartnerInfo.tupled should do the trick:
def toCaseClass(pi:(Int, String, String, String, String, String)) = PartnerInfo.tupled(pi)
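To close the loop with the original goal (DB rows to JSON), a rough usage sketch - the session handling and method placement are my assumptions:

import play.api.libs.json.{JsValue, Json}

// With the table mapped to PartnerInfo, the query already yields case class instances,
// so the implicit Writes[PartnerInfo] can serialize them directly.
def partnersAsJson(publicationSite: String)(implicit session: Session): JsValue =
  Json.toJson(qPartnerInfosForPuglisher(publicationSite).list)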