Parse JSON in PureScript with Argonaut

I use the Argonaut library in PureScript to decode and encode JSON.
I cannot figure out how to write an instance that decodes and encodes a JSON field like this:
"field": [3, "text"]
This is an array containing values of different types.
How can I write an instance for it with the Argonaut library?

If you have a fixed number of values of different types, this is generally (in computer science and mathematics) called a "tuple", with a special name for when there are just two of them - a "pair".
JavaScript doesn't have a concept of a tuple, and admittedly it would make little sense in the absence of static types. So traditionally tuples in JavaScript are encoded as arrays.
But PureScript does have such a concept! In the standard library it's called - surprise! - Tuple (and then there are variants for different numbers of elements - Tuple3, Tuple4, and so on).
And Argonaut follows the JavaScript convention: it encodes tuples as arrays. So if you just type your field as a Tuple Int String, it will work:
module Main where
import Prelude
import Data.Argonaut.Decode (JsonDecodeError, decodeJson, parseJson)
import Data.Either (Either)
import Data.Tuple (Tuple)
import Effect (Effect)
import Effect.Console (logShow)

type MyObj = { field :: Tuple Int String }
x :: Either JsonDecodeError MyObj
x = parseJson "{ \"field\": [3, \"text\"] }" >>= decodeJson
main :: Effect Unit
main = logShow x -- prints Right { field: Tuple 3 "text" }

Related

Idiomatic handling of JSON null in scala upickle / ujson

I am new to Scala and would like to learn the idiomatic way to solve common problems, as in pythonic for Python. My question regards reading JSON data with upickle, where the JSON value contains a string when present, and null when not present. I want to use a custom value to replace null. A simple example:
import upickle.default._
val jsonString = """[{"always": "foo", "sometimes": "bar"}, {"always": "baz", "sometimes": null}]"""
val jsonData = ujson.read(jsonString)
for (m <- jsonData.arr) {
  println(m("always").str.length)    // this will work
  println(m("sometimes").str.length) // this will fail: Exception in thread "main" ujson.Value$InvalidData: Expected ujson.Str (data: null)
}
The issue is with the field "sometimes": when null, we cannot apply .str (or any other function mapping to a static type other than null). I am looking for something like m("sometimes").str("DEFAULT").length, where "DEFAULT" is the replacement for null.
Idea 1
Using pattern matching, the following works:
val sometimes = m("sometimes") match {
  case s: ujson.Str => s.str
  case _            => "DEFAULT"
}
println(sometimes.length)
Given Scala's concise syntax, this looks a bit complicated and will be repetitive when done for a number of values.
Idea 2
Answers to a related question mention creating a case class with default values. For my problem, creating a case class seems inflexible when different replacement values are needed depending on context.
Idea 3
Answers to another question (not specific to upickle) discuss using Try().getOrElse(), i.e.:
import scala.util.Try
// ...
println(Try(m("sometimes").str).getOrElse("DEFAULT").length)
However, the discussion mentions that throwing an exception for a regular program path is expensive.
What are idiomatic, yet concise ways to solve this?
The idiomatic Scala way to do this is to use Scala's Option.
Fortunately, upickle's values offer exactly that: see the strOpt method in the source code.
The problem in your code is the str calls in m("always").str and m("sometimes").str.
With this code you are prematurely assuming that all the values are strings. That's where the strOpt method comes in: it returns Some(string) if the value is a string, or None if it is not. We can then chain getOrElse on it to decide what to substitute when the value is None.
The following would be a clean way to handle this:
val jsonString = """[{"always": "foo", "sometimes": "bar"}, {"always": "baz", "sometimes": null}]"""
val jsonData = ujson.read(jsonString)
for (m <- jsonData.arr) {
  println(m("always").strOpt.getOrElse("").length)
  println(m("sometimes").strOpt.getOrElse("").length)
}
Output:
3
3
3
0
Here, if we get any value other than a string (null, a float, an int), the code substitutes an empty string, whose length is calculated as 0.
Basically, this is similar to your "Idea 1" approach, but this is the Scala way. Instead of "DEFAULT" I am substituting an empty string, because you wouldn't want a null value's length to come out as 7 (the length of the string "DEFAULT").
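If you do need different replacement values in different places (the concern raised about Idea 2), wrapping the strOpt-plus-getOrElse pattern in a small helper keeps the call sites terse. A minimal sketch; the helper name strOr is made up for illustration:

// Hypothetical helper: substitute a per-call default for non-string values.
def strOr(v: ujson.Value, default: String): String =
  v.strOpt.getOrElse(default)

// Reusing jsonData from the snippet above:
for (m <- jsonData.arr)
  println(strOr(m("sometimes"), "DEFAULT").length)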

Modelling a JavaScript object in PureScript

I'm trying to model in Purescript the SetOptions data type from Firestore.
Up to now I have the following
import Data.Either.Nested (type (\/))

foreign import data FieldPath :: Type
foreign import buildFieldPath :: Array String -> FieldPath
foreign import fieldNames :: FieldPath -> Array String

type Merge = Boolean -- assumed; Merge is not defined in the original snippet
type MergeFields = Array (String \/ FieldPath)

data SetOptions
  = MergeOption Merge
  | MergeFieldsOption MergeFields
Note that SetOptions is a sum type since the merge and mergeFields are mutually exclusive (even if this is not documented).
Now I need to convert SetOptions into a Javascript object, so that I can pass it to some function from the Javascript firebase library.
It should be something of the form
{
  "mergeFields": [
    "foo",
    new FieldPath("bar", "baz")
  ]
}
My issue is the type of this.
I can't use Object since the contained data are not homogeneous (merge refers to booleans, mergeFields refers to arrays).
I can't use Json because I need to have FieldPath objects in the result.
The only solution I found up to now is returning some Json and then on the javascript side parse it and add the FieldPath objects where needed, but it looks dirty and brittle.
I would probably skip the coproduct SetOptions representation on the PS side and just provide these two "dirty" constructors:
import Unsafe.Coerce (unsafeCoerce)

foreign import data SetOption :: Type

merge :: Boolean -> SetOption
merge m = unsafeCoerce { merge: m }
mergeFields :: MergeFields -> SetOption
mergeFields mf = unsafeCoerce { mergeFields: mf }
I would probably do the same for the MergeFields coproduct.
We are doing something similar in our community project - material ui bindings: purescript-react-basic-mui. Additionally we are grouping these related constructors into records to achieve "cheap namespacing" because we are generating all these bindings from typescript declarations, but this is not really important in this context.
Please take a look at some definitions in this example module:
https://github.com/purescript-react-basic-mui/purescript-react-basic-mui/blob/codegen-read-dts/src/MUI/Core/Badge.purs#L20
EDIT: I think that this latest Discourse thread can be good additional inspiration for you, @macrosh.

How to call Scala method with Map Mutable signature?

I have a Scala code:
import collection.mutable._

def myMethod(mycollection: Map[A, B]) = {
  ...
}
How do I call this method?
Tried this:
myMethod(["test1", "test2"])
Got error:
Identifier expected but 'def' found
Thanks.
A Map is a data structure that maps a key (of some type K) to a value (of some type V). In Scala, such a pair can be denoted by the syntax key -> value. If your intent is to have a single String key "test1" that maps to a String value of "test2", then you can do that as follows:
Map("test1" -> "test2")
Your declaration of myMethod is invalid: you need to either define actual types for A and B or make them generic parameters for your method (so that the method is generic):
// With specific types (assuming both keys and values have String types):
def myMethod(mycollection: Map[String, String]) = //...
// In generic form (allows any key type A, or value type B):
def myMethod[A, B](mycollection: Map[A, B]) = //...
Either way, you can then use the result as the argument in a call to your method as follows:
myMethod(Map("test1" -> "test2"))
Some points to note:
Square brackets are used when defining generic type parameters, or specifying the types used as type parameters.
Type parameters can be inferred from the values supplied. For example Map("test1" -> "test2") uses String as the type for both the key and the value, and is equivalent to Map[String, String]("test1" -> "test2").
If you need more than one key/value pair, list them with a comma separator, for example: Map("key1" -> "value1", "key2" -> "value2", "key3" -> "value3")
I strongly recommend that you read a good book on Scala, such as the excellent Programming in Scala, 3rd Edition by Odersky, Spoon & Venners, in order to become familiar with its syntax and standard library.
As a final point, I would strongly recommend that you use the immutable version of Map whenever possible. If you're not familiar with functional programming principles, this will seem unusual at first, but the benefits are huge.
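To make that recommendation concrete, here is a minimal sketch of the difference (the variable names are illustrative):

// Immutable Map: every update returns a new map; the original is unchanged.
val m1 = Map("test1" -> "test2")
val m2 = m1 + ("test3" -> "test4") // m1 still has a single entry

// Mutable Map: updates happen in place.
import scala.collection.mutable
val m3 = mutable.Map("test1" -> "test2")
m3 += ("test3" -> "test4") // m3 now has two entries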

Pattern matching on List[T] and Set[T] in Scala vs. Haskell: effects of type erasure

Would the Haskell equivalent of the code below produce correct answers?
Can this Scala code be fixed to produce correct answers ? If yes, how ?
object TypeErasurePatternMatchQuestion extends App {
  val li = List(1, 2, 3)
  val ls = List("1", "2", "3")
  val si = Set(1, 2, 3)
  val ss = Set("1", "2", "3")

  def whatIsIt(o: Any) = o match {
    case o: List[Int]    => "List[Int]"
    case o: List[String] => "List[String]"
    case o: Set[Int]     => "Set[Int]"
    case o: Set[String]  => "Set[String]"
  }

  println(whatIsIt(li))
  println(whatIsIt(ls))
  println(whatIsIt(si))
  println(whatIsIt(ss))
}
prints:
List[Int]
List[Int]
Set[Int]
Set[Int]
but I would expect it to print:
List[Int]
List[String]
Set[Int]
Set[String]
You must understand that by typing o as Any you erase all specific information about the type; from that point on, Any is all the compiler knows about the value o. That's why you can then only rely on runtime information about the type.
The case expressions like case o: List[Int] are resolved using the JVM's instanceof runtime mechanism. However, the buggy behaviour you observe is caused by this mechanism taking only the outer type constructor into account (the List in List[Int]) and ignoring its type parameters (the Int in List[Int]). That's why it treats List[Int] as equal to List[String]. This issue is known as generics erasure.
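You can see this directly (a minimal sketch; it compiles with an unchecked warning precisely because the parameter cannot be checked):

val xs: Any = List("a", "b")
// Only the List constructor is checked at runtime; the Int parameter is erased.
println(xs.isInstanceOf[List[Int]]) // prints true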
Haskell on the other hand performs a complete type erasure, which is well explained in the answer by Ben.
So the problem in both languages is the same: we need to provide a runtime information about the type and its parameters.
In Scala you can achieve that using the "reflection" library, which resolves that information implicitly:
import reflect.runtime.{universe => ru}

def whatIsIt[T](o: T)(implicit t: ru.TypeTag[T]) =
  if (t.tpe <:< ru.typeOf[List[Int]])
    "List[Int]"
  else if (t.tpe <:< ru.typeOf[List[String]])
    "List[String]"
  else if (t.tpe <:< ru.typeOf[Set[Int]])
    "Set[Int]"
  else if (t.tpe <:< ru.typeOf[Set[String]])
    "Set[String]"
  else sys.error("Unexpected type")

println(whatIsIt(List("1", "2", "3")))
println(whatIsIt(Set("1", "2", "3")))
Output:
List[String]
Set[String]
Haskell has a very different approach to polymorphism. Above all, it does not have subtype polymorphism (this is not a weakness, though), which is why type-switching pattern matches like the one in your example are simply not applicable. However, it is possible to translate the Scala solution above into Haskell quite closely:
{-# LANGUAGE MultiWayIf, ScopedTypeVariables #-}
import Data.Dynamic
import Data.Set

whatIsIt :: Dynamic -> String
whatIsIt a =
  if | Just (_ :: [Int]) <- fromDynamic a -> "[Int]"
     | Just (_ :: [String]) <- fromDynamic a -> "[String]"
     | Just (_ :: Set Int) <- fromDynamic a -> "Set Int"
     | Just (_ :: Set String) <- fromDynamic a -> "Set String"
     | otherwise -> error "Unexpected type"

main = do
  putStrLn $ whatIsIt $ toDyn ([1, 2, 3] :: [Int])
  putStrLn $ whatIsIt $ toDyn (["1", "2", "3"] :: [String])
  putStrLn $ whatIsIt $ toDyn (Data.Set.fromList ["1", "2", "3"] :: Set String)
Output:
[Int]
[String]
Set String
However I must outline boldly that this is far from a typical scenario of Haskell programming. The language's type-system is powerful enough to solve extremely intricate problems while maintaining all the type-level information (and safety). Dynamic is only used in very special cases in low-level libraries.
GHC does even more type erasure than the JVM; at runtime the types are completely gone (not just the type parameters).
Haskell's approach to types is to use them at compile time to guarantee that no ill-typed operation can ever be carried out, and since Haskell doesn't have OO-style subtyping and dynamic dispatch, there's no purpose at all to keeping the types around. So data is compiled to a memory structure that simply contains the right values, and functions are compiled with baked-in knowledge of the structure of the types on which they operate1, and just blindly expect their arguments to have that structure. That's why you get fun things like segmentation faults if you mess with unsafeCoerce incorrectly, not just a runtime exception saying the value was not of the expected type; at runtime Haskell has no idea whether a value is of any given type.
So rather than Haskell giving "the right answer" to the equivalent program, Haskell disallows your program as unsafe! There is no Any type in Haskell to which you can cast whatever you want.
That's not 100% true; in both Haskell and Scala there are ways of keeping type information alive at runtime. Essentially it's done by creating ordinary data structures that represent types, and passing them around together with the values that have those types, so at runtime you can consult the type-representation object for information about the type of the other object. There are library and language facilities in both languages that let you use this mechanism at a higher (and more principled) level, so that it's easier to use safely. Because it requires the type tokens to be passed around, you have to "opt in" to such features, and your callers have to be aware of it to pass you the required type tokens (whether the actual generation and passing of the token is done implicitly or explicitly).
Without using such features, Haskell provides no way to pattern match on a value that could be of type List Int or Set String to find out which one it is. Either you're using a monomorphic type, in which case it can only be one type and the others will be rejected, or you're using a polymorphic type, in which case you can only apply code to it that will do the same thing2 regardless of which concrete type instantiates the polymorphic type.
1 Except for polymorphic functions, which assume nothing about their polymorphic arguments, and so can basically do nothing with them except pass them to other polymorphic functions (with matching type class constraints, if any).
2 Type class constrained polymorphic types are the only exception to this. Even then, if you've got a value a type that's a member of some type class, all you can do with it is pass it to other functions that accept values in any type that is a member of that type class. And if those functions are general functions defined outside of the type class in question, they'll be under the same restriction. It's only the type class methods themselves that can actually "do something different" for different types in the class, and that's because they are the union of a whole bunch of monomorphic definitions that operate on one particular type in the class. You can't write code that gets to take a polymorphic value, inspect it to see what it was instantiated with, and then decide what to do.
Of course Haskell prints the right answer:
import Data.Set
import Data.Typeable

main = do
  let li = [1, 2, 3]
  let ls = ["1", "2", "3"]
  let si = Data.Set.fromList [1, 2, 3]
  let ss = Data.Set.fromList ["1", "2", "3"]
  print $ typeOf li
  print $ typeOf ls
  print $ typeOf si
  print $ typeOf ss
prints
[Integer]
[[Char]]
Set Integer
Set [Char]

How do I get an object's type and pass it along to asInstanceOf in Scala?

I have a Scala class that reads formatting information from a JSON template file, and data from a different file. The goal is to produce a JSON object formatted as specified by the template file. I have the layout working, but now I want to set the type of my output to the type in my template (i.e. if a field's value is a String in the template, it should be a string in the output, even if it's an integer in the raw data).
Basically, I'm looking for a quick and easy way of doing something like:
output = dataValue.asInstanceOf[templateValue.getClass]
That line gives me an error that type getClass is not a member of Any. But I haven't been able to find any other member or method that gives me a variable's type at runtime. Is this possible, and if so, how?
Clarification
I should add, by this point in my code, I know I'm dealing with just a key/value pair. What I'd like is the value's type.
Specifically, given the JSON template below, I want name to be cast to a String, age to be cast to an integer, and salary to be cast to a decimal on output, regardless of how they appear in the raw data file (they could all be strings, age and salary could both be ints, etc.). What I was hoping for is a simple cast that doesn't require pattern matching to handle each data type separately.
Example template:
people: [{
  name: "value",
  age: 0,
  salary: 0.00
}]
Type parameters must be known at compile time (as type symbols), and templateValue.getClass is just a plain value (of type Class), so it cannot be used as a type parameter.
What to do instead depends on your goal, which isn't yet clear to me... but it may look like
output = someMethod(dataValue, templateValue.getClass)
and inside that method you may do different computations depending on the second argument of type Class.
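For instance, a minimal sketch of that shape (someMethod is the hypothetical name from above, and the conversions are purely illustrative):

// Dispatch on the runtime Class token of the template value.
def someMethod(dataValue: Any, templateClass: Class[_]): Any =
  if (templateClass == classOf[String]) dataValue.toString
  else if (templateClass == classOf[java.lang.Integer]) dataValue.toString.toInt
  else dataValue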
The method scala.reflect.api.JavaUniverse.typeOf[T] requires its type argument to be hard-coded by the caller or type-inferred. To have it inferred, create a utility method like the following (it works for all types, even generics: it counteracts the Java runtime's erasure of type arguments by augmenting T during compilation with type-tag metadata):
// http://www.scala-lang.org/api/current/index.html#scala.reflect.runtime.package
import scala.reflect.runtime.universe._
def getType[T: TypeTag](a: T): Type = typeOf[T]
There are 3 requirements here:
the type argument implements TypeTag (the previous implementation via Manifest is still available...)
one or more input arguments are typed T
the return type is Type (if you want the result to be usable outside the method)
You can invoke it without specifying T (it's type-inferred):
import scala.reflect.runtime.universe._
def getType[T: TypeTag](a: T): Type = typeOf[T]
val ls = List[Int](1,2,3)
println(getType(ls)) // prints List[Int]
However, asInstanceOf will only cast the value to a (binary-compatible) type in the hierarchy, with no conversion of data or format, i.e. the data must already be in the correct binary format - so that won't solve your problem.
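To make the distinction concrete, a minimal sketch:

val raw: Any = "123"
// raw.asInstanceOf[Int]               // throws ClassCastException: a cast converts nothing
val n = raw.asInstanceOf[String].toInt // cast to what it already is, then convert the data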
Data Conversion
A few methods convert between Integers and Strings:
// defined on scala.Any:
123.toString // gives "123"

// available on Int via scala.runtime.RichInt:
123.toHexString // gives "7b"
123.toOctalString // gives "173"

// available on java.lang.String via scala.collection.immutable.StringOps:
"%d".format(123) // also gives "123"
"%5d".format(123) // gives "  123"
"%05d".format(123) // gives "00123"
"%01.2f".format(123.456789) // gives "123.46"
"%01.2f".format(0.456789) // gives "0.46"
" 123".toInt // gives 123
"00123".toInt // gives 123
"00123.4600".toDouble // gives 123.46
".46".toDouble // gives 0.46
Parsing directly from a file into the target type (no cast or conversion):
Unfortunately, Scala doesn't have a method to read the next token in a stream as an integer/float/short/boolean/etc. But you can do this by obtaining a java FileInputStream, wrapping it in a DataInputStream and then calling readInt, readFloat, readShort, readBoolean, etc. (note that these read binary-encoded values, not formatted text).
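A minimal sketch of that approach (the file name is illustrative; an Int is written first so that there is something to read back):

import java.io.{DataInputStream, DataOutputStream, FileInputStream, FileOutputStream}

// Write one binary Int, then read it back; readInt consumes 4 raw bytes.
val out = new DataOutputStream(new FileOutputStream("data.bin"))
out.writeInt(123)
out.close()

val in = new DataInputStream(new FileInputStream("data.bin"))
println(in.readInt()) // prints 123
in.close()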
In a type-level context, value-level terms still have a few accessors. The first one, and the one you asked for, is the singleton type of the value itself (.type):
output = dataValue.asInstanceOf[templateValue.type]
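Note that this only compiles when the term is a stable identifier (e.g. a val). A minimal sketch of what a singleton type is:

val s = "template"
val t: s.type = s          // s.type is the singleton type inhabited only by s
// val u: s.type = "other" // does not compile: "other" is not (known to be) s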
if the type of the value has inner members, those become available as well:
class A {
  class B {}
}

val a: A = new A
val b: a.B = new a.B
Notice b: a.B.
I must also mention how to access such members without a value-level term:
val b: A#B = new a.B